Mar 18 15:37:14 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 15:37:14 crc restorecon[4755]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 
15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:37:14 crc 
restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 
15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:14 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 
15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc 
restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 15:37:15 crc restorecon[4755]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 15:37:15 crc kubenswrapper[4939]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 15:37:15 crc kubenswrapper[4939]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 15:37:15 crc kubenswrapper[4939]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 15:37:15 crc kubenswrapper[4939]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 15:37:15 crc kubenswrapper[4939]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 15:37:15 crc kubenswrapper[4939]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.835476 4939 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846317 4939 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846374 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846386 4939 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846395 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846406 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846415 4939 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846424 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846432 4939 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846441 4939 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846448 4939 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846456 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846466 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846474 4939 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846482 4939 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846490 4939 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846498 4939 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846537 4939 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846545 4939 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846553 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846561 4939 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846569 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:37:15 crc 
kubenswrapper[4939]: W0318 15:37:15.846577 4939 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846584 4939 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846592 4939 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846600 4939 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846607 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846616 4939 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846623 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846631 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846639 4939 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846646 4939 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846656 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846664 4939 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846672 4939 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846680 4939 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846688 4939 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846696 4939 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846704 4939 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846713 4939 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846721 4939 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846732 4939 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846739 4939 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846747 4939 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846754 4939 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846765 4939 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846776 4939 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846785 4939 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846797 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846806 4939 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846814 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846824 4939 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846834 4939 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846843 4939 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846852 4939 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846862 4939 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846872 4939 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846881 4939 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846891 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846900 4939 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846908 4939 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846916 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846924 4939 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846932 4939 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846940 4939 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846947 4939 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846955 4939 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846963 4939 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846971 4939 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846979 4939 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.846988 4939 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:37:15 crc 
kubenswrapper[4939]: W0318 15:37:15.847019 4939 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847192 4939 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847213 4939 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847230 4939 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847245 4939 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847257 4939 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847266 4939 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847279 4939 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847291 4939 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847301 4939 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847310 4939 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847319 4939 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847330 4939 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847340 4939 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847349 4939 flags.go:64] FLAG: --cgroup-root="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847360 4939 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847371 4939 flags.go:64] FLAG: --client-ca-file="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847383 4939 flags.go:64] FLAG: --cloud-config="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847394 4939 flags.go:64] FLAG: --cloud-provider="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847405 4939 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847418 4939 flags.go:64] FLAG: --cluster-domain="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847427 4939 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847436 4939 flags.go:64] FLAG: --config-dir="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847445 4939 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847455 4939 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847468 4939 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847478 4939 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847487 4939 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847496 4939 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 
18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847531 4939 flags.go:64] FLAG: --contention-profiling="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847540 4939 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847549 4939 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847559 4939 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847568 4939 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847581 4939 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847591 4939 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847600 4939 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847609 4939 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847618 4939 flags.go:64] FLAG: --enable-server="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847627 4939 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847640 4939 flags.go:64] FLAG: --event-burst="100" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847649 4939 flags.go:64] FLAG: --event-qps="50" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847658 4939 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847668 4939 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847677 4939 flags.go:64] FLAG: --eviction-hard="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847689 4939 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847699 4939 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847708 4939 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847718 4939 flags.go:64] FLAG: --eviction-soft="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847727 4939 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847736 4939 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847747 4939 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847758 4939 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847801 4939 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847813 4939 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847824 4939 flags.go:64] FLAG: --feature-gates="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847839 4939 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847850 4939 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847862 4939 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 15:37:15 crc 
kubenswrapper[4939]: I0318 15:37:15.847875 4939 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847886 4939 flags.go:64] FLAG: --healthz-port="10248" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847898 4939 flags.go:64] FLAG: --help="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847909 4939 flags.go:64] FLAG: --hostname-override="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847919 4939 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847931 4939 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847942 4939 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847957 4939 flags.go:64] FLAG: --image-credential-provider-config="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847968 4939 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847980 4939 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847989 4939 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.847998 4939 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848008 4939 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848018 4939 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848028 4939 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848037 4939 flags.go:64] FLAG: --kube-reserved="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848046 4939 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848056 4939 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848065 4939 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848074 4939 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848083 4939 flags.go:64] FLAG: --lock-file="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848092 4939 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848102 4939 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848111 4939 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848138 4939 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848149 4939 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848158 4939 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848167 4939 flags.go:64] FLAG: --logging-format="text" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848177 4939 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848190 4939 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 15:37:15 crc 
kubenswrapper[4939]: I0318 15:37:15.848201 4939 flags.go:64] FLAG: --manifest-url="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848213 4939 flags.go:64] FLAG: --manifest-url-header="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848229 4939 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848240 4939 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848256 4939 flags.go:64] FLAG: --max-pods="110" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848267 4939 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848279 4939 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848290 4939 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848301 4939 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848317 4939 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848329 4939 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848342 4939 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848371 4939 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848383 4939 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848395 4939 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848407 4939 flags.go:64] FLAG: --pod-cidr="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848418 4939 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848437 4939 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848449 4939 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848461 4939 flags.go:64] FLAG: --pods-per-core="0" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848472 4939 flags.go:64] FLAG: --port="10250" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848485 4939 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848496 4939 flags.go:64] FLAG: --provider-id="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848547 4939 flags.go:64] FLAG: --qos-reserved="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848559 4939 flags.go:64] FLAG: --read-only-port="10255" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848571 4939 flags.go:64] FLAG: --register-node="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848582 4939 flags.go:64] FLAG: --register-schedulable="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848594 4939 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848616 4939 flags.go:64] FLAG: --registry-burst="10" Mar 18 15:37:15 crc 
kubenswrapper[4939]: I0318 15:37:15.848628 4939 flags.go:64] FLAG: --registry-qps="5" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848641 4939 flags.go:64] FLAG: --reserved-cpus="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848655 4939 flags.go:64] FLAG: --reserved-memory="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848671 4939 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848683 4939 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848695 4939 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848707 4939 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848718 4939 flags.go:64] FLAG: --runonce="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848728 4939 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848737 4939 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848747 4939 flags.go:64] FLAG: --seccomp-default="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848756 4939 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848765 4939 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848776 4939 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848785 4939 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848796 4939 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848805 4939 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848814 4939 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848824 4939 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848833 4939 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848843 4939 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848853 4939 flags.go:64] FLAG: --system-cgroups="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848862 4939 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848877 4939 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848885 4939 flags.go:64] FLAG: --tls-cert-file="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848894 4939 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848906 4939 flags.go:64] FLAG: --tls-min-version="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848915 4939 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848924 4939 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848933 4939 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 15:37:15 crc kubenswrapper[4939]: 
I0318 15:37:15.848942 4939 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848951 4939 flags.go:64] FLAG: --v="2" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848964 4939 flags.go:64] FLAG: --version="false" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848976 4939 flags.go:64] FLAG: --vmodule="" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848988 4939 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.848997 4939 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849222 4939 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849231 4939 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849255 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849264 4939 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849273 4939 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849281 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849290 4939 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849299 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849316 4939 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849324 4939 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849332 4939 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849340 4939 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849348 4939 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849356 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849364 4939 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849372 4939 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849380 4939 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849388 4939 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849396 4939 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849404 4939 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849412 4939 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 
15:37:15.849423 4939 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849443 4939 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849452 4939 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849462 4939 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849471 4939 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849481 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849490 4939 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849531 4939 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849542 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849550 4939 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849560 4939 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849568 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849576 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849584 4939 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849592 4939 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849600 4939 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849608 4939 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849618 4939 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849625 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849637 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849644 4939 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849653 4939 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849660 4939 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849669 4939 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849676 4939 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 
15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849684 4939 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849692 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849700 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849708 4939 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849716 4939 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849724 4939 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849734 4939 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849742 4939 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849754 4939 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849761 4939 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849769 4939 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849777 4939 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849785 4939 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849793 4939 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849801 4939 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849809 4939 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849817 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849825 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849834 4939 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849841 4939 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849850 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849857 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849866 4939 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849873 4939 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.849881 4939 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 
15:37:15.851007 4939 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.863018 4939 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.863481 4939 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863592 4939 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863601 4939 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863606 4939 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863610 4939 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863615 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863623 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863626 4939 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863630 4939 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863660 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863664 4939 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863668 4939 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863672 4939 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863676 4939 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863679 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863684 4939 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863689 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863695 4939 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863701 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863705 4939 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863710 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863715 
4939 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863720 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863724 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863730 4939 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863734 4939 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863739 4939 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863743 4939 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863748 4939 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863752 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863758 4939 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863761 4939 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863767 4939 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863771 4939 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863775 4939 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863780 4939 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863788 4939 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863794 4939 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863801 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863806 4939 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863811 4939 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863814 4939 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863819 4939 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863822 4939 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863826 4939 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863830 4939 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863834 4939 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863838 4939 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863842 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863846 4939 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863850 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863854 4939 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863858 4939 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863862 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863868 4939 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863872 4939 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863875 4939 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863879 4939 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863883 4939 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863888 4939 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863893 4939 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863898 4939 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863903 4939 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863909 4939 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863915 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863920 4939 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863925 4939 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863929 4939 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863933 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863937 4939 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863941 4939 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.863945 4939 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.863953 4939 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864093 4939 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864099 4939 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864104 4939 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864108 4939 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864112 4939 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864117 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864121 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864126 4939 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864131 4939 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864136 4939 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864140 4939 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864144 4939 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864148 4939 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864153 4939 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864157 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864161 4939 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864165 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864169 4939 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864173 4939 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864177 4939 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864181 4939 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864185 4939 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864188 4939 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864192 4939 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864196 4939 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864201 4939 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864205 4939 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864209 4939 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864213 4939 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864218 4939 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864223 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864227 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864231 4939 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864235 4939 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864239 4939 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864243 4939 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864247 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864251 4939 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864255 4939 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864259 4939 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864262 4939 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864268 4939 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864273 4939 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864278 4939 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864282 4939 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864286 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864290 4939 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864294 4939 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864298 4939 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864302 4939 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864306 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864310 4939 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864314 4939 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864318 4939 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864322 4939 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:37:15 crc 
kubenswrapper[4939]: W0318 15:37:15.864326 4939 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864330 4939 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864334 4939 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864338 4939 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864341 4939 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864345 4939 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864349 4939 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864353 4939 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864357 4939 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864361 4939 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864366 4939 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864369 4939 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864375 4939 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864379 4939 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864385 4939 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:37:15 crc kubenswrapper[4939]: W0318 15:37:15.864390 4939 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.864397 4939 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.864614 4939 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 15:37:15 crc kubenswrapper[4939]: E0318 15:37:15.868343 4939 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.874936 4939 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.875163 4939 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.877577 4939 server.go:997] "Starting client certificate rotation"
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.877638 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.877826 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.904660 4939 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.906901 4939 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 15:37:15 crc kubenswrapper[4939]: E0318 15:37:15.908793 4939 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.925087 4939 log.go:25] "Validated CRI v1 runtime API"
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.970848 4939 log.go:25] "Validated CRI v1 image API"
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.975316 4939 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.983651 4939 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-15-31-29-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 18 15:37:15 crc kubenswrapper[4939]: I0318 15:37:15.983711 4939 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.011849 4939 manager.go:217] Machine: {Timestamp:2026-03-18 15:37:16.008035612 +0000 UTC m=+0.607223303 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e4c08689-884e-465b-8c84-d257e1c69929 BootID:63560ce4-57c9-4ea0-827a-ea8b1db6e8ed Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:4e:45:aa Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:4e:45:aa Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:fc:cc:82 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:44:7b:af Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ae:47:2b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:94:84:95 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:d7:ef:4e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:16:83:3a:e2:a3:17 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:b9:d1:64:7f:bd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.012275 4939 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.012458 4939 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.015720 4939 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.016136 4939 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.016192 4939 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.016535 4939 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.016553 4939 container_manager_linux.go:303] "Creating device plugin manager"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.017207 4939 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.017255 4939 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.017496 4939 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.017666 4939 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.027050 4939 kubelet.go:418] "Attempting to sync node with API server"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.027083 4939 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.027121 4939 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.027142 4939 kubelet.go:324] "Adding apiserver pod source"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.027161 4939 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.033713 4939 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.035642 4939 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.036284 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.036298 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.036433 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.036451 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.039010 4939 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040635 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040676 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040691 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040703 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040726 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040739 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040752 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040773 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040790 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040814 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040831 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.040845 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.043108 4939 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.043764 4939 server.go:1280] "Started kubelet"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.045464 4939 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.046051 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.045752 4939 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 15:37:16 crc systemd[1]: Started Kubernetes Kubelet.
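From the expired-bootstrap-certificate error onward, every API request in this stretch of the boot fails the same way, dial tcp 38.102.83.227:6443: connect: connection refused: the CSR post, the Service and Node informer lists, and the CSINode lookup. The kubelet is simply up before the static-pod kube-apiserver behind api-int.crc.testing:6443, so it retries in the background while the rest of startup proceeds. A small probe that distinguishes that refused state from DNS or routing trouble (a sketch only, reusing the endpoint from the log):

    import socket

    def probe(host="api-int.crc.testing", port=6443, timeout=3.0):
        """Coarse status of the apiserver endpoint the kubelet keeps dialing."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return "listening"            # TCP handshake accepted
        except ConnectionRefusedError:
            return "connection refused"       # the state logged here: host up, apiserver down
        except socket.gaierror:
            return "dns resolution failed"    # the name does not resolve
        except OSError as exc:
            return f"unreachable: {exc}"      # timeout, no route, etc.

    if __name__ == "__main__":
        print(probe())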
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.053819 4939 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.056778 4939 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.056871 4939 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.057307 4939 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.057319 4939 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.057397 4939 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.057742 4939 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.060296 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.060431 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.062342 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="200ms"
Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.062345 4939 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.227:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189df9928ad6dee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,LastTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.068383 4939 factory.go:55] Registering systemd factory
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.068446 4939 factory.go:221] Registration of the systemd container factory successfully
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.068643 4939 server.go:460] "Adding debug handlers to kubelet server"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.069262 4939 factory.go:153] Registering CRI-O factory
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.069317 4939 factory.go:221] Registration of the crio container factory successfully
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.070078 4939 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.070125 4939 factory.go:103] Registering Raw factory
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.070146 4939 manager.go:1196] Started watching for new ooms in manager
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.070980 4939 manager.go:319] Starting recovery of all containers
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076629 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076712 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076734 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076754 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076799 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076816 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076836 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076855 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076876 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076897 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076924 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076953 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.076980 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.077007 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.077024 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.077080 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.081754 4939 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.081853 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.081911 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.081954 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.081974 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.081990 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082009 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082028 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082046 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082059 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082075 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082094 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082112 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082127 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082141 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082156 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082173 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082187 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082202 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082218 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082234 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082248 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082262 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082278 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082292 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082307 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082322 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082337 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082352 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082396 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082411 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082425 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082439 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082452 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082466 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082484 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082546 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082573 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082595 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082617 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082639 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082662 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082682 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082698 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082713 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082727 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082744 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082757 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082775 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082794 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082816 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082835 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082855 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082876 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082894 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082912 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082932 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082954 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082973 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.082989 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083007 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083027 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083045 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083063 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083081 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083101 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083120 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083138 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083152 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083165 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083180 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083195 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083208 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083223 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083238 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083251 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083265 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083277 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083290 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083302 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083314 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083328 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083340 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083353 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083365 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083379 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083392 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083407 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083421 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083441 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083456 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083470 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083486 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083524 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083539 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083556 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083569 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083584 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083600 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083614 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083629 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083646 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083659 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083672 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083686 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083698 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083726 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083739 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083752 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083765 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083786 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083800 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083814 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083828 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083844 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083859 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083872 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083885 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083898 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083911 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083923 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083935 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083952 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083967 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083979 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f"
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.083998 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084009 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084027 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084040 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084055 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084072 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084088 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084106 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084121 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084136 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084204 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084219 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084230 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084244 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084259 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084270 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084285 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084298 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084311 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084323 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084336 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084349 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084362 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084375 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084387 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084402 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084416 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084429 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084443 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.084457 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085042 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085067 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085080 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085097 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085121 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085136 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085150 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085162 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085176 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085189 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085201 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085212 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085223 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085235 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085246 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085259 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085271 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085282 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085294 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085309 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085320 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085333 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085345 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085360 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085374 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085385 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085397 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085410 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085422 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085434 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085445 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085458 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085471 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085483 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085495 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085529 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085545 4939 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085557 4939 reconstruct.go:97] "Volume reconstruction finished" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.085566 4939 reconciler.go:26] "Reconciler: start to sync state" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.088985 4939 manager.go:324] Recovery completed Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.105497 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.107634 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.107702 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.107718 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.108876 4939 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.108904 4939 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.108935 4939 state_mem.go:36] "Initialized new in-memory state store" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.129223 4939 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.131772 4939 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.131850 4939 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.131893 4939 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.131979 4939 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.132050 4939 policy_none.go:49] "None policy: Start" Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.136341 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.136419 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.136444 4939 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.136497 4939 state_mem.go:35] "Initializing new in-memory state store" Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.157825 4939 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.209405 4939 manager.go:334] "Starting Device Plugin manager" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.209587 4939 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.209619 4939 server.go:79] "Starting device plugin registration server" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.210410 4939 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.210446 4939 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.210739 4939 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.210919 4939 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.210940 4939 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.225938 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.232256 4939 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:37:16 crc kubenswrapper[4939]: 
I0318 15:37:16.232369 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.233815 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.233882 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.233901 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.234082 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.234562 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.234643 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.235103 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.235141 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.235155 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.235238 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.235421 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.235468 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236159 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236190 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236203 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236292 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236430 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236523 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236846 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236905 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236925 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.236869 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237037 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237059 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237140 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237175 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237188 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237357 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237529 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237579 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237737 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237788 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.237806 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.238218 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.238273 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.238298 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.238664 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.238725 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.239156 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.239206 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.239225 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.240063 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.240111 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.240128 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.264276 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="400ms" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.287914 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.287952 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.287974 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.287995 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288126 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 
15:37:16.288196 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288230 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288261 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288296 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288317 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288333 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288349 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288409 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288473 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.288559 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.306097 4939 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/cpuset.cpus.effective": open /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/cpuset.cpus.effective: no such device Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.313665 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.316476 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.316554 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.316567 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.316629 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.317336 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390107 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390167 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390185 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390200 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390219 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390236 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390252 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390266 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390282 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390298 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390312 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390327 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390343 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390365 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390377 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390410 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390430 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390446 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390499 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390382 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390565 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390590 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390598 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390617 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390620 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390624 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390683 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390593 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390691 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.390526 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.518226 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.519900 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.519945 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.519959 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.519991 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.520586 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.567551 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.575776 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.598123 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.621987 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-eef42b60f7b2054a8d9add9fd7708d0af9b4402db06f3d1065306dc0d5ac7fba WatchSource:0}: Error finding container eef42b60f7b2054a8d9add9fd7708d0af9b4402db06f3d1065306dc0d5ac7fba: Status 404 returned error can't find the container with id eef42b60f7b2054a8d9add9fd7708d0af9b4402db06f3d1065306dc0d5ac7fba Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.622304 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.628055 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1db29d721481765b773f739a09a88059d140c9ad6a4dda87299febe69b9d225a WatchSource:0}: Error finding container 1db29d721481765b773f739a09a88059d140c9ad6a4dda87299febe69b9d225a: Status 404 returned error can't find the container with id 1db29d721481765b773f739a09a88059d140c9ad6a4dda87299febe69b9d225a Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.631015 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.638329 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-48c11dee69a922097aacdbebef30bb84fadb43b6f2926039f4f41c73e18a30b4 WatchSource:0}: Error finding container 48c11dee69a922097aacdbebef30bb84fadb43b6f2926039f4f41c73e18a30b4: Status 404 returned error can't find the container with id 48c11dee69a922097aacdbebef30bb84fadb43b6f2926039f4f41c73e18a30b4 Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.653851 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7f94e77b6ec7093729d04a1348aca6f9747437ed4c6fb8fd672ca9dd936877b8 WatchSource:0}: Error finding container 7f94e77b6ec7093729d04a1348aca6f9747437ed4c6fb8fd672ca9dd936877b8: Status 404 returned error can't find the container with id 7f94e77b6ec7093729d04a1348aca6f9747437ed4c6fb8fd672ca9dd936877b8 Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.665580 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="800ms" Mar 18 15:37:16 crc kubenswrapper[4939]: W0318 15:37:16.852251 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.852382 4939 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.921516 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.923727 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.923786 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.923801 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:16 crc kubenswrapper[4939]: I0318 15:37:16.923846 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:16 crc kubenswrapper[4939]: E0318 15:37:16.924318 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc" Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.047675 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:17 crc kubenswrapper[4939]: W0318 15:37:17.138149 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:17 crc kubenswrapper[4939]: E0318 15:37:17.138238 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.140095 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"48c11dee69a922097aacdbebef30bb84fadb43b6f2926039f4f41c73e18a30b4"} Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.141032 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1db29d721481765b773f739a09a88059d140c9ad6a4dda87299febe69b9d225a"} Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.141935 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ee0e6f0d52fd5b031606285b06c8f1a89254e3cf27157bea60d46091e6316847"} Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.143022 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eef42b60f7b2054a8d9add9fd7708d0af9b4402db06f3d1065306dc0d5ac7fba"} Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.144854 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f94e77b6ec7093729d04a1348aca6f9747437ed4c6fb8fd672ca9dd936877b8"} Mar 18 15:37:17 crc kubenswrapper[4939]: W0318 15:37:17.226340 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:17 crc kubenswrapper[4939]: E0318 15:37:17.226574 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:17 crc kubenswrapper[4939]: W0318 15:37:17.434477 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:17 crc kubenswrapper[4939]: E0318 15:37:17.434644 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:17 crc kubenswrapper[4939]: E0318 15:37:17.467365 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="1.6s" Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.725476 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.727700 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.727750 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.727766 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:17 crc kubenswrapper[4939]: I0318 15:37:17.727803 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:17 crc kubenswrapper[4939]: E0318 15:37:17.728435 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.047890 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.100997 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:37:18 crc kubenswrapper[4939]: E0318 15:37:18.102367 4939 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.151329 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44" exitCode=0 Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.151426 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44"} Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.151600 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.153320 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.153353 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.153369 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.153969 4939 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080" exitCode=0 Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.154082 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080"} Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.154212 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.155389 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.155426 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.155438 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.156550 4939 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442" exitCode=0 Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.156661 
4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.156645 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442"} Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.156931 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.158347 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.158373 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.158382 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.158423 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.158453 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.158467 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.160453 4939 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4" exitCode=0 Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.160619 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.160541 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4"} Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.161820 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.161872 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.161882 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.165846 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53"} Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.165902 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187"} Mar 
18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.165917 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5"} Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.165931 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856"} Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.165962 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.167350 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.167376 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.167386 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:18 crc kubenswrapper[4939]: I0318 15:37:18.585220 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:19 crc kubenswrapper[4939]: W0318 15:37:19.008612 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:19 crc kubenswrapper[4939]: E0318 15:37:19.008726 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.047939 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:19 crc kubenswrapper[4939]: E0318 15:37:19.069910 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="3.2s" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.174726 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.174774 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae"} Mar 18 15:37:19 crc 
kubenswrapper[4939]: I0318 15:37:19.174785 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.174794 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.176978 4939 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512" exitCode=0 Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.177101 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.177083 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.178223 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.178257 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.178266 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.185185 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.185166 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.190822 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.190866 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.190879 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.199631 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.199660 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.199714 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.199727 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8"} Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.199633 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.200777 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.200813 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.200824 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.200847 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.200869 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.200879 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.329498 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.331390 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.331453 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.331472 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.331551 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:19 crc kubenswrapper[4939]: E0318 15:37:19.332201 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc" Mar 18 15:37:19 crc kubenswrapper[4939]: W0318 15:37:19.453949 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:19 crc kubenswrapper[4939]: E0318 15:37:19.454064 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:19 crc kubenswrapper[4939]: I0318 15:37:19.509999 4939 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.047353 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:20 crc kubenswrapper[4939]: W0318 15:37:20.106188 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:20 crc kubenswrapper[4939]: E0318 15:37:20.106333 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:20 crc kubenswrapper[4939]: W0318 15:37:20.113253 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Mar 18 15:37:20 crc kubenswrapper[4939]: E0318 15:37:20.113322 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.205295 4939 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd" exitCode=0 Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.205379 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd"} Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.205567 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.206608 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.206663 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.206676 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.208269 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.210442 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="fc432e6a8d0d2f8ca6d1f226d7f49d58abb82b6bc0554da7c76079df738a7ebd" exitCode=255 Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.210592 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.210717 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fc432e6a8d0d2f8ca6d1f226d7f49d58abb82b6bc0554da7c76079df738a7ebd"} Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.210801 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.210838 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.210805 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.211031 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.214903 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215007 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215096 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215043 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215257 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215278 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215055 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215339 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215356 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215016 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215471 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.215523 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:20 crc kubenswrapper[4939]: I0318 15:37:20.216487 4939 scope.go:117] "RemoveContainer" containerID="fc432e6a8d0d2f8ca6d1f226d7f49d58abb82b6bc0554da7c76079df738a7ebd" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 
15:37:21.215383 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.217395 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.217625 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88"} Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.217763 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.218676 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.218723 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.218740 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.223438 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3"} Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.223531 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.223634 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.223823 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee"} Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.223859 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e"} Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.224669 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.224753 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.224768 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.224941 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.224990 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.225005 4939 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:21 crc kubenswrapper[4939]: I0318 15:37:21.819781 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.231935 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.232862 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.232930 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140"} Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.232986 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e"} Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.233025 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.233154 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234024 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234051 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234054 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234093 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234119 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234062 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234331 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234348 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.234358 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.258550 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.533276 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.535100 4939 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.535204 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.535230 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.535273 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:22 crc kubenswrapper[4939]: I0318 15:37:22.692145 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.046804 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.235476 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.235600 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.237204 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.237256 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.237266 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.237322 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.237343 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:23 crc kubenswrapper[4939]: I0318 15:37:23.237277 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.238768 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.238874 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.240724 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.240780 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.240803 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.240736 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.240913 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.240946 4939 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.615940 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.820445 4939 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:37:24 crc kubenswrapper[4939]: I0318 15:37:24.820658 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 15:37:25 crc kubenswrapper[4939]: I0318 15:37:25.241119 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:25 crc kubenswrapper[4939]: I0318 15:37:25.242615 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:25 crc kubenswrapper[4939]: I0318 15:37:25.242675 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:25 crc kubenswrapper[4939]: I0318 15:37:25.242693 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:26 crc kubenswrapper[4939]: E0318 15:37:26.226998 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:37:27 crc kubenswrapper[4939]: I0318 15:37:27.826987 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:27 crc kubenswrapper[4939]: I0318 15:37:27.827254 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:27 crc kubenswrapper[4939]: I0318 15:37:27.829326 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:27 crc kubenswrapper[4939]: I0318 15:37:27.829413 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:27 crc kubenswrapper[4939]: I0318 15:37:27.829432 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:27 crc kubenswrapper[4939]: I0318 15:37:27.835602 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:28 crc kubenswrapper[4939]: I0318 15:37:28.249023 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:28 crc kubenswrapper[4939]: I0318 15:37:28.250218 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:28 crc kubenswrapper[4939]: I0318 15:37:28.250261 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:28 crc 
kubenswrapper[4939]: I0318 15:37:28.250273 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:28 crc kubenswrapper[4939]: I0318 15:37:28.255194 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:29 crc kubenswrapper[4939]: I0318 15:37:29.255643 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:29 crc kubenswrapper[4939]: I0318 15:37:29.257142 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:29 crc kubenswrapper[4939]: I0318 15:37:29.257192 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:29 crc kubenswrapper[4939]: I0318 15:37:29.257220 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.016998 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.017219 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.018722 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.018765 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.018780 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.069015 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.261587 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.262544 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.262570 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.262579 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.274107 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.954959 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 15:37:30 crc kubenswrapper[4939]: W0318 15:37:30.961907 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.962002 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.962244 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:37:30 crc kubenswrapper[4939]: W0318 15:37:30.965321 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.965385 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:30 crc kubenswrapper[4939]: W0318 15:37:30.965491 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.965699 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.965713 4939 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df9928ad6dee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,LastTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.966749 4939 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:30 crc kubenswrapper[4939]: W0318 15:37:30.971213 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z Mar 18 15:37:30 crc kubenswrapper[4939]: E0318 15:37:30.971290 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.981958 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:30Z is after 2026-02-23T05:33:13Z Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.983496 4939 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 15:37:30 crc kubenswrapper[4939]: I0318 15:37:30.983616 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 15:37:31 crc kubenswrapper[4939]: I0318 15:37:31.003004 4939 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: 
[clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 18 15:37:31 crc kubenswrapper[4939]: I0318 15:37:31.003119 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 15:37:31 crc kubenswrapper[4939]: I0318 15:37:31.049851 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:31Z is after 2026-02-23T05:33:13Z Mar 18 15:37:31 crc kubenswrapper[4939]: I0318 15:37:31.264818 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:31 crc kubenswrapper[4939]: I0318 15:37:31.265701 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:31 crc kubenswrapper[4939]: I0318 15:37:31.265746 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:31 crc kubenswrapper[4939]: I0318 15:37:31.265761 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.050159 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:32Z is after 2026-02-23T05:33:13Z Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.270224 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.271823 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.274497 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88" exitCode=255 Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.274598 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88"} Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.275140 4939 scope.go:117] "RemoveContainer" containerID="fc432e6a8d0d2f8ca6d1f226d7f49d58abb82b6bc0554da7c76079df738a7ebd" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.275370 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.276875 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.276948 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.277020 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:32 crc kubenswrapper[4939]: I0318 15:37:32.278128 4939 scope.go:117] "RemoveContainer" containerID="013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88" Mar 18 15:37:32 crc kubenswrapper[4939]: E0318 15:37:32.278545 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:33 crc kubenswrapper[4939]: I0318 15:37:33.052763 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:33Z is after 2026-02-23T05:33:13Z Mar 18 15:37:33 crc kubenswrapper[4939]: I0318 15:37:33.280931 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.050594 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:34Z is after 2026-02-23T05:33:13Z Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.625200 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.625407 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.626621 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.626679 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.626691 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.627191 4939 scope.go:117] "RemoveContainer" containerID="013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88" Mar 18 15:37:34 crc kubenswrapper[4939]: E0318 15:37:34.627353 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.629177 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.820713 4939 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:37:34 crc kubenswrapper[4939]: I0318 15:37:34.820848 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:37:35 crc kubenswrapper[4939]: I0318 15:37:35.049968 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:35Z is after 2026-02-23T05:33:13Z Mar 18 15:37:35 crc kubenswrapper[4939]: I0318 15:37:35.290759 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:35 crc kubenswrapper[4939]: I0318 15:37:35.292471 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:35 crc kubenswrapper[4939]: I0318 15:37:35.292564 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:35 crc kubenswrapper[4939]: I0318 15:37:35.292587 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:35 crc kubenswrapper[4939]: I0318 15:37:35.293654 4939 scope.go:117] "RemoveContainer" containerID="013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88" Mar 18 15:37:35 crc kubenswrapper[4939]: E0318 15:37:35.293993 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:36 crc kubenswrapper[4939]: I0318 15:37:36.051774 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:36Z is after 2026-02-23T05:33:13Z Mar 18 15:37:36 crc kubenswrapper[4939]: E0318 15:37:36.227239 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.052480 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2026-02-23T05:33:13Z Mar 18 15:37:37 crc kubenswrapper[4939]: E0318 15:37:37.361084 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.363370 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.365432 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.365497 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.365542 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.365595 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:37 crc kubenswrapper[4939]: E0318 15:37:37.370576 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:37Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.605167 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.605440 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.607202 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.607260 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.607279 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:37 crc kubenswrapper[4939]: I0318 15:37:37.608197 4939 scope.go:117] "RemoveContainer" containerID="013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88" Mar 18 15:37:37 crc kubenswrapper[4939]: E0318 15:37:37.608497 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:38 crc kubenswrapper[4939]: I0318 15:37:38.055450 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:38Z is after 2026-02-23T05:33:13Z Mar 18 15:37:39 crc kubenswrapper[4939]: I0318 15:37:39.054333 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:39Z is after 2026-02-23T05:33:13Z Mar 18 15:37:39 crc kubenswrapper[4939]: I0318 15:37:39.473367 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:37:39 crc kubenswrapper[4939]: E0318 15:37:39.481029 4939 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:39 crc kubenswrapper[4939]: W0318 15:37:39.529481 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:39Z is after 2026-02-23T05:33:13Z Mar 18 15:37:39 crc kubenswrapper[4939]: E0318 15:37:39.529769 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:39 crc kubenswrapper[4939]: W0318 15:37:39.964276 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:39Z is after 2026-02-23T05:33:13Z Mar 18 15:37:39 crc kubenswrapper[4939]: E0318 15:37:39.964411 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:40 crc kubenswrapper[4939]: I0318 15:37:40.051642 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:40Z is after 
2026-02-23T05:33:13Z Mar 18 15:37:40 crc kubenswrapper[4939]: E0318 15:37:40.972050 4939 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df9928ad6dee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,LastTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:37:41 crc kubenswrapper[4939]: I0318 15:37:41.053003 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:41Z is after 2026-02-23T05:33:13Z Mar 18 15:37:42 crc kubenswrapper[4939]: I0318 15:37:42.050401 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2026-02-23T05:33:13Z Mar 18 15:37:42 crc kubenswrapper[4939]: W0318 15:37:42.692185 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2026-02-23T05:33:13Z Mar 18 15:37:42 crc kubenswrapper[4939]: E0318 15:37:42.692347 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:43 crc kubenswrapper[4939]: I0318 15:37:43.051982 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2026-02-23T05:33:13Z Mar 18 15:37:43 crc kubenswrapper[4939]: W0318 15:37:43.399024 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2026-02-23T05:33:13Z Mar 18 15:37:43 crc 
kubenswrapper[4939]: E0318 15:37:43.399120 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.051156 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z Mar 18 15:37:44 crc kubenswrapper[4939]: E0318 15:37:44.367057 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.371352 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.372956 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.373003 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.373015 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.373047 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:44 crc kubenswrapper[4939]: E0318 15:37:44.377938 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.820640 4939 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.820778 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.821032 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 
15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.821331 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.823290 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.823370 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.823398 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.824671 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 15:37:44 crc kubenswrapper[4939]: I0318 15:37:44.825008 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5" gracePeriod=30 Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.053063 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:45Z is after 2026-02-23T05:33:13Z Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.328245 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.329168 4939 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5" exitCode=255 Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.329302 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5"} Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.329355 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980"} Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.329486 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.330839 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.330968 4939 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:45 crc kubenswrapper[4939]: I0318 15:37:45.330999 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:46 crc kubenswrapper[4939]: I0318 15:37:46.051956 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:46Z is after 2026-02-23T05:33:13Z Mar 18 15:37:46 crc kubenswrapper[4939]: E0318 15:37:46.227678 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:37:47 crc kubenswrapper[4939]: I0318 15:37:47.050225 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:47Z is after 2026-02-23T05:33:13Z Mar 18 15:37:48 crc kubenswrapper[4939]: I0318 15:37:48.052396 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:48Z is after 2026-02-23T05:33:13Z Mar 18 15:37:48 crc kubenswrapper[4939]: I0318 15:37:48.586194 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:48 crc kubenswrapper[4939]: I0318 15:37:48.586451 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:48 crc kubenswrapper[4939]: I0318 15:37:48.588363 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:48 crc kubenswrapper[4939]: I0318 15:37:48.588424 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:48 crc kubenswrapper[4939]: I0318 15:37:48.588447 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4939]: I0318 15:37:49.051141 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:49Z is after 2026-02-23T05:33:13Z Mar 18 15:37:49 crc kubenswrapper[4939]: I0318 15:37:49.132843 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:49 crc kubenswrapper[4939]: I0318 15:37:49.134668 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:49 crc kubenswrapper[4939]: I0318 15:37:49.134731 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:49 crc kubenswrapper[4939]: I0318 15:37:49.134747 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 15:37:49 crc kubenswrapper[4939]: I0318 15:37:49.135593 4939 scope.go:117] "RemoveContainer" containerID="013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.049213 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2026-02-23T05:33:13Z Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.348359 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.349404 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.352449 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9155f8caddcca8a4f275f034d10729f9d0a25c29047bfa854afa43c578621ace" exitCode=255 Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.352565 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9155f8caddcca8a4f275f034d10729f9d0a25c29047bfa854afa43c578621ace"} Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.352643 4939 scope.go:117] "RemoveContainer" containerID="013c2abf7ff6c369b9d4c62a31b4b42e7775e7536ba0d58fdeb55bb491c22c88" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.352826 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.354051 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.354113 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.354138 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:50 crc kubenswrapper[4939]: I0318 15:37:50.355640 4939 scope.go:117] "RemoveContainer" containerID="9155f8caddcca8a4f275f034d10729f9d0a25c29047bfa854afa43c578621ace" Mar 18 15:37:50 crc kubenswrapper[4939]: E0318 15:37:50.356023 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:50 crc kubenswrapper[4939]: E0318 15:37:50.977409 4939 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:50Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189df9928ad6dee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,LastTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.051742 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:51Z is after 2026-02-23T05:33:13Z Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.358372 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:37:51 crc kubenswrapper[4939]: E0318 15:37:51.372732 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:51Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.378256 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.380361 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.380412 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.380428 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.380462 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:51 crc kubenswrapper[4939]: E0318 15:37:51.385131 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:51Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.819938 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.820206 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.821915 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.821980 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 
15:37:51 crc kubenswrapper[4939]: I0318 15:37:51.821999 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:52 crc kubenswrapper[4939]: I0318 15:37:52.052056 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:52Z is after 2026-02-23T05:33:13Z Mar 18 15:37:53 crc kubenswrapper[4939]: I0318 15:37:53.050147 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2026-02-23T05:33:13Z Mar 18 15:37:53 crc kubenswrapper[4939]: W0318 15:37:53.778750 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2026-02-23T05:33:13Z Mar 18 15:37:53 crc kubenswrapper[4939]: E0318 15:37:53.780797 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:54 crc kubenswrapper[4939]: I0318 15:37:54.051916 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:54Z is after 2026-02-23T05:33:13Z Mar 18 15:37:54 crc kubenswrapper[4939]: I0318 15:37:54.820929 4939 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:37:54 crc kubenswrapper[4939]: I0318 15:37:54.821410 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:37:55 crc kubenswrapper[4939]: I0318 15:37:55.052285 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2026-02-23T05:33:13Z Mar 18 15:37:55 crc 
kubenswrapper[4939]: I0318 15:37:55.812010 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:37:55 crc kubenswrapper[4939]: E0318 15:37:55.818026 4939 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:55 crc kubenswrapper[4939]: E0318 15:37:55.819371 4939 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 18 15:37:56 crc kubenswrapper[4939]: I0318 15:37:56.050885 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:56Z is after 2026-02-23T05:33:13Z Mar 18 15:37:56 crc kubenswrapper[4939]: E0318 15:37:56.227893 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:37:56 crc kubenswrapper[4939]: I0318 15:37:56.833768 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:56 crc kubenswrapper[4939]: I0318 15:37:56.834087 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:56 crc kubenswrapper[4939]: I0318 15:37:56.836793 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:56 crc kubenswrapper[4939]: I0318 15:37:56.836869 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:56 crc kubenswrapper[4939]: I0318 15:37:56.836890 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:56 crc kubenswrapper[4939]: I0318 15:37:56.838071 4939 scope.go:117] "RemoveContainer" containerID="9155f8caddcca8a4f275f034d10729f9d0a25c29047bfa854afa43c578621ace" Mar 18 15:37:56 crc kubenswrapper[4939]: E0318 15:37:56.838438 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:57 crc kubenswrapper[4939]: I0318 15:37:57.052090 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:57Z is after 2026-02-23T05:33:13Z Mar 18 15:37:57 crc kubenswrapper[4939]: I0318 15:37:57.605423 4939 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:37:57 crc kubenswrapper[4939]: I0318 15:37:57.605795 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:57 crc kubenswrapper[4939]: I0318 15:37:57.607930 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:57 crc kubenswrapper[4939]: I0318 15:37:57.607994 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:57 crc kubenswrapper[4939]: I0318 15:37:57.608018 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:57 crc kubenswrapper[4939]: I0318 15:37:57.609193 4939 scope.go:117] "RemoveContainer" containerID="9155f8caddcca8a4f275f034d10729f9d0a25c29047bfa854afa43c578621ace" Mar 18 15:37:57 crc kubenswrapper[4939]: E0318 15:37:57.609593 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:37:58 crc kubenswrapper[4939]: I0318 15:37:58.052328 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2026-02-23T05:33:13Z Mar 18 15:37:58 crc kubenswrapper[4939]: W0318 15:37:58.077292 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2026-02-23T05:33:13Z Mar 18 15:37:58 crc kubenswrapper[4939]: E0318 15:37:58.077420 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:37:58 crc kubenswrapper[4939]: E0318 15:37:58.375907 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 15:37:58 crc kubenswrapper[4939]: I0318 15:37:58.385746 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:37:58 crc kubenswrapper[4939]: I0318 15:37:58.389532 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:58 crc 
kubenswrapper[4939]: I0318 15:37:58.389584 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:58 crc kubenswrapper[4939]: I0318 15:37:58.389596 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:58 crc kubenswrapper[4939]: I0318 15:37:58.389632 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:37:58 crc kubenswrapper[4939]: E0318 15:37:58.393026 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:37:59 crc kubenswrapper[4939]: I0318 15:37:59.052291 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:59Z is after 2026-02-23T05:33:13Z Mar 18 15:38:00 crc kubenswrapper[4939]: I0318 15:38:00.052635 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:00Z is after 2026-02-23T05:33:13Z Mar 18 15:38:00 crc kubenswrapper[4939]: E0318 15:38:00.983831 4939 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df9928ad6dee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,LastTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:01 crc kubenswrapper[4939]: I0318 15:38:01.052202 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:01Z is after 2026-02-23T05:33:13Z Mar 18 15:38:02 crc kubenswrapper[4939]: I0318 15:38:02.054469 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:03 crc kubenswrapper[4939]: I0318 15:38:03.057310 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 18 15:38:04 crc kubenswrapper[4939]: I0318 15:38:04.054639 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:04 crc kubenswrapper[4939]: W0318 15:38:04.445137 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 15:38:04 crc kubenswrapper[4939]: E0318 15:38:04.445214 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 15:38:04 crc kubenswrapper[4939]: I0318 15:38:04.820444 4939 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:38:04 crc kubenswrapper[4939]: I0318 15:38:04.820566 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:38:05 crc kubenswrapper[4939]: I0318 15:38:05.053323 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:05 crc kubenswrapper[4939]: E0318 15:38:05.384238 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:38:05 crc kubenswrapper[4939]: I0318 15:38:05.393230 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:38:05 crc kubenswrapper[4939]: I0318 15:38:05.396213 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:05 crc kubenswrapper[4939]: I0318 15:38:05.396255 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:05 crc kubenswrapper[4939]: I0318 15:38:05.396269 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:05 crc kubenswrapper[4939]: I0318 15:38:05.396308 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:38:05 crc kubenswrapper[4939]: E0318 15:38:05.402820 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:38:06 
crc kubenswrapper[4939]: I0318 15:38:06.053623 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:06 crc kubenswrapper[4939]: E0318 15:38:06.228016 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:38:07 crc kubenswrapper[4939]: I0318 15:38:07.049044 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:08 crc kubenswrapper[4939]: I0318 15:38:08.057677 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:08 crc kubenswrapper[4939]: W0318 15:38:08.265409 4939 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:08 crc kubenswrapper[4939]: E0318 15:38:08.265545 4939 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 15:38:08 crc kubenswrapper[4939]: I0318 15:38:08.638533 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:38:08 crc kubenswrapper[4939]: I0318 15:38:08.638733 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:38:08 crc kubenswrapper[4939]: I0318 15:38:08.640222 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:08 crc kubenswrapper[4939]: I0318 15:38:08.640361 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:08 crc kubenswrapper[4939]: I0318 15:38:08.640463 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:09 crc kubenswrapper[4939]: I0318 15:38:09.053917 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:10 crc kubenswrapper[4939]: I0318 15:38:10.051006 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:10 crc kubenswrapper[4939]: E0318 15:38:10.993295 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189df9928ad6dee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,LastTimestamp:2026-03-18 15:37:16.043722473 +0000 UTC m=+0.642910124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.002584 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.009606 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.016624 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea780bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,LastTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.026273 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189df99294fd8739 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.214028089 +0000 UTC m=+0.813215750,LastTimestamp:2026-03-18 15:37:16.214028089 +0000 UTC m=+0.813215750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.032933 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea6d37d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.233850001 +0000 UTC m=+0.833037632,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.037900 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea74944\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.233895602 +0000 UTC m=+0.833083233,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.042769 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea780bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea780bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,LastTimestamp:2026-03-18 15:37:16.233909932 +0000 UTC m=+0.833097563,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.047564 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea6d37d\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.235126757 +0000 UTC m=+0.834314398,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.047774 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.052486 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea74944\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.235150598 +0000 UTC m=+0.834338239,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.060799 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea780bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea780bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,LastTimestamp:2026-03-18 15:37:16.235165468 +0000 UTC m=+0.834353099,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.068885 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea6d37d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 
15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.236182657 +0000 UTC m=+0.835370288,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.073955 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea74944\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.236196977 +0000 UTC m=+0.835384608,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.078564 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea780bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea780bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,LastTimestamp:2026-03-18 15:37:16.236210068 +0000 UTC m=+0.835397699,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.083711 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea6d37d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.236890437 +0000 UTC m=+0.836078098,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.089580 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea74944\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.236918018 +0000 UTC m=+0.836105679,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.095293 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea780bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea780bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,LastTimestamp:2026-03-18 15:37:16.236935928 +0000 UTC m=+0.836123589,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.100546 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea6d37d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.237022981 +0000 UTC m=+0.836210652,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.106076 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea74944\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.237050641 +0000 UTC m=+0.836238302,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.111640 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea780bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea780bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,LastTimestamp:2026-03-18 15:37:16.237069942 +0000 UTC m=+0.836257603,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.117024 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea6d37d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.237165645 +0000 UTC m=+0.836353276,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.123866 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea74944\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.237183425 +0000 UTC m=+0.836371056,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.129456 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea780bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea780bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107727037 +0000 UTC m=+0.706914668,LastTimestamp:2026-03-18 15:37:16.237195035 +0000 UTC m=+0.836382666,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.131874 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea6d37d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189df9928ea6d37d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107682685 +0000 UTC m=+0.706870326,LastTimestamp:2026-03-18 15:37:16.237766562 +0000 UTC m=+0.836954193,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.132915 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.134183 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.134217 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.134231 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.134879 4939 scope.go:117] "RemoveContainer" containerID="9155f8caddcca8a4f275f034d10729f9d0a25c29047bfa854afa43c578621ace" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.137491 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df9928ea74944\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df9928ea74944 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.107712836 +0000 UTC m=+0.706900467,LastTimestamp:2026-03-18 15:37:16.237796853 +0000 UTC m=+0.836984484,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.148082 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df992adbadbce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.62908923 +0000 UTC m=+1.228276851,LastTimestamp:2026-03-18 15:37:16.62908923 +0000 UTC m=+1.228276851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.155852 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992adbddec4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.629286596 +0000 UTC m=+1.228474217,LastTimestamp:2026-03-18 15:37:16.629286596 +0000 UTC m=+1.228474217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.166533 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df992adf19224 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.632674852 +0000 UTC m=+1.231862463,LastTimestamp:2026-03-18 15:37:16.632674852 +0000 UTC m=+1.231862463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.171917 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df992aef20362 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.649481058 +0000 UTC m=+1.248668679,LastTimestamp:2026-03-18 15:37:16.649481058 +0000 UTC m=+1.248668679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.177917 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df992af88e476 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:16.659369078 +0000 UTC m=+1.258556699,LastTimestamp:2026-03-18 15:37:16.659369078 +0000 UTC m=+1.258556699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.182723 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df992d06d4e14 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.211209236 +0000 UTC m=+1.810396847,LastTimestamp:2026-03-18 15:37:17.211209236 +0000 UTC m=+1.810396847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.186840 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df992d070804e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.211418702 +0000 UTC m=+1.810606323,LastTimestamp:2026-03-18 15:37:17.211418702 +0000 UTC m=+1.810606323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.190645 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992d0723e41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.211532865 +0000 UTC m=+1.810720486,LastTimestamp:2026-03-18 15:37:17.211532865 +0000 UTC m=+1.810720486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.196679 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df992d0d62e89 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.218082441 +0000 UTC m=+1.817270072,LastTimestamp:2026-03-18 15:37:17.218082441 +0000 UTC m=+1.817270072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.201388 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df992d10235c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.220967873 +0000 UTC m=+1.820155494,LastTimestamp:2026-03-18 15:37:17.220967873 +0000 UTC m=+1.820155494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.206425 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df992d1337312 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.224194834 +0000 UTC m=+1.823382445,LastTimestamp:2026-03-18 15:37:17.224194834 +0000 UTC m=+1.823382445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.211239 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992d142b154 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.225193812 +0000 UTC m=+1.824381433,LastTimestamp:2026-03-18 15:37:17.225193812 +0000 UTC m=+1.824381433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.216403 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df992d14a34c8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.225686216 +0000 UTC m=+1.824873827,LastTimestamp:2026-03-18 15:37:17.225686216 +0000 UTC m=+1.824873827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.220928 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992d155f094 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.226455188 +0000 UTC m=+1.825642819,LastTimestamp:2026-03-18 15:37:17.226455188 +0000 UTC 
m=+1.825642819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.228893 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df992d1e5dc01 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.235887105 +0000 UTC m=+1.835074726,LastTimestamp:2026-03-18 15:37:17.235887105 +0000 UTC m=+1.835074726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.236097 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df992d1e885aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.23606161 +0000 UTC m=+1.835249231,LastTimestamp:2026-03-18 15:37:17.23606161 +0000 UTC m=+1.835249231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.241474 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992e50a7806 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.557053446 +0000 UTC m=+2.156241067,LastTimestamp:2026-03-18 15:37:17.557053446 +0000 UTC m=+2.156241067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.245316 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992e6318ef2 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.576392434 +0000 UTC m=+2.175580055,LastTimestamp:2026-03-18 15:37:17.576392434 +0000 UTC m=+2.175580055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.250578 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992e64a5c30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.57801784 +0000 UTC m=+2.177205461,LastTimestamp:2026-03-18 15:37:17.57801784 +0000 UTC m=+2.177205461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.255640 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992f27fb2fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.78284006 +0000 UTC m=+2.382027681,LastTimestamp:2026-03-18 15:37:17.78284006 +0000 UTC m=+2.382027681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.260554 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992f3694df3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.798149619 +0000 UTC m=+2.397337240,LastTimestamp:2026-03-18 15:37:17.798149619 +0000 UTC m=+2.397337240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.265151 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992f3804d14 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.799656724 +0000 UTC m=+2.398844345,LastTimestamp:2026-03-18 15:37:17.799656724 +0000 UTC m=+2.398844345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.268465 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df99301277e85 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.028717701 +0000 UTC m=+2.627905332,LastTimestamp:2026-03-18 15:37:18.028717701 +0000 UTC m=+2.627905332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.272128 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df99302ff4c07 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.059637767 +0000 UTC m=+2.658825388,LastTimestamp:2026-03-18 15:37:18.059637767 +0000 UTC m=+2.658825388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.276414 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99308c48872 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.156449906 +0000 UTC m=+2.755637567,LastTimestamp:2026-03-18 15:37:18.156449906 +0000 UTC m=+2.755637567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.280453 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df99308d3b0e9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.157443305 +0000 UTC m=+2.756630926,LastTimestamp:2026-03-18 15:37:18.157443305 +0000 UTC m=+2.756630926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.286469 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df99308fd0f75 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.160154485 +0000 UTC m=+2.759342116,LastTimestamp:2026-03-18 15:37:18.160154485 +0000 UTC m=+2.759342116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.291287 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9930929b4f9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.163080441 +0000 UTC m=+2.762268082,LastTimestamp:2026-03-18 15:37:18.163080441 +0000 UTC m=+2.762268082,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.295198 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9931535ed46 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.365207878 +0000 UTC m=+2.964395499,LastTimestamp:2026-03-18 15:37:18.365207878 +0000 UTC m=+2.964395499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.300360 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9931569077e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.368556926 +0000 UTC m=+2.967744557,LastTimestamp:2026-03-18 15:37:18.368556926 +0000 UTC m=+2.967744557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.304701 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df993156a0bd0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.368623568 +0000 UTC m=+2.967811189,LastTimestamp:2026-03-18 15:37:18.368623568 +0000 UTC m=+2.967811189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.309177 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9931570d837 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.369069111 +0000 UTC m=+2.968256722,LastTimestamp:2026-03-18 15:37:18.369069111 +0000 UTC m=+2.968256722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.312777 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df993167d3253 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.386655827 +0000 UTC m=+2.985843448,LastTimestamp:2026-03-18 15:37:18.386655827 +0000 UTC 
m=+2.985843448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.318782 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df993169dd556 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.38879471 +0000 UTC m=+2.987982321,LastTimestamp:2026-03-18 15:37:18.38879471 +0000 UTC m=+2.987982321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.329228 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99316ba7c5b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.390672475 +0000 UTC m=+2.989860096,LastTimestamp:2026-03-18 15:37:18.390672475 +0000 UTC m=+2.989860096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.334929 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df99316cffc76 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.392081526 +0000 UTC m=+2.991269147,LastTimestamp:2026-03-18 15:37:18.392081526 +0000 UTC m=+2.991269147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.340746 4939 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99316d278c5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.392244421 +0000 UTC m=+2.991432042,LastTimestamp:2026-03-18 15:37:18.392244421 +0000 UTC m=+2.991432042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.344026 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df99317d55ba4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.409210788 +0000 UTC m=+3.008398409,LastTimestamp:2026-03-18 15:37:18.409210788 +0000 UTC m=+3.008398409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.349326 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df993215ee43a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.56921913 +0000 UTC m=+3.168406751,LastTimestamp:2026-03-18 15:37:18.56921913 +0000 UTC m=+3.168406751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.353378 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df993227facf5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.588144885 +0000 UTC m=+3.187332526,LastTimestamp:2026-03-18 15:37:18.588144885 +0000 UTC m=+3.187332526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.357678 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df99322cb38f0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.59309592 +0000 UTC m=+3.192283541,LastTimestamp:2026-03-18 15:37:18.59309592 +0000 UTC m=+3.192283541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.361632 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99322cc5003 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.593167363 +0000 UTC m=+3.192355004,LastTimestamp:2026-03-18 15:37:18.593167363 +0000 UTC m=+3.192355004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.365272 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9932517bebe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.631665342 +0000 UTC m=+3.230852963,LastTimestamp:2026-03-18 15:37:18.631665342 +0000 UTC m=+3.230852963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.371027 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df993252bdb68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.6329834 +0000 UTC m=+3.232171031,LastTimestamp:2026-03-18 15:37:18.6329834 +0000 UTC m=+3.232171031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.374667 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9932f3e01a3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.801944995 +0000 UTC m=+3.401132616,LastTimestamp:2026-03-18 15:37:18.801944995 +0000 UTC m=+3.401132616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.380228 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9932f5e61a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.804066727 +0000 UTC m=+3.403254348,LastTimestamp:2026-03-18 15:37:18.804066727 +0000 UTC 
m=+3.403254348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.386032 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99330141cf8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.815976696 +0000 UTC m=+3.415164327,LastTimestamp:2026-03-18 15:37:18.815976696 +0000 UTC m=+3.415164327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.390269 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df99330191ad2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.816303826 +0000 UTC m=+3.415491467,LastTimestamp:2026-03-18 15:37:18.816303826 +0000 UTC m=+3.415491467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.395012 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99330266179 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:18.817173881 +0000 UTC m=+3.416361512,LastTimestamp:2026-03-18 15:37:18.817173881 +0000 UTC m=+3.416361512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.401375 4939 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9933c102674 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.017043572 +0000 UTC m=+3.616231183,LastTimestamp:2026-03-18 15:37:19.017043572 +0000 UTC m=+3.616231183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.405838 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9933d2b623f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.035605567 +0000 UTC m=+3.634793198,LastTimestamp:2026-03-18 15:37:19.035605567 +0000 UTC m=+3.634793198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.410762 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9933d3f87db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.036925915 +0000 UTC m=+3.636113536,LastTimestamp:2026-03-18 15:37:19.036925915 +0000 UTC m=+3.636113536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.414755 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df99345cccd71 openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.180402033 +0000 UTC m=+3.779589644,LastTimestamp:2026-03-18 15:37:19.180402033 +0000 UTC m=+3.779589644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.420876 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9934f56f2e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.340450536 +0000 UTC m=+3.939638167,LastTimestamp:2026-03-18 15:37:19.340450536 +0000 UTC m=+3.939638167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.424592 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99350204b31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.353645873 +0000 UTC m=+3.952833494,LastTimestamp:2026-03-18 15:37:19.353645873 +0000 UTC m=+3.952833494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.428636 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993524f3d71 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.390276977 +0000 UTC m=+3.989464598,LastTimestamp:2026-03-18 15:37:19.390276977 +0000 UTC m=+3.989464598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.432223 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df99353517cdf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.407201503 +0000 UTC m=+4.006389124,LastTimestamp:2026-03-18 15:37:19.407201503 +0000 UTC m=+4.006389124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.438841 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993831e44b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.20915116 +0000 UTC m=+4.808338781,LastTimestamp:2026-03-18 15:37:20.20915116 +0000 UTC m=+4.808338781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.442745 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.445187 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550"} Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.445367 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.445926 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df9933d3f87db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9933d3f87db openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.036925915 +0000 UTC m=+3.636113536,LastTimestamp:2026-03-18 15:37:20.217798424 +0000 UTC m=+4.816986035,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.446221 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.446355 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.446368 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.448116 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9938f8156c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.416970434 +0000 UTC m=+5.016158055,LastTimestamp:2026-03-18 15:37:20.416970434 +0000 UTC m=+5.016158055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.452004 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df9934f56f2e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9934f56f2e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.340450536 +0000 UTC m=+3.939638167,LastTimestamp:2026-03-18 15:37:20.423089644 +0000 UTC m=+5.022277265,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.455879 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993901ab899 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.427022489 +0000 UTC m=+5.026210110,LastTimestamp:2026-03-18 15:37:20.427022489 +0000 UTC m=+5.026210110,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.459271 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df99390301a32 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.42842373 +0000 UTC m=+5.027611351,LastTimestamp:2026-03-18 15:37:20.42842373 +0000 UTC m=+5.027611351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.464365 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df99350204b31\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df99350204b31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:19.353645873 +0000 UTC m=+3.952833494,LastTimestamp:2026-03-18 15:37:20.433663044 +0000 UTC m=+5.032850665,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.468087 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9939fb81a37 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container 
etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.688994871 +0000 UTC m=+5.288182492,LastTimestamp:2026-03-18 15:37:20.688994871 +0000 UTC m=+5.288182492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.471985 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993a1026442 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.710640706 +0000 UTC m=+5.309828327,LastTimestamp:2026-03-18 15:37:20.710640706 +0000 UTC m=+5.309828327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.476464 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993a113e1b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.711786929 +0000 UTC m=+5.310974550,LastTimestamp:2026-03-18 15:37:20.711786929 +0000 UTC m=+5.310974550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.480860 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993af6e7c73 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.952605811 +0000 UTC m=+5.551793432,LastTimestamp:2026-03-18 15:37:20.952605811 +0000 UTC m=+5.551793432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.485126 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993b02b7ea2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.964992674 +0000 UTC m=+5.564180295,LastTimestamp:2026-03-18 15:37:20.964992674 +0000 UTC m=+5.564180295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.488812 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993b03bc9f8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:20.966060536 +0000 UTC m=+5.565248187,LastTimestamp:2026-03-18 15:37:20.966060536 +0000 UTC m=+5.565248187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.494076 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993be740502 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:21.20462669 +0000 UTC m=+5.803814311,LastTimestamp:2026-03-18 15:37:21.20462669 +0000 UTC m=+5.803814311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.498054 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993bf76eed6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:21.221594838 +0000 UTC 
m=+5.820782459,LastTimestamp:2026-03-18 15:37:21.221594838 +0000 UTC m=+5.820782459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.503730 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993bf8ae0d8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:21.222901976 +0000 UTC m=+5.822089597,LastTimestamp:2026-03-18 15:37:21.222901976 +0000 UTC m=+5.822089597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.508173 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993ceca6944 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:21.478723908 +0000 UTC m=+6.077911569,LastTimestamp:2026-03-18 15:37:21.478723908 +0000 UTC m=+6.077911569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.511790 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df993cf980395 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:21.492198293 +0000 UTC m=+6.091385964,LastTimestamp:2026-03-18 15:37:21.492198293 +0000 UTC m=+6.091385964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.516849 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:38:11 crc kubenswrapper[4939]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189df99495fb9ed1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 15:38:11 crc kubenswrapper[4939]: body: Mar 18 15:38:11 crc kubenswrapper[4939]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:24.820614865 +0000 UTC m=+9.419802576,LastTimestamp:2026-03-18 15:37:24.820614865 +0000 UTC m=+9.419802576,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:38:11 crc kubenswrapper[4939]: > Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.521606 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df99495fd2fd8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:24.820717528 +0000 UTC m=+9.419905189,LastTimestamp:2026-03-18 15:37:24.820717528 +0000 UTC m=+9.419905189,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.527008 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 15:38:11 crc kubenswrapper[4939]: &Event{ObjectMeta:{kube-apiserver-crc.189df9960552dd8c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 15:38:11 crc kubenswrapper[4939]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 15:38:11 crc kubenswrapper[4939]: Mar 18 15:38:11 crc kubenswrapper[4939]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:30.983570828 +0000 UTC m=+15.582758469,LastTimestamp:2026-03-18 15:37:30.983570828 +0000 UTC m=+15.582758469,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 15:38:11 crc kubenswrapper[4939]: >
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.531864 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df996055432e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:30.98365821 +0000 UTC m=+15.582845841,LastTimestamp:2026-03-18 15:37:30.98365821 +0000 UTC m=+15.582845841,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.537265 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 18 15:38:11 crc kubenswrapper[4939]: &Event{ObjectMeta:{kube-apiserver-crc.189df996067cb694 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 18 15:38:11 crc kubenswrapper[4939]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Mar 18 15:38:11 crc kubenswrapper[4939]: 
Mar 18 15:38:11 crc kubenswrapper[4939]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:31.00309058 +0000 UTC m=+15.602278211,LastTimestamp:2026-03-18 15:37:31.00309058 +0000 UTC m=+15.602278211,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 15:38:11 crc kubenswrapper[4939]: >
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.541916 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df996055432e2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df996055432e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:30.98365821 +0000 UTC m=+15.582845841,LastTimestamp:2026-03-18 15:37:31.003158202 +0000 UTC m=+15.602345823,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.549024 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 18 15:38:11 crc kubenswrapper[4939]: &Event{ObjectMeta:{kube-controller-manager-crc.189df996ea0aa2b9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 18 15:38:11 crc kubenswrapper[4939]: body: 
Mar 18 15:38:11 crc kubenswrapper[4939]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:34.820819641 +0000 UTC m=+19.420007302,LastTimestamp:2026-03-18 15:37:34.820819641 +0000 UTC m=+19.420007302,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 15:38:11 crc kubenswrapper[4939]: >
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.552350 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df996ea0be6cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:34.820902603 +0000 UTC m=+19.420090254,LastTimestamp:2026-03-18 15:37:34.820902603 +0000 UTC m=+19.420090254,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.557481 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df996ea0aa2b9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 18 15:38:11 crc kubenswrapper[4939]: &Event{ObjectMeta:{kube-controller-manager-crc.189df996ea0aa2b9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 18 15:38:11 crc kubenswrapper[4939]: body: 
Mar 18 15:38:11 crc kubenswrapper[4939]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:34.820819641 +0000 UTC m=+19.420007302,LastTimestamp:2026-03-18 15:37:44.820732537 +0000 UTC m=+29.419920198,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 15:38:11 crc kubenswrapper[4939]: >
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.561187 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df996ea0be6cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df996ea0be6cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:34.820902603 +0000 UTC m=+19.420090254,LastTimestamp:2026-03-18 15:37:44.820930163 +0000 UTC m=+29.420117824,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.565823 4939 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df9993e55dd73 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:44.824970611 +0000 UTC m=+29.424158272,LastTimestamp:2026-03-18 15:37:44.824970611 +0000 UTC m=+29.424158272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.569743 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df992d155f094\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992d155f094 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.226455188 +0000 UTC m=+1.825642819,LastTimestamp:2026-03-18 15:37:44.951940294 +0000 UTC m=+29.551127955,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.573899 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df992e50a7806\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992e50a7806 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.557053446 +0000 UTC m=+2.156241067,LastTimestamp:2026-03-18 15:37:45.219085078 +0000 UTC m=+29.818272729,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.577435 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df992e6318ef2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df992e6318ef2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:17.576392434 +0000 UTC m=+2.175580055,LastTimestamp:2026-03-18 15:37:45.231898954 +0000 UTC m=+29.831086585,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.583154 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df996ea0aa2b9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 18 15:38:11 crc kubenswrapper[4939]: &Event{ObjectMeta:{kube-controller-manager-crc.189df996ea0aa2b9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 18 15:38:11 crc kubenswrapper[4939]: body: 
Mar 18 15:38:11 crc kubenswrapper[4939]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:34.820819641 +0000 UTC m=+19.420007302,LastTimestamp:2026-03-18 15:37:54.821364444 +0000 UTC m=+39.420552125,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 15:38:11 crc kubenswrapper[4939]: >
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.587305 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df996ea0be6cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df996ea0be6cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:34.820902603 +0000 UTC m=+19.420090254,LastTimestamp:2026-03-18 15:37:54.821669333 +0000 UTC m=+39.420857024,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:38:11 crc kubenswrapper[4939]: E0318 15:38:11.592691 4939 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df996ea0aa2b9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 18 15:38:11 crc kubenswrapper[4939]: &Event{ObjectMeta:{kube-controller-manager-crc.189df996ea0aa2b9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 18 15:38:11 crc kubenswrapper[4939]: body: 
Mar 18 15:38:11 crc kubenswrapper[4939]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:37:34.820819641 +0000 UTC m=+19.420007302,LastTimestamp:2026-03-18 15:38:04.820534099 +0000 UTC m=+49.419721740,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 15:38:11 crc kubenswrapper[4939]: >
Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.824409 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.824705 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.826114 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.826188 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.826208 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:11 crc kubenswrapper[4939]: I0318 15:38:11.828414 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.054570 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:12 crc kubenswrapper[4939]: E0318 15:38:12.391660 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.403227 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.404651 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.404698 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.404719 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.404753 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:38:12 crc kubenswrapper[4939]: E0318 15:38:12.409797 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.447959 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.449477 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.449529 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:12 crc kubenswrapper[4939]: I0318 15:38:12.449541 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.054594 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.452603 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.453560 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.455898 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550" exitCode=255
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.455950 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550"}
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.456000 4939 scope.go:117] "RemoveContainer" containerID="9155f8caddcca8a4f275f034d10729f9d0a25c29047bfa854afa43c578621ace"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.456179 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.457401 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.457583 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.457597 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:13 crc kubenswrapper[4939]: I0318 15:38:13.458583 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550"
Mar 18 15:38:13 crc kubenswrapper[4939]: E0318 15:38:13.458824 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:38:14 crc kubenswrapper[4939]: I0318 15:38:14.052028 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:14 crc kubenswrapper[4939]: I0318 15:38:14.460884 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 18 15:38:15 crc kubenswrapper[4939]: I0318 15:38:15.052910 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:16 crc kubenswrapper[4939]: I0318 15:38:16.053127 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:16 crc kubenswrapper[4939]: E0318 15:38:16.228228 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 15:38:16 crc kubenswrapper[4939]: I0318 15:38:16.833966 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:38:16 crc kubenswrapper[4939]: I0318 15:38:16.834180 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:16 crc kubenswrapper[4939]: I0318 15:38:16.835305 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:16 crc kubenswrapper[4939]: I0318 15:38:16.835336 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:16 crc kubenswrapper[4939]: I0318 15:38:16.835345 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:16 crc kubenswrapper[4939]: I0318 15:38:16.835894 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550"
Mar 18 15:38:16 crc kubenswrapper[4939]: E0318 15:38:16.836128 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:38:17 crc kubenswrapper[4939]: I0318 15:38:17.051561 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:17 crc kubenswrapper[4939]: I0318 15:38:17.605318 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 15:38:17 crc kubenswrapper[4939]: I0318 15:38:17.606434 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:17 crc kubenswrapper[4939]: I0318 15:38:17.613886 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:17 crc kubenswrapper[4939]: I0318 15:38:17.613983 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:17 crc kubenswrapper[4939]: I0318 15:38:17.614012 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:17 crc kubenswrapper[4939]: I0318 15:38:17.615361 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550"
Mar 18 15:38:17 crc kubenswrapper[4939]: E0318 15:38:17.615767 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:38:18 crc kubenswrapper[4939]: I0318 15:38:18.052138 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:19 crc kubenswrapper[4939]: I0318 15:38:19.051990 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:19 crc kubenswrapper[4939]: E0318 15:38:19.397928 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 15:38:19 crc kubenswrapper[4939]: I0318 15:38:19.410234 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:19 crc kubenswrapper[4939]: I0318 15:38:19.411756 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:19 crc kubenswrapper[4939]: I0318 15:38:19.411788 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:19 crc kubenswrapper[4939]: I0318 15:38:19.411797 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:19 crc kubenswrapper[4939]: I0318 15:38:19.411821 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:38:19 crc kubenswrapper[4939]: E0318 15:38:19.416725 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 15:38:20 crc kubenswrapper[4939]: I0318 15:38:20.053684 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:21 crc kubenswrapper[4939]: I0318 15:38:21.051923 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:21 crc kubenswrapper[4939]: I0318 15:38:21.132496 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:21 crc kubenswrapper[4939]: I0318 15:38:21.134182 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:21 crc kubenswrapper[4939]: I0318 15:38:21.134297 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:21 crc kubenswrapper[4939]: I0318 15:38:21.134374 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:22 crc kubenswrapper[4939]: I0318 15:38:22.051695 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:23 crc kubenswrapper[4939]: I0318 15:38:23.053215 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:24 crc kubenswrapper[4939]: I0318 15:38:24.054145 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:25 crc kubenswrapper[4939]: I0318 15:38:25.055572 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:26 crc kubenswrapper[4939]: I0318 15:38:26.053393 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:26 crc kubenswrapper[4939]: E0318 15:38:26.228404 4939 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 15:38:26 crc kubenswrapper[4939]: E0318 15:38:26.402338 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 15:38:26 crc kubenswrapper[4939]: I0318 15:38:26.417224 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:26 crc kubenswrapper[4939]: I0318 15:38:26.418478 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:26 crc kubenswrapper[4939]: I0318 15:38:26.418549 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:26 crc kubenswrapper[4939]: I0318 15:38:26.418566 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:26 crc kubenswrapper[4939]: I0318 15:38:26.418599 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 15:38:26 crc kubenswrapper[4939]: E0318 15:38:26.424835 4939 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 15:38:27 crc kubenswrapper[4939]: I0318 15:38:27.053703 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:27 crc kubenswrapper[4939]: I0318 15:38:27.821886 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 18 15:38:27 crc kubenswrapper[4939]: I0318 15:38:27.846163 4939 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 18 15:38:28 crc kubenswrapper[4939]: I0318 15:38:28.056934 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:29 crc kubenswrapper[4939]: I0318 15:38:29.053363 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:29 crc kubenswrapper[4939]: I0318 15:38:29.132431 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 15:38:29 crc kubenswrapper[4939]: I0318 15:38:29.133635 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:29 crc kubenswrapper[4939]: I0318 15:38:29.133694 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:29 crc kubenswrapper[4939]: I0318 15:38:29.133716 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:29 crc kubenswrapper[4939]: I0318 15:38:29.134763 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550"
Mar 18 15:38:29 crc kubenswrapper[4939]: E0318 15:38:29.135125 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 15:38:30 crc kubenswrapper[4939]: I0318 15:38:30.053952 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:31 crc kubenswrapper[4939]: I0318 15:38:31.053691 4939 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 15:38:31 crc kubenswrapper[4939]: I0318 15:38:31.205420 4939 csr.go:261] certificate signing request csr-dqns8 is approved, waiting to be issued
Mar 18 15:38:31 crc kubenswrapper[4939]: I0318 15:38:31.289849 4939 csr.go:257] certificate signing request csr-dqns8 is issued
Mar 18 15:38:31 crc kubenswrapper[4939]: I0318 15:38:31.350075 4939 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 18 15:38:31 crc kubenswrapper[4939]: I0318 15:38:31.351992 4939 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 18 15:38:31 crc kubenswrapper[4939]: I0318 15:38:31.877261 4939 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 15:38:31 crc kubenswrapper[4939]: W0318 15:38:31.877493 4939 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.072070 4939 apiserver.go:52] "Watching apiserver"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.083129 4939 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.083596 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.084154 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.084265 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.084158 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.084417 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.084739 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.085288 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.085015 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.084968 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.085476 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.087526 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.087661 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.087690 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.087890 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.087891 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.088196 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.088888 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.091643 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.091885 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.122034 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.140054 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.153034 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.159957 4939 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.171631 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.184683 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.196975 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.209688 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255187 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255263 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255311 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255344 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255379 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255412 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255443 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255582 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255614 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255646 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255676 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255741 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255775 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255828 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255858 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255890 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255922 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255959 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.255990 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256019 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256047 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256081 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256115 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256150 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256185 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256217 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256251 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 
15:38:32.256287 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256325 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256315 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256367 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256371 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256405 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256540 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256572 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256609 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256623 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256636 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256683 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256709 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256734 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256758 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256777 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256796 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256816 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256838 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256867 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") 
pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256901 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256925 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256948 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256970 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256992 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257014 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257038 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257065 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257086 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257110 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257139 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257171 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257198 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257225 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257254 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257280 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257403 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257436 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257465 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257495 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257542 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257570 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257597 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257623 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257646 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257671 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257691 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257722 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257745 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257772 4939 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257795 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257820 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257848 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257875 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257898 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257926 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257956 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257980 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258004 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258028 4939 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258054 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258075 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258098 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258119 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258142 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258166 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258193 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258226 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258251 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258277 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258302 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258327 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258350 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258377 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258400 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258423 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258445 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258467 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258492 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258535 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258559 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258584 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258608 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258635 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258657 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258685 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258708 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258731 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258752 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258780 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258802 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258822 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258848 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258871 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258894 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258919 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258951 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258976 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258999 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259026 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259056 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259084 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259110 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259137 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259161 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259189 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259213 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259240 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259268 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259291 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259319 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259342 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259366 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259389 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259412 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259441 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259466 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259492 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259662 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259694 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259751 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259779 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259803 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259826 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259849 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259878 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259904 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259922 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259948 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259972 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259997 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260022 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260048 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260073 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260170 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260203 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260228 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260254 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260278 4939 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256812 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.256872 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260392 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257212 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260422 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257288 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257444 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257499 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257472 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260451 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260634 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260662 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260694 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260717 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260741 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260763 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 
15:38:32.260790 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260817 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260842 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260866 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260891 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260918 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260992 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261018 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261040 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261064 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 15:38:32 crc 
kubenswrapper[4939]: I0318 15:38:32.261088 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261113 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261143 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261166 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261190 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261216 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261244 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261274 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261302 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261330 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:38:32 crc 
kubenswrapper[4939]: I0318 15:38:32.261355 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261561 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261592 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261646 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261677 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261701 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261731 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261762 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261788 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261815 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261844 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261871 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261897 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261920 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261940 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261961 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261983 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262019 4939 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262034 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262047 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262060 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262072 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262087 4939 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262100 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262118 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260580 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257573 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257859 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.265374 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258052 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258083 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258193 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258173 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258232 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.257985 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258540 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258634 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.258915 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259534 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259547 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259571 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259950 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.259972 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260147 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260659 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.260753 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261196 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261596 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261659 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261570 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.261755 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.262534 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.263124 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.263247 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.263310 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.263599 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.263927 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.263925 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.263946 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.264185 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:32.764105381 +0000 UTC m=+77.363293202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.264477 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.264019 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.264712 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.264909 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.264992 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.265247 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.265300 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.265521 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.265841 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.266078 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.266438 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.266930 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.267186 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.267312 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.267460 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.267820 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.268554 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.268636 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.268829 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.268865 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.269024 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.268591 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.269151 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.269343 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.269260 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.269387 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.269464 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.270246 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.274626 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275100 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275214 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275287 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275412 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275559 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275483 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275661 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.275893 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.276143 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.276171 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.276350 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.276416 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.276443 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:32.776418663 +0000 UTC m=+77.375606274 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.276358 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.276544 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.277312 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.277347 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.277483 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.277551 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.277609 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:32.777585788 +0000 UTC m=+77.376773529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.277728 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.277935 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.277941 4939 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.278084 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.278155 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.278177 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.279251 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.279430 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.279609 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.279618 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.279227 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.279939 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.280008 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.280056 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.280360 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.280560 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.280604 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.280627 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.281089 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.281099 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.281393 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.281698 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.281729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.281837 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.281873 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.282022 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.282269 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.282368 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.283213 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.283224 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.283329 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.283685 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.284786 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.284849 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.284882 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.285022 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:32.784980236 +0000 UTC m=+77.384168047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.288096 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.288688 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.291148 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-26 02:49:55.485927232 +0000 UTC Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.291207 4939 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6779h11m23.194725193s for next certificate rotation Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.291278 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.291413 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.291440 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.291460 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.291561 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:32.791538879 +0000 UTC m=+77.390726500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.292127 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.295856 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.298226 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.299224 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.300647 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.300717 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.300779 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.301106 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.301238 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.301389 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.302113 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.302151 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.302228 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.302746 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.303472 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.303639 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.303710 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.303792 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.303733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.303944 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304172 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304210 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304308 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304321 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304589 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304649 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.304754 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.305104 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.305262 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.305447 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.305475 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.305563 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306168 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306267 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306446 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306460 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306495 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306714 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306722 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306760 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306776 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.306892 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307075 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307328 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307385 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307441 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307472 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307546 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307649 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307744 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.307998 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308158 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308593 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308614 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308861 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308886 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308868 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.308933 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.309341 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.309589 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.309755 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.309827 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.310177 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.310357 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.310589 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.310657 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.310956 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.311019 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.311070 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.311144 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.311189 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.311477 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.311639 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.329911 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.336826 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.343231 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.346408 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362725 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362830 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362902 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362916 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362927 4939 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362938 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362949 4939 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 
18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362960 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362973 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.362990 4939 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363001 4939 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363013 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363023 4939 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363035 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363046 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363056 4939 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363070 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363080 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363091 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363100 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc 
kubenswrapper[4939]: I0318 15:38:32.363110 4939 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363119 4939 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363129 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363139 4939 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363148 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363161 4939 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363170 4939 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363180 4939 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363192 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363204 4939 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363217 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363227 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363237 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 
18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363249 4939 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363260 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363271 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363281 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363293 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363306 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363317 4939 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363327 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363338 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363347 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363357 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363370 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363381 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 
15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363391 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363401 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363412 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363422 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363434 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363444 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363453 4939 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363462 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363472 4939 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363483 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363493 4939 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363528 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363541 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc 
kubenswrapper[4939]: I0318 15:38:32.363552 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363564 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363588 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363600 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363612 4939 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363622 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363633 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363673 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363683 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363693 4939 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363702 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363713 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363723 4939 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc 
kubenswrapper[4939]: I0318 15:38:32.363732 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363743 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363755 4939 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363765 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363775 4939 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363785 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363794 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363804 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363813 4939 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363822 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363831 4939 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363841 4939 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363851 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: 
I0318 15:38:32.363861 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363873 4939 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363882 4939 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363891 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363937 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363965 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.363993 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364004 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364015 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364025 4939 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364036 4939 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364045 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364055 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on 
node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364065 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364075 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364084 4939 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364096 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364107 4939 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364117 4939 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364128 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364137 4939 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364148 4939 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364156 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364167 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364177 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364187 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" 
Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364073 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364197 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364259 4939 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364278 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364293 4939 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364312 4939 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364325 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364338 4939 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364351 4939 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364365 4939 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364377 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364390 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364402 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364413 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364425 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364439 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364451 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364463 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364481 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364495 4939 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364530 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364543 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364556 4939 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364567 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364579 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364591 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364602 4939 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364613 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364625 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364636 4939 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364647 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364660 4939 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364671 4939 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364685 4939 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364697 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364709 4939 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364723 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364734 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364745 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364756 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364768 4939 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364781 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364796 4939 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364805 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364814 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364822 4939 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364831 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364839 4939 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364848 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364856 4939 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364865 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364874 4939 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364882 4939 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364890 4939 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364899 4939 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364908 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364918 4939 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364926 4939 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364935 4939 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364944 4939 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364953 4939 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364961 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364970 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364978 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364987 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.364996 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365005 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365014 4939 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365022 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365032 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365040 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365049 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365058 4939 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365067 4939 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365076 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365088 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365099 4939 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365109 4939 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.365118 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.405164 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.422214 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.427574 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:38:32 crc kubenswrapper[4939]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 15:38:32 crc kubenswrapper[4939]: if [[ -f "/env/_master" ]]; then Mar 18 15:38:32 crc kubenswrapper[4939]: set -o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: source "/env/_master" Mar 18 15:38:32 crc kubenswrapper[4939]: set +o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: fi Mar 18 15:38:32 crc kubenswrapper[4939]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 15:38:32 crc kubenswrapper[4939]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 15:38:32 crc kubenswrapper[4939]: ho_enable="--enable-hybrid-overlay" Mar 18 15:38:32 crc kubenswrapper[4939]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 15:38:32 crc kubenswrapper[4939]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 15:38:32 crc kubenswrapper[4939]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 15:38:32 crc kubenswrapper[4939]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 15:38:32 crc kubenswrapper[4939]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --webhook-host=127.0.0.1 \ Mar 18 15:38:32 crc kubenswrapper[4939]: --webhook-port=9743 \ Mar 18 15:38:32 crc kubenswrapper[4939]: ${ho_enable} \ Mar 18 15:38:32 crc kubenswrapper[4939]: --enable-interconnect \ Mar 18 15:38:32 crc kubenswrapper[4939]: --disable-approver \ Mar 18 15:38:32 crc kubenswrapper[4939]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --wait-for-kubernetes-api=200s \ Mar 18 15:38:32 crc kubenswrapper[4939]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --loglevel="${LOGLEVEL}" Mar 18 15:38:32 crc kubenswrapper[4939]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 15:38:32 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.431299 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:38:32 crc kubenswrapper[4939]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 15:38:32 crc kubenswrapper[4939]: if [[ -f "/env/_master" ]]; then Mar 18 15:38:32 crc kubenswrapper[4939]: set -o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: source "/env/_master" Mar 18 15:38:32 crc kubenswrapper[4939]: set +o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: fi Mar 18 15:38:32 crc kubenswrapper[4939]: Mar 18 15:38:32 crc kubenswrapper[4939]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 15:38:32 crc kubenswrapper[4939]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 15:38:32 crc kubenswrapper[4939]: --disable-webhook \ Mar 18 15:38:32 crc kubenswrapper[4939]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --loglevel="${LOGLEVEL}" Mar 18 15:38:32 crc kubenswrapper[4939]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 15:38:32 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.433546 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.433630 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.441579 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:38:32 crc kubenswrapper[4939]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 15:38:32 crc kubenswrapper[4939]: set -o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 15:38:32 crc kubenswrapper[4939]: source /etc/kubernetes/apiserver-url.env Mar 18 15:38:32 crc kubenswrapper[4939]: else Mar 18 15:38:32 crc kubenswrapper[4939]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 15:38:32 crc kubenswrapper[4939]: exit 1 Mar 18 15:38:32 crc kubenswrapper[4939]: fi Mar 18 15:38:32 crc kubenswrapper[4939]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 15:38:32 crc kubenswrapper[4939]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value
:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 15:38:32 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.443734 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.459899 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.461660 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.511058 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8de365da69941b11aa96f9e77515759bfb3f9f781d6c5ab7af7e769074f2394b"} Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.513111 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.513494 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a3f64a503b09842f7d2a6efb4a72e809fc20e3ae3166644a9fa7a0992d219995"} Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.514430 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ee2ade968e0316611698daa6357fc27f755d098d66e5a859091935ed5e936152"} Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.514486 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.515337 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:38:32 crc kubenswrapper[4939]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 15:38:32 crc kubenswrapper[4939]: set -o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 15:38:32 crc kubenswrapper[4939]: source /etc/kubernetes/apiserver-url.env Mar 18 15:38:32 crc kubenswrapper[4939]: else Mar 18 15:38:32 crc kubenswrapper[4939]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 15:38:32 crc kubenswrapper[4939]: exit 1 Mar 18 15:38:32 crc kubenswrapper[4939]: fi Mar 18 15:38:32 crc kubenswrapper[4939]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 15:38:32 crc kubenswrapper[4939]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 15:38:32 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.516398 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.518319 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:38:32 crc kubenswrapper[4939]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 15:38:32 crc kubenswrapper[4939]: if [[ -f "/env/_master" ]]; then Mar 18 15:38:32 crc kubenswrapper[4939]: set -o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: source "/env/_master" Mar 18 15:38:32 crc kubenswrapper[4939]: set +o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: fi Mar 18 15:38:32 crc kubenswrapper[4939]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 15:38:32 crc kubenswrapper[4939]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 15:38:32 crc kubenswrapper[4939]: ho_enable="--enable-hybrid-overlay" Mar 18 15:38:32 crc kubenswrapper[4939]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 15:38:32 crc kubenswrapper[4939]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 15:38:32 crc kubenswrapper[4939]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 15:38:32 crc kubenswrapper[4939]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 15:38:32 crc kubenswrapper[4939]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --webhook-host=127.0.0.1 \ Mar 18 15:38:32 crc kubenswrapper[4939]: --webhook-port=9743 \ Mar 18 15:38:32 crc kubenswrapper[4939]: ${ho_enable} \ Mar 18 15:38:32 crc kubenswrapper[4939]: --enable-interconnect \ Mar 18 15:38:32 crc kubenswrapper[4939]: --disable-approver \ Mar 18 15:38:32 crc kubenswrapper[4939]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --wait-for-kubernetes-api=200s \ Mar 18 15:38:32 crc kubenswrapper[4939]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --loglevel="${LOGLEVEL}" Mar 18 15:38:32 crc kubenswrapper[4939]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 18 15:38:32 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.520489 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:38:32 crc kubenswrapper[4939]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 15:38:32 crc kubenswrapper[4939]: if [[ -f "/env/_master" ]]; then Mar 18 15:38:32 crc kubenswrapper[4939]: set -o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: source "/env/_master" Mar 18 15:38:32 crc kubenswrapper[4939]: set +o allexport Mar 18 15:38:32 crc kubenswrapper[4939]: fi Mar 18 15:38:32 crc kubenswrapper[4939]: Mar 18 15:38:32 crc kubenswrapper[4939]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 15:38:32 crc kubenswrapper[4939]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 15:38:32 crc kubenswrapper[4939]: --disable-webhook \ Mar 18 15:38:32 crc kubenswrapper[4939]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 15:38:32 crc kubenswrapper[4939]: --loglevel="${LOGLEVEL}" Mar 18 15:38:32 crc kubenswrapper[4939]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 15:38:32 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.521700 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.524032 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.537089 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.547608 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.558086 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.570364 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.580975 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.589361 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.598171 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.612165 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.624227 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.635589 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.649604 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.768289 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.768595 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:33.768568006 +0000 UTC m=+78.367755637 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.869445 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.869551 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:32 crc kubenswrapper[4939]: I0318 15:38:32.869588 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:32 
crc kubenswrapper[4939]: I0318 15:38:32.869615 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869683 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869743 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869691 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869752 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869763 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869783 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869747 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869825 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869809 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:33.869786909 +0000 UTC m=+78.468974530 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869878 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:33.869858611 +0000 UTC m=+78.469046232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869894 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:33.869885412 +0000 UTC m=+78.469073033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:32 crc kubenswrapper[4939]: E0318 15:38:32.869909 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:33.869902462 +0000 UTC m=+78.469090083 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.425941 4939 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.428126 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.428188 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.428215 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.428321 4939 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.441368 4939 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.441672 4939 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.443937 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.444190 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.444340 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.444531 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.444697 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.468899 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.475203 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.475344 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.475445 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.475498 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.475567 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.491848 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.504760 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.504811 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.504840 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.504901 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.504923 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.537170 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.542579 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.542923 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.543190 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.543301 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.543391 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.560274 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.565443 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.565639 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.565765 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.565896 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.566031 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.580487 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.580982 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.582980 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.583121 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.583221 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.583311 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.583408 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.686624 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.687111 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.687149 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.687175 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.687193 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.778357 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.778609 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:35.778584251 +0000 UTC m=+80.377771912 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.791080 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.791159 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.791178 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.791206 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.791224 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.879320 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.879419 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.879488 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.879589 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879775 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879801 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879820 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879848 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879892 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:35.879867956 +0000 UTC m=+80.479055617 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879936 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:35.879911337 +0000 UTC m=+80.479099008 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879946 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879970 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.880119 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:35.880085642 +0000 UTC m=+80.479273303 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.879997 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.880174 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:33 crc kubenswrapper[4939]: E0318 15:38:33.880278 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:35.880240117 +0000 UTC m=+80.479427768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.894037 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.894114 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.894125 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.894145 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.894157 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.996984 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.997143 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.997160 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.997179 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:33 crc kubenswrapper[4939]: I0318 15:38:33.997193 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:33Z","lastTransitionTime":"2026-03-18T15:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.100084 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.100113 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.100121 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.100136 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.100145 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.133919 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.134053 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:34 crc kubenswrapper[4939]: E0318 15:38:34.134173 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:34 crc kubenswrapper[4939]: E0318 15:38:34.134318 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.134618 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:34 crc kubenswrapper[4939]: E0318 15:38:34.134750 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.140663 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.142338 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.145274 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.146704 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.148585 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.150543 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.152668 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.155030 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.157370 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 15:38:34 
crc kubenswrapper[4939]: I0318 15:38:34.159542 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.160632 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.162931 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.163985 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.164690 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.166027 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.166748 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.168007 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.168566 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.169281 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.170630 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.171215 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.172919 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.173461 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.174905 4939 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.175448 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.177135 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.178687 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.179285 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.180647 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.181227 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.182297 4939 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.182430 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.184556 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.185827 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.186405 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.189221 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.191074 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.193254 4939 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.195197 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.197575 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.198952 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.201997 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.204537 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.204781 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.204932 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.205084 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.205178 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.205344 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.206072 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.207546 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.208278 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.209540 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.210588 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.211683 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.212220 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.212950 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.213993 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.214647 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.215602 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.308354 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.308695 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.308854 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.309005 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.309140 4939 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.412064 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.412110 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.412121 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.412138 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.412150 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.515046 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.515709 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.515745 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.515768 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.515784 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.619855 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.619939 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.619959 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.619991 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.620011 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.723805 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.723875 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.723899 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.723929 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.723952 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.827304 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.827359 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.827377 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.827402 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.827422 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.930265 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.930805 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.930986 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.931141 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:34 crc kubenswrapper[4939]: I0318 15:38:34.931275 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:34Z","lastTransitionTime":"2026-03-18T15:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.034230 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.034311 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.034332 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.034364 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.034385 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.137321 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.137413 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.137431 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.137458 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.137476 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.241619 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.241692 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.241715 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.241750 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.241774 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.344998 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.345059 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.345075 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.345104 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.345121 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.447310 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.447413 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.447440 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.447479 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.447556 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.551833 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.551911 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.551940 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.551970 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.551995 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.656641 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.656713 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.656731 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.656760 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.656776 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.760950 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.761005 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.761023 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.761050 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.761068 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.803076 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.803256 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:39.803218087 +0000 UTC m=+84.402405768 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.863696 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.863771 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.863795 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.863826 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.863856 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.904177 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.904287 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.904348 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904392 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.904402 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904544 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:39.904481071 +0000 UTC m=+84.503668722 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904573 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904627 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904651 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904664 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904691 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904754 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904755 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:39.904689857 +0000 UTC m=+84.503877508 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904776 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904809 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:39.90479236 +0000 UTC m=+84.503980011 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:35 crc kubenswrapper[4939]: E0318 15:38:35.904889 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:39.904862002 +0000 UTC m=+84.504049713 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.967205 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.967259 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.967278 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.967302 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:35 crc kubenswrapper[4939]: I0318 15:38:35.967319 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:35Z","lastTransitionTime":"2026-03-18T15:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.070621 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.070683 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.070708 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.070739 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.070763 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.132605 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:36 crc kubenswrapper[4939]: E0318 15:38:36.132843 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.132910 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.133003 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:36 crc kubenswrapper[4939]: E0318 15:38:36.133178 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:36 crc kubenswrapper[4939]: E0318 15:38:36.133479 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.153934 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.169487 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.174203 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.174406 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.174624 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.174780 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.174900 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.185767 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.204644 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.222332 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.240445 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.277572 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.278090 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.278256 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.278416 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.278639 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.339263 4939 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.381612 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.381735 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.381756 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.381789 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.381808 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.485220 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.485339 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.485363 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.485390 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.485405 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.588255 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.588309 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.588323 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.588346 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.588362 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.699127 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.699173 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.699187 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.699206 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.699221 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.801933 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.801983 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.801996 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.802013 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.802026 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.904472 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.904562 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.904576 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.904595 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:36 crc kubenswrapper[4939]: I0318 15:38:36.904609 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:36Z","lastTransitionTime":"2026-03-18T15:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.007977 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.008018 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.008030 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.008049 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.008061 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.111315 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.111373 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.111415 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.111440 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.111462 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.214939 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.215013 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.215032 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.215060 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.215078 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.318173 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.318667 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.318865 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.319086 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.319268 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.421161 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.421397 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.421535 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.421614 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.421678 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.525093 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.525174 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.525201 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.525233 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.525257 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.628282 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.628360 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.628377 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.628407 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.628432 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.731488 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.731595 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.731614 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.731645 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.731663 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.835295 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.835384 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.835409 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.835440 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.835467 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.938108 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.938492 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.938596 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.938704 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:37 crc kubenswrapper[4939]: I0318 15:38:37.938785 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:37Z","lastTransitionTime":"2026-03-18T15:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.042102 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.042146 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.042161 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.042183 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.042198 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.133167 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.133195 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.133303 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:38 crc kubenswrapper[4939]: E0318 15:38:38.133469 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:38 crc kubenswrapper[4939]: E0318 15:38:38.133693 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:38 crc kubenswrapper[4939]: E0318 15:38:38.133871 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.144879 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.144935 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.144959 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.144987 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.145009 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.160232 4939 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.248549 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.248616 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.248635 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.248661 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.248678 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.351670 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.351760 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.351774 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.351796 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.351811 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.456843 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.456897 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.456907 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.456926 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.456939 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.560099 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.560162 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.560181 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.560207 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.560225 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.662951 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.663018 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.663038 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.663068 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.663087 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.766667 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.766747 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.766766 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.766793 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.766810 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.870568 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.870637 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.870659 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.870697 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.870716 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.974267 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.974347 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.974366 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.974397 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:38 crc kubenswrapper[4939]: I0318 15:38:38.974423 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:38Z","lastTransitionTime":"2026-03-18T15:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.077091 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.077157 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.077167 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.077187 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.077200 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.180684 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.180776 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.180801 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.180841 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.180865 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.284559 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.284647 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.284668 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.284700 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.284726 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.388479 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.388602 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.388624 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.388657 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.388677 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.492803 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.492889 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.492909 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.492941 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.492963 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.595479 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.595574 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.595592 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.595618 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.595636 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.698849 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.698925 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.698940 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.698966 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.698983 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.802139 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.802185 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.802196 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.802212 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.802225 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.840986 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.841198 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:38:47.841158415 +0000 UTC m=+92.440346046 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.905559 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.905637 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.905652 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.905673 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.905687 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:39Z","lastTransitionTime":"2026-03-18T15:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.942349 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.942424 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.942473 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:39 crc kubenswrapper[4939]: I0318 15:38:39.942573 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
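The UnmountVolume.TearDown failure at the top of this block is a registration race rather than a storage fault: the kubelet has just restarted, the hostpath provisioner's node plugin has not yet re-registered over the kubelet's plugin-registration socket, so the driver name kubevirt.io.hostpath-provisioner is missing from the registered list and the teardown is requeued (durationBeforeRetry 8s). Drivers registered on a node are published in its CSINode object; the snippet below is a small client-go sketch for inspecting that list, with kubeconfig handling assumed purely for illustration.

    package main

    import (
        "context"
        "fmt"
        "os"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    // Sketch: list the CSI drivers currently registered on node "crc".
    // Until kubevirt.io.hostpath-provisioner appears here, the
    // UnmountVolume retries above will keep failing.
    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        clientset := kubernetes.NewForConfigOrDie(cfg)

        csiNode, err := clientset.StorageV1().CSINodes().Get(context.Background(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            fmt.Println("registered CSI driver:", d.Name)
        }
    }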
"openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.942794 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:47.942759079 +0000 UTC m=+92.541946740 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.943470 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.943542 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.943570 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.943701 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:47.943670185 +0000 UTC m=+92.542857816 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.943712 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.943861 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:47.94384209 +0000 UTC m=+92.543029751 (durationBeforeRetry 8s). 
Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.944213 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.944432 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.944669 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:38:39 crc kubenswrapper[4939]: E0318 15:38:39.944954 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:38:47.944915072 +0000 UTC m=+92.544102893 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.009079 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.009171 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.009196 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.009226 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
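All of the "object ... not registered" mount failures above share one shape: configmap, secret, and projected service-account-token volume sources are resolved through the kubelet's informer-backed object cache, and a pod's objects are only registered there once the watch caches have synced after restart, so each MountVolume.SetUp is requeued with a backoff (here 8s, next attempt at 15:38:47, m=+92.5). The objects themselves exist in the API; a direct read such as the sketch below (kubeconfig handling assumed purely for illustration) would already succeed while the kubelet's cached lookups still fail.

    package main

    import (
        "context"
        "fmt"
        "os"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    // Sketch: fetch two of the objects the kubelet could not resolve
    // above. This is a plain API read; the kubelet serves these lookups
    // from its informer-backed cache, which is why they report
    // "not registered" until the caches sync.
    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        clientset := kubernetes.NewForConfigOrDie(cfg)
        ctx := context.Background()

        cm, err := clientset.CoreV1().ConfigMaps("openshift-network-console").Get(ctx, "networking-console-plugin", metav1.GetOptions{})
        if err != nil {
            fmt.Println("configmap:", err)
        } else {
            fmt.Println("configmap networking-console-plugin has", len(cm.Data), "keys")
        }

        if _, err := clientset.CoreV1().Secrets("openshift-network-console").Get(ctx, "networking-console-plugin-cert", metav1.GetOptions{}); err != nil {
            fmt.Println("secret:", err)
        } else {
            fmt.Println("secret networking-console-plugin-cert exists")
        }
    }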
Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.112870 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.113217 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.113369 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.113476 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.113572 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.132652 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.132761 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:40 crc kubenswrapper[4939]: E0318 15:38:40.132932 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:40 crc kubenswrapper[4939]: E0318 15:38:40.133237 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.133435 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:40 crc kubenswrapper[4939]: E0318 15:38:40.133862 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.217473 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.217561 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.217579 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.217607 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.217626 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.320788 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.320836 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.320845 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.320865 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.320875 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.423391 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.423486 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.423551 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.423591 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.423658 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.526974 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.527109 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.527129 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.527194 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.527214 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.631057 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.631169 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.631193 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.631255 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.631277 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.734615 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.734702 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.734721 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.734754 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.734776 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.837778 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.837827 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.837839 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.837857 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.837874 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.940953 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.940999 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.941010 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.941028 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:40 crc kubenswrapper[4939]: I0318 15:38:40.941038 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:40Z","lastTransitionTime":"2026-03-18T15:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.043495 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.043563 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.043578 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.043596 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.043609 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.146584 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.146663 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.146688 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.146720 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.146742 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.253994 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.254059 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.254089 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.254117 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.254135 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.358350 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.358724 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.358923 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.359086 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.359253 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.462238 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.462557 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.462693 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.462854 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.462968 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.566156 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.566466 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.566940 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.567431 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.567858 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.672054 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.672240 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.672261 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.672285 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.672303 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.775071 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.775842 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.776140 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.776411 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.776899 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.881248 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.881309 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.881328 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.881355 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.881377 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.984328 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.984884 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.984990 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.985077 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:41 crc kubenswrapper[4939]: I0318 15:38:41.985164 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:41Z","lastTransitionTime":"2026-03-18T15:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.087919 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.088001 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.088025 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.088055 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.088078 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.132650 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.132881 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.132650 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:42 crc kubenswrapper[4939]: E0318 15:38:42.133105 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:42 crc kubenswrapper[4939]: E0318 15:38:42.133497 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:42 crc kubenswrapper[4939]: E0318 15:38:42.133678 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.154284 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.155311 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550" Mar 18 15:38:42 crc kubenswrapper[4939]: E0318 15:38:42.156084 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.191333 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.191698 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.191916 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.192084 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.192222 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.295535 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.295846 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.295958 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.296080 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.296182 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.398953 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.399302 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.399367 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.399443 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.399585 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.501758 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.501804 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.501816 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.501833 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.501847 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.546161 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550" Mar 18 15:38:42 crc kubenswrapper[4939]: E0318 15:38:42.546803 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.605601 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.605919 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.606038 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.606190 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.606306 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.708458 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.708657 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.708679 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.708698 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.708707 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.811371 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.811440 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.811464 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.811497 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.811543 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.914338 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.914603 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.914669 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.914754 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:42 crc kubenswrapper[4939]: I0318 15:38:42.914811 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:42Z","lastTransitionTime":"2026-03-18T15:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.016853 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.016907 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.016919 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.016939 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.016952 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.119765 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.119812 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.119824 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.119841 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.119853 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.223091 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.223155 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.223174 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.223201 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.223225 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.325701 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.325779 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.325804 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.325837 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.325862 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.429342 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.429390 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.429409 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.429434 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.429454 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.531908 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.531965 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.531982 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.532006 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.532023 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.635129 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.635181 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.635198 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.635224 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.635246 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.698285 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.698347 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.698371 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.698403 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.698423 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: E0318 15:38:43.711865 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.716734 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.716786 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.716803 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.716831 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.716850 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: E0318 15:38:43.733939 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.739395 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.739441 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.739459 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.739482 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.739499 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: E0318 15:38:43.756213 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.760340 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.760409 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.760429 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.760453 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.760472 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: E0318 15:38:43.776546 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.780853 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.780959 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.781027 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.781088 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.781148 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: E0318 15:38:43.794993 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:43 crc kubenswrapper[4939]: E0318 15:38:43.795319 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.797433 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.797562 4939 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.797638 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.797710 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.797800 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.900679 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.900771 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.900820 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.900846 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:43 crc kubenswrapper[4939]: I0318 15:38:43.900865 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:43Z","lastTransitionTime":"2026-03-18T15:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.003731 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.004294 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.004489 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.004852 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.005042 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.107711 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.107775 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.107794 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.107823 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.107844 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.132245 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.132376 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.132568 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:44 crc kubenswrapper[4939]: E0318 15:38:44.132780 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:44 crc kubenswrapper[4939]: E0318 15:38:44.132950 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:44 crc kubenswrapper[4939]: E0318 15:38:44.133085 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.151789 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.210097 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.210144 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.210152 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.210169 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.210183 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.313848 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.313913 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.313925 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.313943 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.313955 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.416941 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.417005 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.417018 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.417040 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.417054 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.520090 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.520165 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.520190 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.520220 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.520241 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.623046 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.623114 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.623129 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.623151 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.623172 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.727009 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.727061 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.727073 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.727095 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.727106 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.830925 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.830971 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.830983 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.831000 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.831013 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.933188 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.933229 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.933241 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.933257 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:44 crc kubenswrapper[4939]: I0318 15:38:44.933270 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:44Z","lastTransitionTime":"2026-03-18T15:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.036377 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.036446 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.036469 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.036498 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.036555 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.139344 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.139425 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.139531 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.139567 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.139782 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.242896 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.242941 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.242951 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.242969 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.242981 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.345593 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.345634 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.345643 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.345661 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.345672 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.448396 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.448444 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.448455 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.448469 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.448476 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.551024 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.551076 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.551088 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.551108 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.551123 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.657212 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.657258 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.657269 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.657284 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.657297 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.760616 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.760674 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.760691 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.760711 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.760725 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.863391 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.863448 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.863463 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.863480 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.863490 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.966354 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.966423 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.966441 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.966467 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:45 crc kubenswrapper[4939]: I0318 15:38:45.966488 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:45Z","lastTransitionTime":"2026-03-18T15:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.069410 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.069470 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.069486 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.069528 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.069541 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.132834 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.133084 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:46 crc kubenswrapper[4939]: E0318 15:38:46.133087 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:46 crc kubenswrapper[4939]: E0318 15:38:46.133181 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.133262 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:46 crc kubenswrapper[4939]: E0318 15:38:46.133332 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.146876 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.158079 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.174649 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.174762 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.174842 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.174914 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.174937 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.179435 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.206134 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.235130 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.260534 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.273983 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.277463 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.277518 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.277528 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.277549 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.277561 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.293294 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.386367 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.386713 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.386785 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.386871 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.386951 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.490210 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.490276 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.490295 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.490321 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.490339 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.593278 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.593336 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.593354 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.593379 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.593399 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.695988 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.696052 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.696069 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.696097 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.696114 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.798969 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.799061 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.799087 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.799120 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.799146 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.902446 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.902536 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.902558 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.902584 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:46 crc kubenswrapper[4939]: I0318 15:38:46.902601 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:46Z","lastTransitionTime":"2026-03-18T15:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.005079 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.005176 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.005194 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.005221 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.005243 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.108744 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.108809 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.108827 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.108852 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.108875 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.211963 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.212040 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.212061 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.212088 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.212109 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.315188 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.315678 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.315696 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.316130 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.316164 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.417845 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.417925 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.417949 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.417973 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.417990 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.519866 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.519901 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.519911 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.519926 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.519937 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.563335 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.567675 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.567702 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.581936 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.589326 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.600285 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.610093 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.619879 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.621931 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.622010 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.622026 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.622047 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.622060 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.642835 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.659027 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.671976 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.685844 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.698333 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.712904 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.724741 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.725445 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.725534 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.725564 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.725595 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.725623 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.736541 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.746326 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.762842 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.786299 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.827789 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.827858 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.827905 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.827929 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.827946 4939 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.920733 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:38:47 crc kubenswrapper[4939]: E0318 15:38:47.920947 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:39:03.920914422 +0000 UTC m=+108.520102063 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.931187 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.931230 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.931245 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.931267 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:47 crc kubenswrapper[4939]: I0318 15:38:47.931283 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:47Z","lastTransitionTime":"2026-03-18T15:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.021753 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.021877 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.021936 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.021996 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022123 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022219 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:04.022193637 +0000 UTC m=+108.621381298 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022434 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022652 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022722 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022726 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022831 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:04.022804225 +0000 UTC m=+108.621991866 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.022862 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:04.022848546 +0000 UTC m=+108.622036177 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.023011 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.023041 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.023071 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.023176 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:04.023136394 +0000 UTC m=+108.622324065 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.034005 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.034071 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.034088 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.034115 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.034131 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.132676 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.132780 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.132703 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.132860 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.132952 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:48 crc kubenswrapper[4939]: E0318 15:38:48.133056 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.136657 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.136691 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.136700 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.136714 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.136724 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.185368 4939 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.239167 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.239240 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.239262 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.239287 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.239306 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.342147 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.342192 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.342204 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.342221 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.342234 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.444027 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.444065 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.444074 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.444089 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.444108 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.546675 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.546732 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.546751 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.546778 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.546795 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.572268 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.591192 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900
92272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.607740 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.620974 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.636568 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.649862 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.649897 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.649905 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.649919 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.649928 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.654143 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.669587 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.684787 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.701271 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:48Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.752822 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.752852 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.752861 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.752876 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.752886 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.855267 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.855312 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.855327 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.855345 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.855358 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.957771 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.957811 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.957823 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.957841 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:48 crc kubenswrapper[4939]: I0318 15:38:48.957851 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:48Z","lastTransitionTime":"2026-03-18T15:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.060336 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.060396 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.060409 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.060433 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.060447 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.163108 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.163172 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.163192 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.163217 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.163236 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.265779 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.265827 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.265839 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.265859 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.265869 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.368131 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.368175 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.368185 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.368202 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.368213 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.471179 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.471237 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.471258 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.471294 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.471313 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.573559 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.573597 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.573625 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.573642 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.573654 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.676244 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.676299 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.676314 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.676336 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.676354 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.779142 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.779200 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.779214 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.779233 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.779245 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.881471 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.881564 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.881617 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.881638 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.881651 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.984468 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.984604 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.984650 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.984686 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:49 crc kubenswrapper[4939]: I0318 15:38:49.984713 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:49Z","lastTransitionTime":"2026-03-18T15:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.087001 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.087037 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.087048 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.087066 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.087079 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.132409 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.132566 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:50 crc kubenswrapper[4939]: E0318 15:38:50.132633 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:38:50 crc kubenswrapper[4939]: E0318 15:38:50.132770 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.132862 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:50 crc kubenswrapper[4939]: E0318 15:38:50.132943 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.190344 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.190402 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.190419 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.190443 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.190464 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.293058 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.293103 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.293117 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.293565 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.293595 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.401905 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.401956 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.401974 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.401999 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.402017 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.504077 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.504108 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.504119 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.504135 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.504145 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.607153 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.607203 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.607218 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.607238 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.607254 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.710524 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.710583 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.710604 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.710624 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.710638 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.813413 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.813567 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.813589 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.813614 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.813633 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.915930 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.916297 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.916437 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.916673 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:50 crc kubenswrapper[4939]: I0318 15:38:50.916894 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:50Z","lastTransitionTime":"2026-03-18T15:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.020346 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.020409 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.020421 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.020439 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.020469 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.123772 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.123837 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.123855 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.123880 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.123901 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.226850 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.227214 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.227351 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.227485 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.227682 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.330522 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.330553 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.330563 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.330576 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.330599 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.433197 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.433261 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.433275 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.433294 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.433307 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.536043 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.536118 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.536138 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.536165 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.536182 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.638848 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.638913 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.638935 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.638961 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.638978 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.742038 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.742100 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.742116 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.742159 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.742174 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.845069 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.845106 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.845115 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.845128 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.845136 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.948157 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.948207 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.948216 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.948236 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:51 crc kubenswrapper[4939]: I0318 15:38:51.948246 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:51Z","lastTransitionTime":"2026-03-18T15:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.051183 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.051226 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.051235 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.051249 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.051258 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.132560 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.132665 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.132611 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:38:52 crc kubenswrapper[4939]: E0318 15:38:52.132833 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:38:52 crc kubenswrapper[4939]: E0318 15:38:52.132930 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:38:52 crc kubenswrapper[4939]: E0318 15:38:52.133122 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.154110 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.154149 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.154163 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.154201 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.154216 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.257280 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.257353 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.257419 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.257454 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.257478 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.360644 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.360695 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.360713 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.360736 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.360753 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.463865 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.463944 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.463967 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.463996 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.464017 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.566911 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.566974 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.567011 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.567046 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.567075 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.669626 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.669692 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.669715 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.669752 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.669772 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.772644 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.772726 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.772752 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.772784 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.772805 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.875835 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.875881 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.875893 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.875912 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.875925 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.978275 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.978344 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.978364 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.978393 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:52 crc kubenswrapper[4939]: I0318 15:38:52.978414 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:52Z","lastTransitionTime":"2026-03-18T15:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.080538 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.080592 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.080604 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.080622 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.080638 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.183760 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.183812 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.183829 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.183852 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.183874 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.286831 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.286881 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.286893 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.286912 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.286927 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.358882 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4ptxp"]
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.359560 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-6q7lf"]
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.360339 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.361121 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4ptxp"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.364753 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.367171 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.368731 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.368779 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.368970 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.369020 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.369108 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.369343 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.389180 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.389239 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.389257 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.389280 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.389317 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.398263 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.413695 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z"
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.427391 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.455422 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.477017 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67ddaef3-086a-4a4d-931d-c0e82663eb6a-hosts-file\") pod \"node-resolver-4ptxp\" (UID: \"67ddaef3-086a-4a4d-931d-c0e82663eb6a\") " pod="openshift-dns/node-resolver-4ptxp" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.477100 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a32d41a6-8ebb-4871-b660-91407cbaa5c5-mcd-auth-proxy-config\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.477165 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a32d41a6-8ebb-4871-b660-91407cbaa5c5-rootfs\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.477210 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a32d41a6-8ebb-4871-b660-91407cbaa5c5-proxy-tls\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.477545 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wljc2\" (UniqueName: \"kubernetes.io/projected/a32d41a6-8ebb-4871-b660-91407cbaa5c5-kube-api-access-wljc2\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.477672 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bxd\" (UniqueName: \"kubernetes.io/projected/67ddaef3-086a-4a4d-931d-c0e82663eb6a-kube-api-access-88bxd\") pod \"node-resolver-4ptxp\" (UID: \"67ddaef3-086a-4a4d-931d-c0e82663eb6a\") " pod="openshift-dns/node-resolver-4ptxp" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.485259 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778
e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.492422 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.492497 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.492549 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.492577 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.492595 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.504381 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.520858 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.536296 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.552499 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.565459 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579038 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88bxd\" (UniqueName: \"kubernetes.io/projected/67ddaef3-086a-4a4d-931d-c0e82663eb6a-kube-api-access-88bxd\") pod \"node-resolver-4ptxp\" (UID: \"67ddaef3-086a-4a4d-931d-c0e82663eb6a\") " pod="openshift-dns/node-resolver-4ptxp" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579098 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wljc2\" (UniqueName: \"kubernetes.io/projected/a32d41a6-8ebb-4871-b660-91407cbaa5c5-kube-api-access-wljc2\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579165 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67ddaef3-086a-4a4d-931d-c0e82663eb6a-hosts-file\") pod \"node-resolver-4ptxp\" (UID: 
\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\") " pod="openshift-dns/node-resolver-4ptxp" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579203 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a32d41a6-8ebb-4871-b660-91407cbaa5c5-mcd-auth-proxy-config\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579243 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a32d41a6-8ebb-4871-b660-91407cbaa5c5-rootfs\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579277 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a32d41a6-8ebb-4871-b660-91407cbaa5c5-proxy-tls\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579321 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67ddaef3-086a-4a4d-931d-c0e82663eb6a-hosts-file\") pod \"node-resolver-4ptxp\" (UID: \"67ddaef3-086a-4a4d-931d-c0e82663eb6a\") " pod="openshift-dns/node-resolver-4ptxp" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.579562 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a32d41a6-8ebb-4871-b660-91407cbaa5c5-rootfs\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.582691 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a32d41a6-8ebb-4871-b660-91407cbaa5c5-mcd-auth-proxy-config\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.593027 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.598232 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a32d41a6-8ebb-4871-b660-91407cbaa5c5-proxy-tls\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.601170 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.601210 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.601223 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.601244 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.601260 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.602238 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bxd\" (UniqueName: \"kubernetes.io/projected/67ddaef3-086a-4a4d-931d-c0e82663eb6a-kube-api-access-88bxd\") pod \"node-resolver-4ptxp\" (UID: \"67ddaef3-086a-4a4d-931d-c0e82663eb6a\") " pod="openshift-dns/node-resolver-4ptxp" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.611380 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.614150 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wljc2\" (UniqueName: \"kubernetes.io/projected/a32d41a6-8ebb-4871-b660-91407cbaa5c5-kube-api-access-wljc2\") pod \"machine-config-daemon-6q7lf\" (UID: \"a32d41a6-8ebb-4871-b660-91407cbaa5c5\") " pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.625567 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.658053 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.678804 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.686312 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.694876 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4ptxp" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.696158 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.704796 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.704843 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.704853 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.704871 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.704882 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.711787 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: W0318 15:38:53.728266 4939 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda32d41a6_8ebb_4871_b660_91407cbaa5c5.slice/crio-a4e539501dbdd7148e01a6acc9cb555c7649a2a45445c324d11c36f537eee849 WatchSource:0}: Error finding container a4e539501dbdd7148e01a6acc9cb555c7649a2a45445c324d11c36f537eee849: Status 404 returned error can't find the container with id a4e539501dbdd7148e01a6acc9cb555c7649a2a45445c324d11c36f537eee849 Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.730914 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: W0318 15:38:53.735375 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ddaef3_086a_4a4d_931d_c0e82663eb6a.slice/crio-c91e71a6afbe47d8ad8163a1a5aa0c747af948592ab064bb88e3fb5b84c061f1 WatchSource:0}: Error finding container c91e71a6afbe47d8ad8163a1a5aa0c747af948592ab064bb88e3fb5b84c061f1: Status 404 returned error can't find the container with id c91e71a6afbe47d8ad8163a1a5aa0c747af948592ab064bb88e3fb5b84c061f1 Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.744282 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.758251 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.785009 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-x2ztl"] Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.785762 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xmzwg"] Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.786022 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.786388 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.792362 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.792650 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.793639 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.793828 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.793957 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.794274 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.794465 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.804317 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.808333 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.808383 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.808399 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.808425 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.808442 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.834638 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.848913 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.860595 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.860656 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.860670 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.860696 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.860714 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.871416 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: E0318 15:38:53.875779 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.879806 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.879839 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.879847 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.879864 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.879875 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
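The two failures above repeat throughout this section: every node and pod status patch is rejected because the call to the network-node-identity webhook at https://127.0.0.1:9743 fails TLS verification (the serving certificate expired 2025-08-24T17:21:41Z, well before the node clock of 2026-03-18T15:38:53Z), and the node reports NotReady because no CNI configuration exists in /etc/kubernetes/cni/net.d/. A minimal Go sketch follows (the endpoint is taken from the log; the program itself is an illustrative assumption, not part of the cluster's tooling) that dials the webhook port and reports the peer certificate's validity window the way the failing handshake evaluates it:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip chain verification so an expired certificate can still be inspected
	// instead of aborting the handshake the way the failing client call does.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		log.Fatal("no peer certificate presented")
	}
	cert := state.PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		// Mirrors the kubelet error text: "current time ... is after ..."
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	}
}

Run on the node itself, this would print an "expired" line matching the x509 error in every webhook Post above. The entries that follow resume the kubelet's volume reconciliation, which proceeds independently of the webhook failures.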
Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882430 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pzh4\" (UniqueName: \"kubernetes.io/projected/055d05da-715f-47bd-88a3-6a93965a2f65-kube-api-access-2pzh4\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882470 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-k8s-cni-cncf-io\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882490 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-hostroot\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882529 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-etc-kubernetes\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882569 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882585 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/055d05da-715f-47bd-88a3-6a93965a2f65-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882606 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-system-cni-dir\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882623 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-cnibin\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882641 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-netns\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882667 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-system-cni-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882688 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-kubelet\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882710 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-os-release\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882729 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-daemon-config\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882747 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-cnibin\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882765 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6693c593-9b18-435e-8a3a-91d3e33c3c51-cni-binary-copy\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882789 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-cni-bin\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882809 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-cni-multus\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882827 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-multus-certs\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882850 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-os-release\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882870 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/055d05da-715f-47bd-88a3-6a93965a2f65-cni-binary-copy\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882889 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-socket-dir-parent\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882912 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9277t\" (UniqueName: \"kubernetes.io/projected/6693c593-9b18-435e-8a3a-91d3e33c3c51-kube-api-access-9277t\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882941 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-cni-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.882959 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-conf-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.887058 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: E0318 15:38:53.893032 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.896497 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.896549 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.896560 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.896577 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.896587 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.902671 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: E0318 15:38:53.914766 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.916225 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.918958 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.919002 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.919016 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.919036 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.919048 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.931661 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: E0318 15:38:53.932421 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e
4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.936749 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.936813 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.936829 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.936849 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.936861 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.947902 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: E0318 15:38:53.950048 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: E0318 15:38:53.950210 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.952408 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.952439 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.952449 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.952468 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.952481 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:53Z","lastTransitionTime":"2026-03-18T15:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.967365 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.983917 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-hostroot\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.983966 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-etc-kubernetes\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984009 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984029 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/055d05da-715f-47bd-88a3-6a93965a2f65-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984052 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-system-cni-dir\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984071 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-cnibin\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984091 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-netns\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984085 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-hostroot\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984116 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-system-cni-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984144 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-kubelet\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984200 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-kubelet\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984249 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-etc-kubernetes\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984253 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-os-release\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984285 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-daemon-config\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984309 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-cnibin\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984328 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6693c593-9b18-435e-8a3a-91d3e33c3c51-cni-binary-copy\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984346 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-cni-bin\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984365 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-cni-multus\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984388 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-multus-certs\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984411 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-os-release\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984430 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/055d05da-715f-47bd-88a3-6a93965a2f65-cni-binary-copy\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " 
pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984450 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-socket-dir-parent\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984494 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9277t\" (UniqueName: \"kubernetes.io/projected/6693c593-9b18-435e-8a3a-91d3e33c3c51-kube-api-access-9277t\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984563 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-cni-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984583 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-conf-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984610 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pzh4\" (UniqueName: \"kubernetes.io/projected/055d05da-715f-47bd-88a3-6a93965a2f65-kube-api-access-2pzh4\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984629 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-k8s-cni-cncf-io\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984703 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-k8s-cni-cncf-io\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.984898 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-os-release\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.985809 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-daemon-config\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.985875 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-cnibin\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986200 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-cni-bin\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986233 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-netns\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986233 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-cnibin\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986257 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-var-lib-cni-multus\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986289 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-host-run-multus-certs\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986331 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-os-release\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986455 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-system-cni-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986456 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-system-cni-dir\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986613 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-cni-dir\") pod \"multus-xmzwg\" (UID: 
\"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986730 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-conf-dir\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.986799 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6693c593-9b18-435e-8a3a-91d3e33c3c51-multus-socket-dir-parent\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.987552 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/055d05da-715f-47bd-88a3-6a93965a2f65-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.987840 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/055d05da-715f-47bd-88a3-6a93965a2f65-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.989848 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:53Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.989948 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/055d05da-715f-47bd-88a3-6a93965a2f65-cni-binary-copy\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:53 crc kubenswrapper[4939]: I0318 15:38:53.990842 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6693c593-9b18-435e-8a3a-91d3e33c3c51-cni-binary-copy\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.004908 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pzh4\" (UniqueName: 
\"kubernetes.io/projected/055d05da-715f-47bd-88a3-6a93965a2f65-kube-api-access-2pzh4\") pod \"multus-additional-cni-plugins-x2ztl\" (UID: \"055d05da-715f-47bd-88a3-6a93965a2f65\") " pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.006037 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9277t\" (UniqueName: \"kubernetes.io/projected/6693c593-9b18-435e-8a3a-91d3e33c3c51-kube-api-access-9277t\") pod \"multus-xmzwg\" (UID: \"6693c593-9b18-435e-8a3a-91d3e33c3c51\") " pod="openshift-multus/multus-xmzwg" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.007436 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.034348 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"fi
nishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.049583 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.054594 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.054627 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.054641 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.054661 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.054677 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.062897 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.075217 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.090101 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.105576 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.122602 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.129183 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xmzwg" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.132225 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.132302 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.132324 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.132371 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:54 crc kubenswrapper[4939]: E0318 15:38:54.132362 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:54 crc kubenswrapper[4939]: E0318 15:38:54.132466 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:54 crc kubenswrapper[4939]: E0318 15:38:54.132559 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.136905 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: W0318 15:38:54.152028 4939 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6693c593_9b18_435e_8a3a_91d3e33c3c51.slice/crio-9f738e3ed86c4f4303b2dc202c0458e550fa6f1a86eac496768610628d834d23 WatchSource:0}: Error finding container 9f738e3ed86c4f4303b2dc202c0458e550fa6f1a86eac496768610628d834d23: Status 404 returned error can't find the container with id 9f738e3ed86c4f4303b2dc202c0458e550fa6f1a86eac496768610628d834d23 Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.155604 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.158559 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.158596 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.158611 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.158637 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.158654 4939 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.161341 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l79pv"] Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.163276 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.166354 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.166442 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.166537 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.166890 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.166914 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.167269 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.167292 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.171598 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.188623 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.202964 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.223680 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.241290 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.261400 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.261443 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.261454 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.261473 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.261485 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.264047 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.279880 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287787 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-systemd-units\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287816 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287837 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-config\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287853 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-kubelet\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287867 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-node-log\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287891 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-log-socket\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287907 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-netns\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287936 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-ovn-kubernetes\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287951 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-netd\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287970 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-slash\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287984 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-systemd\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.287999 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-script-lib\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288014 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-ovn\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288028 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-etc-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288041 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-var-lib-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288061 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288081 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-bin\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288094 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p47dg\" (UniqueName: \"kubernetes.io/projected/acafcc67-568f-415b-b907-c1de4c851fa7-kube-api-access-p47dg\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288107 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-env-overrides\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.288120 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acafcc67-568f-415b-b907-c1de4c851fa7-ovn-node-metrics-cert\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.294438 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.308637 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.323779 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.343955 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.362858 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.364321 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.364387 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.364401 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.364419 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.364451 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.382130 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.388814 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-kubelet\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.388904 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-node-log\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.388941 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-log-socket\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389038 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-netns\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389071 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-log-socket\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389094 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-node-log\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389108 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-ovn-kubernetes\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389002 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-kubelet\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389078 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-ovn-kubernetes\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389189 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-netd\") pod \"ovnkube-node-l79pv\" (UID: 
\"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389210 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-slash\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389223 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-netd\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389232 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-systemd\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389288 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-slash\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389292 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-script-lib\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389318 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-systemd\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389333 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-etc-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389359 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-ovn\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389375 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-etc-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389413 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-var-lib-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389324 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-netns\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389495 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-ovn\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389381 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-var-lib-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389610 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-bin\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389635 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389699 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-env-overrides\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389721 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acafcc67-568f-415b-b907-c1de4c851fa7-ovn-node-metrics-cert\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389728 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-bin\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389742 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-p47dg\" (UniqueName: \"kubernetes.io/projected/acafcc67-568f-415b-b907-c1de4c851fa7-kube-api-access-p47dg\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389799 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-systemd-units\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389824 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389865 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389877 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-config\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.389894 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-systemd-units\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.390021 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-openvswitch\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.390604 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-env-overrides\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.390669 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-config\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.390941 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-script-lib\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.393986 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acafcc67-568f-415b-b907-c1de4c851fa7-ovn-node-metrics-cert\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.395172 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.408818 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.408886 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p47dg\" (UniqueName: \"kubernetes.io/projected/acafcc67-568f-415b-b907-c1de4c851fa7-kube-api-access-p47dg\") pod \"ovnkube-node-l79pv\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.422053 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.467155 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.467201 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.467213 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.467232 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.467243 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.479522 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:38:54 crc kubenswrapper[4939]: W0318 15:38:54.494222 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacafcc67_568f_415b_b907_c1de4c851fa7.slice/crio-832f7543688668aa2a1e8e74acdb998d00385834fa22b85119476e6485cda5b9 WatchSource:0}: Error finding container 832f7543688668aa2a1e8e74acdb998d00385834fa22b85119476e6485cda5b9: Status 404 returned error can't find the container with id 832f7543688668aa2a1e8e74acdb998d00385834fa22b85119476e6485cda5b9 Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.578387 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.578547 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.578580 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.578614 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.578639 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.607596 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.607667 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"832f7543688668aa2a1e8e74acdb998d00385834fa22b85119476e6485cda5b9"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.610310 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerStarted","Data":"428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.610349 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerStarted","Data":"9f738e3ed86c4f4303b2dc202c0458e550fa6f1a86eac496768610628d834d23"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.614613 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4ptxp" event={"ID":"67ddaef3-086a-4a4d-931d-c0e82663eb6a","Type":"ContainerStarted","Data":"41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.614675 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4ptxp" event={"ID":"67ddaef3-086a-4a4d-931d-c0e82663eb6a","Type":"ContainerStarted","Data":"c91e71a6afbe47d8ad8163a1a5aa0c747af948592ab064bb88e3fb5b84c061f1"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.615377 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerStarted","Data":"e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.615409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerStarted","Data":"dd6d8e854642568612373539eb8c4767c9081f29feedd5b08f88d175ff910bab"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.617381 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.617410 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.617426 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" 
event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"a4e539501dbdd7148e01a6acc9cb555c7649a2a45445c324d11c36f537eee849"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.621553 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.632514 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.647379 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.666669 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.681640 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.683613 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.683645 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.683656 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.683672 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.683682 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.696393 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.710901 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.722809 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.735452 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.757543 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"fi
nishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.772933 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.786010 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.786480 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.786564 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.786585 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.786609 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.786628 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.802101 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e
6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.814995 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-
run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.832914 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.847698 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.877583 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.888725 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.888751 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.888760 4939 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.888776 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.888788 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.915822 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.950116 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.991288 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.991322 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.991333 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.991346 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:54 crc kubenswrapper[4939]: I0318 15:38:54.991356 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:54Z","lastTransitionTime":"2026-03-18T15:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.004391 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.067834 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.091947 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.095637 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.095695 4939 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.095710 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.095728 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.095741 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.114072 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.155015 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.195046 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.197843 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.197889 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.197903 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.197922 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.197935 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.237667 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.300687 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.300727 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.300738 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.300755 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.300766 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.402798 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.403250 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.403411 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.403599 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.403730 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.507186 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.507671 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.507686 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.507710 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.507729 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.609474 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.609707 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.609737 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.609755 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.609764 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.638843 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef" exitCode=0 Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.638923 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.638954 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.638963 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.638972 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.638985 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.638996 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.640468 4939 generic.go:334] "Generic (PLEG): container finished" podID="055d05da-715f-47bd-88a3-6a93965a2f65" containerID="e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18" exitCode=0 Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.640541 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerDied","Data":"e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.655475 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.676964 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.690163 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.704891 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.716579 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.716617 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.716626 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.716640 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.716652 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.717610 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.731165 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.748479 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.772161 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.788436 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.804883 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.818390 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.818432 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.818445 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.818461 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.818471 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.824909 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.843093 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.857383 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.921184 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.921277 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.921293 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.921315 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:55 crc kubenswrapper[4939]: I0318 15:38:55.921329 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:55Z","lastTransitionTime":"2026-03-18T15:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.024309 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.024347 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.024358 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.024375 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.024386 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.127040 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.127075 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.127085 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.127099 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.127111 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.132588 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:56 crc kubenswrapper[4939]: E0318 15:38:56.132672 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.132834 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.132891 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:56 crc kubenswrapper[4939]: E0318 15:38:56.133330 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:56 crc kubenswrapper[4939]: E0318 15:38:56.133449 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.133864 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.176547 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.200719 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.219197 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.231612 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.231663 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.231681 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.231705 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.231721 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.238908 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.262391 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.276060 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.289996 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.302173 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.312415 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.327783 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.333485 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.333520 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.333529 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.333542 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.333551 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.357414 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.372661 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.386815 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.427481 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-49sqv"] Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.427919 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.430160 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.430986 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.431262 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.431607 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.436310 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.436371 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.436391 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.436416 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.436433 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.445492 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.456330 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.474683 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.516031 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8626b206-a707-4a0f-b29f-b6fd365b1a89-serviceca\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.516177 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nnf4\" (UniqueName: \"kubernetes.io/projected/8626b206-a707-4a0f-b29f-b6fd365b1a89-kube-api-access-4nnf4\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.516320 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8626b206-a707-4a0f-b29f-b6fd365b1a89-host\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.521785 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.539062 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.539114 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.539128 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.539146 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.539158 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.555979 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.600919 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.617760 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nnf4\" (UniqueName: \"kubernetes.io/projected/8626b206-a707-4a0f-b29f-b6fd365b1a89-kube-api-access-4nnf4\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.617838 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8626b206-a707-4a0f-b29f-b6fd365b1a89-host\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.617903 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8626b206-a707-4a0f-b29f-b6fd365b1a89-serviceca\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.618027 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8626b206-a707-4a0f-b29f-b6fd365b1a89-host\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.619638 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8626b206-a707-4a0f-b29f-b6fd365b1a89-serviceca\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.637566 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.641553 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.641611 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.641630 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.641654 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.641673 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.648337 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.650348 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.652787 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.653284 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.656662 4939 generic.go:334] "Generic (PLEG): container finished" podID="055d05da-715f-47bd-88a3-6a93965a2f65" containerID="d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3" exitCode=0 Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.656719 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerDied","Data":"d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.675264 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nnf4\" (UniqueName: \"kubernetes.io/projected/8626b206-a707-4a0f-b29f-b6fd365b1a89-kube-api-access-4nnf4\") pod \"node-ca-49sqv\" (UID: \"8626b206-a707-4a0f-b29f-b6fd365b1a89\") " pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.701160 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.737129 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.742600 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-49sqv" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.751187 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.751249 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.751271 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.751296 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.751320 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.774713 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.821961 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"fi
nishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.853972 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.856424 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.856480 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.856497 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.856540 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.856559 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.894809 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.936433 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\
\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.959666 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.959713 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.959728 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.959749 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.959765 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:56Z","lastTransitionTime":"2026-03-18T15:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:56 crc kubenswrapper[4939]: I0318 15:38:56.973061 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.017662 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.053087 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.062340 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.062417 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.062436 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.062465 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.062488 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.095084 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.133533 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.165393 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.165453 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.165471 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.165497 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.165547 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.188007 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.221137 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.255460 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.267303 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.267341 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.267350 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.267366 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.267376 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.294579 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.335885 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.369820 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.369860 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.369872 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.369893 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.369904 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.376875 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.415351 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.457271 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.471788 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.471829 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.471839 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.471856 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.471870 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.493779 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.574907 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.574969 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.574981 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.575001 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.575019 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.662155 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-49sqv" event={"ID":"8626b206-a707-4a0f-b29f-b6fd365b1a89","Type":"ContainerStarted","Data":"537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.662234 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-49sqv" event={"ID":"8626b206-a707-4a0f-b29f-b6fd365b1a89","Type":"ContainerStarted","Data":"71a22be90307e9b8097b920596a1fd6746f71aae15b56c65ea43dc0f86bbcf47"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.664589 4939 generic.go:334] "Generic (PLEG): container finished" podID="055d05da-715f-47bd-88a3-6a93965a2f65" containerID="9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444" exitCode=0 Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.664672 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerDied","Data":"9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.678170 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.678235 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.678250 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.678271 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.678284 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.684082 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.722995 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.738287 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.753558 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.767102 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.777573 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.780940 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.780964 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.780974 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.780989 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.780999 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.788873 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.812188 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.857671 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.884490 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.884558 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.884570 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.884589 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.884602 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.892230 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.938214 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.977738 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.987236 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.987272 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.987284 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.987302 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:57 crc kubenswrapper[4939]: I0318 15:38:57.987314 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:57Z","lastTransitionTime":"2026-03-18T15:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.016352 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.051199 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.089965 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.090011 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.090023 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.090039 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.090050 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.092973 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.132804 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.132846 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.132902 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:38:58 crc kubenswrapper[4939]: E0318 15:38:58.132925 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:38:58 crc kubenswrapper[4939]: E0318 15:38:58.133012 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:38:58 crc kubenswrapper[4939]: E0318 15:38:58.133081 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.134896 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\
\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.172247 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.191686 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.191729 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.191748 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.191766 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.191776 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.212700 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.251057 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.289188 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.294215 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.294270 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.294283 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.294302 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.294317 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.334950 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.379628 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.396959 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.397017 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.397036 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.397059 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.397078 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.417840 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 
15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.453421 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.494059 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.500569 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.500608 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.500617 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.500630 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.500639 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.546426 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.576440 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.603244 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.603307 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.603323 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.603342 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.603355 4939 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.619757 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.681750 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.685281 4939 generic.go:334] "Generic (PLEG): container finished" podID="055d05da-715f-47bd-88a3-6a93965a2f65" containerID="0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274" exitCode=0 Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.685333 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerDied","Data":"0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.700062 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.711747 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.711789 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.711798 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.711816 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.711827 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.720781 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.759934 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.777916 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.814760 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.814931 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.814964 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.814977 4939 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.814995 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.815007 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.851961 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.896879 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.917522 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.917559 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.917575 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.917599 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.917613 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:58Z","lastTransitionTime":"2026-03-18T15:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.933316 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:58 crc kubenswrapper[4939]: I0318 15:38:58.976028 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.021194 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.021256 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.021265 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.021284 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.021296 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.023059 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.071959 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.104370 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.123743 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.123771 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.123779 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.123793 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.123801 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.134985 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.176210 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.225622 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.225652 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.225660 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.225674 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.225684 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.328291 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.328351 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.328362 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.328379 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.328391 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.431989 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.432024 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.432033 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.432047 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.432056 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.534645 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.534699 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.534710 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.534738 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.534755 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.639485 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.639557 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.639620 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.639646 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.639661 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.695433 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerStarted","Data":"b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.719771 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.738921 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.743162 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.743206 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.743224 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.743256 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.743279 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.757427 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.798007 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.824982 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.846140 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.846209 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.846222 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.846245 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.846260 4939 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.847928 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.862918 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.877774 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.892362 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"na
me\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.911898 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.931175 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.948782 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.949933 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.949994 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.950014 4939 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.950042 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.950062 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:38:59Z","lastTransitionTime":"2026-03-18T15:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.969953 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:38:59 crc kubenswrapper[4939]: I0318 15:38:59.986233 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:38:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.053597 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.053659 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.053673 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.053695 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.053713 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.132420 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.132565 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.132606 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:00 crc kubenswrapper[4939]: E0318 15:39:00.132816 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:00 crc kubenswrapper[4939]: E0318 15:39:00.133133 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:00 crc kubenswrapper[4939]: E0318 15:39:00.133280 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.156692 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.156859 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.156891 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.156974 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.157047 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.266424 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.267108 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.267133 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.267165 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.267184 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.369614 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.370215 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.370305 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.370391 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.370455 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.473695 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.473737 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.473748 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.473768 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.473783 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.575723 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.575758 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.575771 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.575789 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.575800 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.677957 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.678001 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.678015 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.678036 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.678048 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.702147 4939 generic.go:334] "Generic (PLEG): container finished" podID="055d05da-715f-47bd-88a3-6a93965a2f65" containerID="b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675" exitCode=0 Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.702210 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerDied","Data":"b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.706970 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.707786 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.707813 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.707826 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.724439 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c
5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.744152 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.745655 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.752285 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.768228 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.780825 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.781575 4939 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.781632 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.781650 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.781674 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.781692 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.794413 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.815667 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.834525 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.857862 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.871460 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z"
Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.884328 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.884368 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.884380 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.884399 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.884412 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.888593 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.905714 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.922277 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.937965 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.954811 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.981996 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-v
ars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.987088 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.987132 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.987141 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.987156 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.987168 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:00Z","lastTransitionTime":"2026-03-18T15:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:00 crc kubenswrapper[4939]: I0318 15:39:00.999416 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:00Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.012062 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.025879 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.044908 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d6037
20ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\
\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.068600 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.087577 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.089581 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.089624 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.089640 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.089698 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.089716 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.105874 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.118931 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.128804 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.146879 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.159383 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.171771 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.182611 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.193022 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.193093 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.193113 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.193144 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.193167 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.296677 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.296747 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.296764 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.296791 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.296807 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.398941 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.398979 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.398990 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.399007 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.399018 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.501534 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.501579 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.501602 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.501625 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.501644 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.604560 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.604611 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.604627 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.604645 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.604661 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.707053 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.707095 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.707106 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.707122 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.707134 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.715268 4939 generic.go:334] "Generic (PLEG): container finished" podID="055d05da-715f-47bd-88a3-6a93965a2f65" containerID="dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70" exitCode=0 Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.715333 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerDied","Data":"dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.737877 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562
f8d65d498e981abe0ac41303\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.754936 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.773425 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.785526 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.799220 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.809263 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.809333 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.809346 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.809362 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.809373 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.814522 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.830465 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.842643 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.855187 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.875643 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.897213 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.912153 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.912383 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.912396 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.912414 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.912426 4939 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:01Z","lastTransitionTime":"2026-03-18T15:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.913609 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.925901 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:01 crc kubenswrapper[4939]: I0318 15:39:01.942950 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.016355 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.016403 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.016413 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.016432 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.016445 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.119410 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.119451 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.119459 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.119476 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.119488 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.133087 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.133114 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:02 crc kubenswrapper[4939]: E0318 15:39:02.133186 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:02 crc kubenswrapper[4939]: E0318 15:39:02.133280 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.133478 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:02 crc kubenswrapper[4939]: E0318 15:39:02.133579 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.224351 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.224388 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.224400 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.224416 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.224429 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.327085 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.327118 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.327128 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.327146 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.327163 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.429568 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.429617 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.429630 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.429650 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.429661 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.532012 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.532050 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.532060 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.532077 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.532089 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.634060 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.634092 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.634103 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.634116 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.634125 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.722172 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" event={"ID":"055d05da-715f-47bd-88a3-6a93965a2f65","Type":"ContainerStarted","Data":"152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.736934 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.736984 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.736994 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.737014 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.737028 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.738782 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.753980 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.769088 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.782336 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.798594 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.822801 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.839009 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.839045 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.839055 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.839072 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.839083 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.842021 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.855160 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.869531 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.891005 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.906763 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.921410 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.935535 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.942887 4939 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.942958 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.942978 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.943007 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.943034 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:02Z","lastTransitionTime":"2026-03-18T15:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:02 crc kubenswrapper[4939]: I0318 15:39:02.956562 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.045969 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.045997 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.046009 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.046025 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.046036 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.148807 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.148869 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.148878 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.148896 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.148908 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.251259 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.251328 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.251346 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.251373 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.251394 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.354336 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.354379 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.354391 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.354407 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.354418 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.458314 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.458394 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.458417 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.458447 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.458470 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.561152 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.561201 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.561213 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.561231 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.561244 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.664398 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.664466 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.664482 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.664532 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.664550 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.730470 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/0.log" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.735173 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303" exitCode=1 Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.735250 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.736358 4939 scope.go:117] "RemoveContainer" containerID="d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.759611 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.767217 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.767252 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.767268 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.767289 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.767304 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.791553 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.809933 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.825278 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.838733 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.854407 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.866413 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.872797 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.872849 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.872863 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.872881 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.872893 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.881330 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.902906 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/r
un/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:03Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:03.190491 6839 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190659 6839 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190672 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:39:03.190721 6839 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 15:39:03.190728 6839 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 15:39:03.190759 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:03.190815 6839 factory.go:656] Stopping watch factory\\\\nI0318 15:39:03.190849 6839 handler.go:208] Removed *v1.EgressFirewall 
event handler 9\\\\nI0318 15:39:03.190867 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:03.190875 6839 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:03.190882 6839 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.920258 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.938491 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.957070 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.973906 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.976666 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.976691 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.976700 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.976719 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.976733 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:03Z","lastTransitionTime":"2026-03-18T15:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:03 crc kubenswrapper[4939]: I0318 15:39:03.990598 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:03Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:03.999960 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.000100 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:39:36.00007943 +0000 UTC m=+140.599267061 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.079579 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.079621 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.079636 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.079654 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.079665 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.100757 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.100828 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.100867 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.100910 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101018 
4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101034 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101061 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101079 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101018 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101128 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:36.101106738 +0000 UTC m=+140.700294399 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101188 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101250 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101271 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101209 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:36.10117851 +0000 UTC m=+140.700366231 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101362 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:36.101321404 +0000 UTC m=+140.700509055 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.101409 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:39:36.101383326 +0000 UTC m=+140.700571177 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.132308 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.132366 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.132418 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.132466 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.132651 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.132783 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.182320 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.182368 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.182379 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.182397 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.182618 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.266268 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.266315 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.266327 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.266345 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.266356 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.288479 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.294053 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.294088 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.294097 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.294113 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.294123 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.306600 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.310861 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.310913 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.310926 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.310945 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.310957 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.327172 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.332430 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.332479 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
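Every status patch in this stretch is rejected for the same reason: the node.network-node-identity.openshift.io webhook serves on 127.0.0.1:9743 with a certificate whose validity window ended 2025-08-24T17:21:41Z, while the node clock reads 2026-03-18. Below is a minimal Go diagnostic sketch of the same validity-window check that makes crypto/x509 fail the handshake; the address is taken from the error line above, and skipping chain verification is purely so the certificate can be fetched and inspected.

```go
// certcheck.go - minimal diagnostic sketch (not part of the logged system):
// fetch the webhook's serving certificate and report its validity window.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // endpoint named in the webhook error above

	// InsecureSkipVerify lets the handshake complete even though normal
	// verification would fail with "certificate has expired".
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial %s: %v", addr, err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))

	// The same window check that produces "x509: certificate has expired
	// or is not yet valid" in the kubelet log.
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		fmt.Printf("EXPIRED/NOT YET VALID at %s\n", now.Format(time.RFC3339))
	}
}
```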
event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.332494 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.332533 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.332547 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.344883 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.349362 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.349427 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
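The payload the kubelet keeps resending is a strategic merge patch against the node's status subresource: the "$setElementOrder/conditions" directive pins the ordering of the conditions list, and each entry under "conditions" merges into the existing list by its "type" key rather than by index. A stdlib-only Go sketch that rebuilds the same patch shape; the field values are copied from the log, and nothing here contacts a cluster.

```go
// patchshape.go - sketch of the node-status strategic merge patch above.
package main

import (
	"encoding/json"
	"fmt"
)

type condition struct {
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Message            string `json:"message"`
	Reason             string `json:"reason"`
	Status             string `json:"status"`
	Type               string `json:"type"`
}

func main() {
	// "$setElementOrder/conditions" fixes list order; entries in
	// "conditions" are merged by their "type" key (the list's patch key).
	patch := map[string]any{
		"status": map[string]any{
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"}, {"type": "DiskPressure"},
				{"type": "PIDPressure"}, {"type": "Ready"},
			},
			"conditions": []condition{{
				LastHeartbeatTime:  "2026-03-18T15:39:04Z",
				LastTransitionTime: "2026-03-18T15:39:04Z",
				Message:            "kubelet has sufficient memory available",
				Reason:             "KubeletHasSufficientMemory",
				Status:             "False",
				Type:               "MemoryPressure",
			}},
		},
	}
	out, _ := json.MarshalIndent(patch, "", "  ")
	fmt.Println(string(out))
}
```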
event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.349437 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.349455 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.349465 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.364011 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: E0318 15:39:04.364144 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.365974 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
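After the repeated "Error updating node status, will retry" failures, the kubelet gives up with "update node status exceeds retry count": the update loop is bounded by a small constant (nodeStatusUpdateRetry, 5 in the upstream kubelet sources; treat the exact value as an assumption here). A Go sketch of that control flow, with tryPatchStatus as a hypothetical stand-in for the PATCH the expired-certificate webhook keeps rejecting.

```go
// retryloop.go - sketch of the bounded retry visible above: several
// "will retry" errors, then "update node status exceeds retry count".
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // assumed from upstream kubelet source

// tryPatchStatus stands in for the real status PATCH; with an expired
// webhook serving cert it fails deterministically on every attempt.
func tryPatchStatus() error {
	return errors.New("tls: failed to verify certificate: x509: certificate has expired or is not yet valid")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatchStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```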
event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.366020 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.366034 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.366060 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.366073 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.468700 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.468749 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.468763 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.468779 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.468793 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.571843 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.571887 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.571906 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.571929 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.571946 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.675109 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.675151 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.675160 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.675175 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.675188 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.741820 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/0.log" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.745618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.746205 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.776749 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.777095 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.777130 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.777139 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.777157 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.777174 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.800806 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.815897 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.843633 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.865222 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.880965 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.881040 4939 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.881064 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.881101 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.881124 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.883740 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.906001 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137
ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:03Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:03.190491 6839 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190659 6839 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190672 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:39:03.190721 6839 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 15:39:03.190728 6839 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 15:39:03.190759 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:03.190815 6839 factory.go:656] Stopping watch factory\\\\nI0318 15:39:03.190849 6839 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:39:03.190867 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:03.190875 6839 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:03.190882 6839 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.924966 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.940692 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.952050 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.961747 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.973402 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.983655 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.983697 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.983710 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.983731 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.983742 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:04Z","lastTransitionTime":"2026-03-18T15:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.985199 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:04 crc kubenswrapper[4939]: I0318 15:39:04.994552 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:04Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.086146 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.086182 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.086192 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.086206 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.086214 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.188171 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.188217 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.188231 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.188247 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.188257 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.290900 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.290945 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.290956 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.290974 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.290989 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.394657 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.394722 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.394741 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.394768 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.394787 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.498102 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.498183 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.498216 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.498241 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.498255 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.601116 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.601157 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.601174 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.601192 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.601202 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.703496 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.703568 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.703580 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.703597 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.703610 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.751060 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/1.log" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.751674 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/0.log" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.754421 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a" exitCode=1 Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.754466 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.754528 4939 scope.go:117] "RemoveContainer" containerID="d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.755374 4939 scope.go:117] "RemoveContainer" containerID="6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a" Mar 18 15:39:05 crc kubenswrapper[4939]: E0318 15:39:05.755647 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.767846 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.781086 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.793728 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.805622 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.805666 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.805679 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.805701 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.805714 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.814969 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.829229 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.848440 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.865926 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"
reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473
a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.879095 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2
acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.898356 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.908844 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.908883 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.908893 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.908918 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.908929 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:05Z","lastTransitionTime":"2026-03-18T15:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.914262 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.927112 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.945165 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.974808 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:03Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:03.190491 6839 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190659 6839 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190672 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:39:03.190721 6839 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 15:39:03.190728 6839 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 15:39:03.190759 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:03.190815 6839 factory.go:656] Stopping watch factory\\\\nI0318 15:39:03.190849 6839 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:39:03.190867 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:03.190875 6839 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:03.190882 6839 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 
15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:05 crc kubenswrapper[4939]: I0318 15:39:05.988631 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:05Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.011446 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.011493 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.011546 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.011569 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.011580 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.113947 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.114013 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.114031 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.114097 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.114119 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.132831 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.132855 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.132933 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:06 crc kubenswrapper[4939]: E0318 15:39:06.133035 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:06 crc kubenswrapper[4939]: E0318 15:39:06.133211 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:06 crc kubenswrapper[4939]: E0318 15:39:06.133357 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.151545 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.171417 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.189569 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.208544 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.220354 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.220425 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.220451 4939 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.220483 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.220565 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.225950 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.241757 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.260273 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.282381 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:03Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:03.190491 6839 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190659 6839 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190672 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:39:03.190721 6839 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 15:39:03.190728 6839 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 15:39:03.190759 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:03.190815 6839 factory.go:656] Stopping watch factory\\\\nI0318 15:39:03.190849 6839 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:39:03.190867 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:03.190875 6839 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:03.190882 6839 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 
15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.298029 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.313792 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.322580 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.322637 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.322647 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.322668 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.322683 4939 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.327137 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.346294 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.360798 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.375037 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.425967 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.426036 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.426048 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.426068 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.426081 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.528588 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.528647 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.528657 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.528676 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.528687 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.631823 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.631862 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.631872 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.631888 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.631897 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.633261 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh"] Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.633771 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.637186 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.637225 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.650972 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.662442 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.674879 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.688474 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.707215 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673cd4ff0817d38c340a45e3316c63a759e2562f8d65d498e981abe0ac41303\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:03Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:03.190491 6839 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190659 6839 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:03.190672 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:39:03.190721 6839 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 15:39:03.190728 6839 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 15:39:03.190759 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:03.190815 6839 factory.go:656] Stopping watch factory\\\\nI0318 15:39:03.190849 6839 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:39:03.190867 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:03.190875 6839 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:03.190882 6839 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 
15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.718864 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.730227 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.733741 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.733788 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.733799 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.733814 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.733822 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.736309 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9tk\" (UniqueName: \"kubernetes.io/projected/61c39a12-cab4-4d84-854c-d88f02673ff5-kube-api-access-bs9tk\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.736384 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61c39a12-cab4-4d84-854c-d88f02673ff5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.736424 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61c39a12-cab4-4d84-854c-d88f02673ff5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.736454 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61c39a12-cab4-4d84-854c-d88f02673ff5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.740984 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.752233 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.759008 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/1.log" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.764357 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.768112 4939 scope.go:117] "RemoveContainer" containerID="6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a" Mar 18 15:39:06 crc kubenswrapper[4939]: E0318 15:39:06.768345 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.777690 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.790286 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.813112 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.828324 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"
reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473
a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.835736 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.835774 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.835785 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.835803 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.835814 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.836988 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61c39a12-cab4-4d84-854c-d88f02673ff5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.837025 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61c39a12-cab4-4d84-854c-d88f02673ff5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.837045 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9tk\" (UniqueName: \"kubernetes.io/projected/61c39a12-cab4-4d84-854c-d88f02673ff5-kube-api-access-bs9tk\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.837079 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61c39a12-cab4-4d84-854c-d88f02673ff5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.837819 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61c39a12-cab4-4d84-854c-d88f02673ff5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.838276 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61c39a12-cab4-4d84-854c-d88f02673ff5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.839839 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.844983 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61c39a12-cab4-4d84-854c-d88f02673ff5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.846279 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.857309 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.864030 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9tk\" (UniqueName: \"kubernetes.io/projected/61c39a12-cab4-4d84-854c-d88f02673ff5-kube-api-access-bs9tk\") pod \"ovnkube-control-plane-749d76644c-v6zhh\" (UID: \"61c39a12-cab4-4d84-854c-d88f02673ff5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.870930 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.886245 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.896187 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.906474 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.922421 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 
15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.934397 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.938957 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.939001 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.939014 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.939032 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.939046 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:06Z","lastTransitionTime":"2026-03-18T15:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.949783 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.951864 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.963383 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:06 crc kubenswrapper[4939]: W0318 15:39:06.969741 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61c39a12_cab4_4d84_854c_d88f02673ff5.slice/crio-8f134a3ba14bb2fabc7c03a4fc795b84be3631b0a1d834df43a9fc9e9cd39fcd WatchSource:0}: Error finding container 8f134a3ba14bb2fabc7c03a4fc795b84be3631b0a1d834df43a9fc9e9cd39fcd: Status 404 returned error can't find the container with id 8f134a3ba14bb2fabc7c03a4fc795b84be3631b0a1d834df43a9fc9e9cd39fcd Mar 18 15:39:06 crc kubenswrapper[4939]: I0318 15:39:06.978926 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:06Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.006762 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.019629 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a6637720
0da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.036013 4939 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.046100 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.046139 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.046148 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.046161 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.046171 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.048578 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.063950 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 
2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.149895 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.150049 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.150063 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.150081 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.150091 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.257825 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.259121 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.259150 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.259183 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.259207 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.361412 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.361470 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.361481 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.361516 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.361527 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.463832 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.463881 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.463893 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.463912 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.463923 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.566889 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.566915 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.566924 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.566937 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.566946 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.669231 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.669273 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.669286 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.669303 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.669315 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771356 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771394 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771406 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771425 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771438 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771472 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" event={"ID":"61c39a12-cab4-4d84-854c-d88f02673ff5","Type":"ContainerStarted","Data":"f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771522 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" event={"ID":"61c39a12-cab4-4d84-854c-d88f02673ff5","Type":"ContainerStarted","Data":"af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.771541 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" event={"ID":"61c39a12-cab4-4d84-854c-d88f02673ff5","Type":"ContainerStarted","Data":"8f134a3ba14bb2fabc7c03a4fc795b84be3631b0a1d834df43a9fc9e9cd39fcd"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.802807 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.817121 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a6637720
0da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.832332 4939 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.845128 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.860458 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.873802 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.873837 4939 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.873849 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.873864 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.873876 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.874333 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.892452 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137
ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.904054 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.916090 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.928776 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.942194 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.955234 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.968151 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.975947 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.975977 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.975988 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.976005 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.976015 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:07Z","lastTransitionTime":"2026-03-18T15:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.981827 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:07 crc kubenswrapper[4939]: I0318 15:39:07.994471 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:07Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.078761 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.078817 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.078831 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.078854 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.078868 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.126573 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zxrzw"] Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.127174 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.127268 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.132361 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.132490 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.132650 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.132689 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.132805 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.132978 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.149065 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.166835 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.180948 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.181333 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.181352 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.181361 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.181376 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.181386 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.195357 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.206321 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.229140 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.248257 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a6637720
0da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.250191 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l8b4\" (UniqueName: \"kubernetes.io/projected/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-kube-api-access-8l8b4\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.250310 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.262135 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.280719 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150
322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.284429 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.284480 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.284493 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.284530 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.284543 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.294769 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.307245 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.323792 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.347987 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 
15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.351407 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l8b4\" (UniqueName: \"kubernetes.io/projected/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-kube-api-access-8l8b4\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.351490 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.351669 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.351736 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:39:08.851719981 +0000 UTC m=+113.450907602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.368560 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.380352 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l8b4\" (UniqueName: \"kubernetes.io/projected/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-kube-api-access-8l8b4\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.387161 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.388492 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.388544 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.388556 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.388572 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.388584 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.401907 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:08Z is after 2025-08-24T17:21:41Z"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.491218 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.491277 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.491294 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.491317 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.491331 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.593830 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.593901 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.593926 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.593958 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.593981 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.697324 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.697394 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.697413 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.697439 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.697458 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.800909 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.800991 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.801005 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.801024 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.801063 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.856095 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.856274 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:08 crc kubenswrapper[4939]: E0318 15:39:08.856347 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:39:09.856328671 +0000 UTC m=+114.455516302 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.904593 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.904654 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.904671 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.904694 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:08 crc kubenswrapper[4939]: I0318 15:39:08.904711 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:08Z","lastTransitionTime":"2026-03-18T15:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.007651 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.007728 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.007747 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.007775 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.007794 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.110954 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.111001 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.111017 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.111036 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.111048 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.214482 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.214609 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.214634 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.214668 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.214692 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.317584 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.317637 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.317651 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.317671 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.317684 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.421168 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.421233 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.421253 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.421281 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.421305 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.524328 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.524378 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.524390 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.524410 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.524427 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.628049 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.628150 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.628171 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.628203 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.628228 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.731706 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.731782 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.731800 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.731820 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.731835 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.834309 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.834365 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.834377 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.834394 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.834410 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.868347 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:39:09 crc kubenswrapper[4939]: E0318 15:39:09.868582 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:09 crc kubenswrapper[4939]: E0318 15:39:09.868712 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:39:11.868684225 +0000 UTC m=+116.467871846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.937908 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.937973 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.937990 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.938015 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:09 crc kubenswrapper[4939]: I0318 15:39:09.938032 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:09Z","lastTransitionTime":"2026-03-18T15:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.041081 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.041173 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.041187 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.041207 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.041220 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.133072 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.133224 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.133396 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:39:10 crc kubenswrapper[4939]: E0318 15:39:10.133403 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.133447 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:39:10 crc kubenswrapper[4939]: E0318 15:39:10.133564 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:39:10 crc kubenswrapper[4939]: E0318 15:39:10.133679 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:39:10 crc kubenswrapper[4939]: E0318 15:39:10.133719 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.143839 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.144171 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.144251 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.144322 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.144378 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.247823 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.247856 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.247864 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.247880 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.247889 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.350744 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.350806 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.350823 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.350846 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.350862 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.453720 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.453782 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.453796 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.453813 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.453826 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.555979 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.556020 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.556031 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.556047 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.556057 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.659045 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.660067 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.660302 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.660466 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.660652 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.764224 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.764294 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.764319 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.764346 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.764366 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.868065 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.868301 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.868386 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.868459 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.868540 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.971026 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.971268 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.971366 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.971438 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:10 crc kubenswrapper[4939]: I0318 15:39:10.971494 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:10Z","lastTransitionTime":"2026-03-18T15:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.074981 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.075052 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.075076 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.075110 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.075134 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.178365 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.178450 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.178475 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.178549 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.178574 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.281585 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.282064 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.282298 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.282445 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.282671 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.384961 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.385003 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.385012 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.385026 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.385035 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.487656 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.487732 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.487756 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.487884 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.487910 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.590657 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.590776 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.590793 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.590817 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.590837 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.693589 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.693633 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.693645 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.693664 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.693676 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.796073 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.796173 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.796237 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.796263 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.796280 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.890173 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:39:11 crc kubenswrapper[4939]: E0318 15:39:11.890376 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:11 crc kubenswrapper[4939]: E0318 15:39:11.890468 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:39:15.890445517 +0000 UTC m=+120.489633138 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.898202 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.898282 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.898307 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.898339 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:11 crc kubenswrapper[4939]: I0318 15:39:11.898362 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:11Z","lastTransitionTime":"2026-03-18T15:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.001199 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.001273 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.001291 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.001318 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.001335 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.104557 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.104625 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.104643 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.104669 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.104689 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.132407 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.132481 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.132407 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:39:12 crc kubenswrapper[4939]: E0318 15:39:12.132674 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.132696 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:39:12 crc kubenswrapper[4939]: E0318 15:39:12.132752 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:39:12 crc kubenswrapper[4939]: E0318 15:39:12.133061 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:39:12 crc kubenswrapper[4939]: E0318 15:39:12.133187 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.207115 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.207175 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.207195 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.207219 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.207236 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.310665 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.310741 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.310762 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.311241 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.311305 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.415158 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.415196 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.415207 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.415223 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.415238 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.518972 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.519010 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.519023 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.519042 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.519053 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.622242 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.622293 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.622309 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.622329 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.622344 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.724767 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.724832 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.724848 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.724872 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.724888 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.827234 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.827270 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.827288 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.827321 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.827332 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.930758 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.930798 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.930811 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.930828 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:12 crc kubenswrapper[4939]: I0318 15:39:12.930840 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:12Z","lastTransitionTime":"2026-03-18T15:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.033691 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.033726 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.033736 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.033933 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.033944 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.136611 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.136674 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.136688 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.136708 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.136719 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.239428 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.239492 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.239529 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.239547 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.239567 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.342554 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.343032 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.343194 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.343360 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.343547 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.445942 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.446012 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.446031 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.446060 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.446095 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.548924 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.548968 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.548979 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.548997 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.549009 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.651179 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.651220 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.651234 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.651251 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.651262 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.754160 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.754198 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.754209 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.754248 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.754260 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.858688 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.858750 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.858761 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.858777 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.858792 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.960943 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.961249 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.961314 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.961382 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:13 crc kubenswrapper[4939]: I0318 15:39:13.961450 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:13Z","lastTransitionTime":"2026-03-18T15:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.064421 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.064474 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.064494 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.064560 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.064577 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.133184 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.133204 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.133204 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.133756 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.133795 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.133240 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.133883 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.133967 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.166979 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.167018 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.167030 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.167046 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.167057 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.269861 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.269907 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.269916 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.269932 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.269944 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.373127 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.373169 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.373180 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.373195 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.373206 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.475917 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.476238 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.476322 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.476443 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.476548 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.508219 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.508271 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.508285 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.508303 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.508318 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.530133 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:14Z is after 2025-08-24T17:21:41Z"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.534977 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.535020 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.535033 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.535050 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.535063 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.551569 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:14Z is after 2025-08-24T17:21:41Z"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.555687 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.555841 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.555934 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.556022 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.556127 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.571223 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:14Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.575043 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.575186 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.575262 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.575325 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.575401 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.594953 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list elided; byte-for-byte duplicate of the list in the preceding status-patch entry …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:14Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.603686 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.603759 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.603774 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.603793 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.603804 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.619865 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list elided; byte-for-byte duplicate of the list in the first status-patch entry above …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:14Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:14 crc kubenswrapper[4939]: E0318 15:39:14.620104 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.622444 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.622498 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.622538 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.622557 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.622569 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.725411 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.725471 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.725482 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.725498 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.725524 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.828003 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.828041 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.828050 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.828064 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.828072 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.930956 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.931034 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.931052 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.931079 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:14 crc kubenswrapper[4939]: I0318 15:39:14.931098 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:14Z","lastTransitionTime":"2026-03-18T15:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.035324 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.035378 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.035399 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.035424 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.035441 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.143950 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.144009 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.144026 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.144050 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.144067 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.247306 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.247581 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.247601 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.247622 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.247636 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.350763 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.350838 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.350863 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.350895 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.350919 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.454666 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.454773 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.454810 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.454846 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.454868 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.557895 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.557945 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.557964 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.557988 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.558005 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.661816 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.661870 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.661889 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.661916 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.661931 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.764644 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.764684 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.764694 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.764709 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.764720 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.867719 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.867765 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.867777 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.867798 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.867813 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.932074 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:15 crc kubenswrapper[4939]: E0318 15:39:15.932254 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:39:15 crc kubenswrapper[4939]: E0318 15:39:15.932310 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:39:23.932294358 +0000 UTC m=+128.531481979 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.971452 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.971533 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.971551 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.971571 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:15 crc kubenswrapper[4939]: I0318 15:39:15.971584 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:15Z","lastTransitionTime":"2026-03-18T15:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:16 crc kubenswrapper[4939]: E0318 15:39:16.072047 4939 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.132628 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.132698 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.132646 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:16 crc kubenswrapper[4939]: E0318 15:39:16.132873 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.132937 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:16 crc kubenswrapper[4939]: E0318 15:39:16.133076 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:16 crc kubenswrapper[4939]: E0318 15:39:16.133223 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:16 crc kubenswrapper[4939]: E0318 15:39:16.133640 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.149807 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e9116
99a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.168459 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15
:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.183098 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.195900 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.209592 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.223417 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.236036 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: E0318 15:39:16.248266 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.251936 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.273453 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"n
ame\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 
15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.290236 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.307556 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.325476 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.350620 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-v
ars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.369219 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.387401 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:16 crc kubenswrapper[4939]: I0318 15:39:16.399083 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:16Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:17 crc kubenswrapper[4939]: I0318 15:39:17.151368 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 15:39:18 crc kubenswrapper[4939]: I0318 15:39:18.132210 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:18 crc kubenswrapper[4939]: E0318 15:39:18.132624 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:18 crc kubenswrapper[4939]: I0318 15:39:18.132328 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:18 crc kubenswrapper[4939]: E0318 15:39:18.132865 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:18 crc kubenswrapper[4939]: I0318 15:39:18.132328 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:18 crc kubenswrapper[4939]: E0318 15:39:18.133061 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:18 crc kubenswrapper[4939]: I0318 15:39:18.132425 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:18 crc kubenswrapper[4939]: E0318 15:39:18.133468 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.132999 4939 scope.go:117] "RemoveContainer" containerID="6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.821368 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/1.log" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.824520 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852"} Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.825037 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.841158 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.855991 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.868274 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.898807 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.928614 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.945151 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.957639 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.972162 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.985907 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:19 crc kubenswrapper[4939]: I0318 15:39:19.997407 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:19Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.014947 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.039396 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 
15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.051772 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.068775 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.081805 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.102612 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.121375 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.132702 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.132813 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.132710 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:20 crc kubenswrapper[4939]: E0318 15:39:20.132897 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.132711 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:20 crc kubenswrapper[4939]: E0318 15:39:20.132965 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:20 crc kubenswrapper[4939]: E0318 15:39:20.133057 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:20 crc kubenswrapper[4939]: E0318 15:39:20.133138 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.831574 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/2.log" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.832442 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/1.log" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.835617 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852" exitCode=1 Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.835691 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852"} Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.835849 4939 scope.go:117] "RemoveContainer" containerID="6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.836434 4939 scope.go:117] "RemoveContainer" containerID="f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852" Mar 18 15:39:20 crc kubenswrapper[4939]: E0318 15:39:20.836662 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.867816 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.891415 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.902858 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.923118 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.948604 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac64967800c1d177ee9474c9f07c73197ac9137ede6b53ccc084e59dc9e9d5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:04Z\\\",\\\"message\\\":\\\"o:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:39:04.586977 6988 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587069 6988 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587726 6988 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:39:04.587830 6988 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nI0318 15:39:04.587849 6988 services_controller.go:360] Finished syncing service machine-config-operator on namespace openshift-machine-config-operator for network=default : 13.345463ms\\\\nI0318 15:39:04.587982 6988 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:04.588029 6988 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 
15:39:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.963041 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.978405 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:20 crc kubenswrapper[4939]: I0318 15:39:20.990105 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.001107 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.016164 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.036847 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.049748 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.063044 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.076583 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.105295 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.124212 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.143639 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: E0318 15:39:21.249108 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
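Every status patch in the entries above fails the same way: the kubelet's Post to the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is rejected because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the node clock of 2026-03-18. The error string is produced by Go's standard crypto/x509 verifier, which refuses any chain whose NotAfter precedes the current time before checking anything else. What follows is a minimal, self-contained Go sketch — not OpenShift code, and the CommonName is hypothetical — that reproduces the exact error text seen in these log lines:

package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"time"
)

func main() {
	// Build a self-signed certificate whose validity window ended in the
	// past, mirroring the webhook serving cert that expired 2025-08-24.
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		panic(err)
	}
	tmpl := &x509.Certificate{
		SerialNumber:          big.NewInt(1),
		Subject:               pkix.Name{CommonName: "network-node-identity.example"}, // hypothetical name
		NotBefore:             time.Now().Add(-2 * 365 * 24 * time.Hour),
		NotAfter:              time.Now().Add(-24 * time.Hour), // expired yesterday
		KeyUsage:              x509.KeyUsageCertSign,
		IsCA:                  true,
		BasicConstraintsValid: true,
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		panic(err)
	}
	cert, err := x509.ParseCertificate(der)
	if err != nil {
		panic(err)
	}

	roots := x509.NewCertPool()
	roots.AddCert(cert)

	// Verification fails on the validity window and prints the same error
	// string kubelet is relaying from the TLS handshake above:
	//   x509: certificate has expired or is not yet valid: current time ... is after ...
	if _, err := cert.Verify(x509.VerifyOptions{Roots: roots}); err != nil {
		fmt.Println(err)
	}
}

The sketch only demonstrates the verifier behavior behind the repeated "tls: failed to verify certificate" entries; the surrounding log context (a CRC VM whose clock is months past the certificates' NotAfter dates) is what puts every webhook call into this state, and nothing in the sketch implies a remediation.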
Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.841476 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/2.log" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.845611 4939 scope.go:117] "RemoveContainer" containerID="f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852" Mar 18 15:39:21 crc kubenswrapper[4939]: E0318 15:39:21.846198 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.863695 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkub
e-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.884703 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.897068 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.915603 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.943885 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.956607 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.969604 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.980901 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:21 crc kubenswrapper[4939]: I0318 15:39:21.994590 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.008638 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.022203 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-cont
roller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.036419 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.050785 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.095143 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.128626 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.132485 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.132542 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.132605 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:22 crc kubenswrapper[4939]: E0318 15:39:22.132738 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.132772 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:22 crc kubenswrapper[4939]: E0318 15:39:22.132856 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:22 crc kubenswrapper[4939]: E0318 15:39:22.132970 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:22 crc kubenswrapper[4939]: E0318 15:39:22.133083 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.149517 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8
ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.164453 4939 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T15:39:22Z is after 2025-08-24T17:21:41Z"
Mar 18 15:39:22 crc kubenswrapper[4939]: I0318 15:39:22.450691 4939 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.019317 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.019582 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.019722 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:39:40.019692522 +0000 UTC m=+144.618880173 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.132622 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.132681 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.132696 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.132809 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.132805 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.132922 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.133047 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.133162 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.870019 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.870092 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.870102 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.870117 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.870127 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:24Z","lastTransitionTime":"2026-03-18T15:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.886004 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:24Z is after 
2025-08-24T17:21:41Z" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.893772 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.893843 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.893860 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.893883 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.893907 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:24Z","lastTransitionTime":"2026-03-18T15:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.910777 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:24Z is after 
2025-08-24T17:21:41Z" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.915742 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.915779 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.915791 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.915811 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.915826 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:24Z","lastTransitionTime":"2026-03-18T15:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.930123 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:24Z is after 
2025-08-24T17:21:41Z" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.934228 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.934259 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.934272 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.934291 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.934307 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:24Z","lastTransitionTime":"2026-03-18T15:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.953744 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:24Z is after 
2025-08-24T17:21:41Z" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.958617 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.958670 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.958687 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.958707 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:24 crc kubenswrapper[4939]: I0318 15:39:24.958724 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:24Z","lastTransitionTime":"2026-03-18T15:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.974284 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:24Z is after 
2025-08-24T17:21:41Z" Mar 18 15:39:24 crc kubenswrapper[4939]: E0318 15:39:24.974456 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.132811 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.132876 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.132937 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.133056 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:26 crc kubenswrapper[4939]: E0318 15:39:26.133051 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:26 crc kubenswrapper[4939]: E0318 15:39:26.133240 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:26 crc kubenswrapper[4939]: E0318 15:39:26.133337 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:26 crc kubenswrapper[4939]: E0318 15:39:26.133410 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.154533 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.179876 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.197634 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.224312 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.243933 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: E0318 15:39:26.249919 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.263015 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.282782 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.306626 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.324771 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.344199 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.359171 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.391954 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.427559 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24d
e998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.449382 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a6637720
0da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.468025 4939 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.482448 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:26 crc kubenswrapper[4939]: I0318 15:39:26.506330 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:28 crc kubenswrapper[4939]: I0318 15:39:28.133202 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:28 crc kubenswrapper[4939]: I0318 15:39:28.133321 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:28 crc kubenswrapper[4939]: I0318 15:39:28.133356 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:28 crc kubenswrapper[4939]: I0318 15:39:28.133416 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:28 crc kubenswrapper[4939]: E0318 15:39:28.133665 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:28 crc kubenswrapper[4939]: E0318 15:39:28.133742 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:28 crc kubenswrapper[4939]: E0318 15:39:28.133830 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:28 crc kubenswrapper[4939]: E0318 15:39:28.133926 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:30 crc kubenswrapper[4939]: I0318 15:39:30.132676 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:30 crc kubenswrapper[4939]: I0318 15:39:30.132708 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:30 crc kubenswrapper[4939]: I0318 15:39:30.132860 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:30 crc kubenswrapper[4939]: E0318 15:39:30.133024 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:30 crc kubenswrapper[4939]: E0318 15:39:30.133164 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:30 crc kubenswrapper[4939]: I0318 15:39:30.133166 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:30 crc kubenswrapper[4939]: E0318 15:39:30.133245 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:30 crc kubenswrapper[4939]: E0318 15:39:30.134290 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:31 crc kubenswrapper[4939]: I0318 15:39:31.146494 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 15:39:31 crc kubenswrapper[4939]: E0318 15:39:31.251815 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:39:32 crc kubenswrapper[4939]: I0318 15:39:32.133028 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:32 crc kubenswrapper[4939]: E0318 15:39:32.133252 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:32 crc kubenswrapper[4939]: I0318 15:39:32.133708 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:32 crc kubenswrapper[4939]: E0318 15:39:32.133926 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:32 crc kubenswrapper[4939]: I0318 15:39:32.134030 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:32 crc kubenswrapper[4939]: I0318 15:39:32.134251 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:32 crc kubenswrapper[4939]: E0318 15:39:32.134433 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:32 crc kubenswrapper[4939]: E0318 15:39:32.134702 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:34 crc kubenswrapper[4939]: I0318 15:39:34.132861 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:34 crc kubenswrapper[4939]: I0318 15:39:34.132907 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:34 crc kubenswrapper[4939]: E0318 15:39:34.133077 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:34 crc kubenswrapper[4939]: I0318 15:39:34.133093 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:34 crc kubenswrapper[4939]: E0318 15:39:34.133903 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:34 crc kubenswrapper[4939]: I0318 15:39:34.133933 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:34 crc kubenswrapper[4939]: E0318 15:39:34.134056 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:34 crc kubenswrapper[4939]: E0318 15:39:34.134167 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.133371 4939 scope.go:117] "RemoveContainer" containerID="f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852" Mar 18 15:39:35 crc kubenswrapper[4939]: E0318 15:39:35.135086 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.205980 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.206038 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.206059 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.206082 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.206100 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:35Z","lastTransitionTime":"2026-03-18T15:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:39:35 crc kubenswrapper[4939]: E0318 15:39:35.224276 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.230145 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.230214 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.230233 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.230258 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.230277 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:35Z","lastTransitionTime":"2026-03-18T15:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:35 crc kubenswrapper[4939]: E0318 15:39:35.250026 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.255095 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.255173 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.255199 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.255231 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.255263 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:35Z","lastTransitionTime":"2026-03-18T15:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:35 crc kubenswrapper[4939]: E0318 15:39:35.277389 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.282839 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.282902 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.282919 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.282943 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.282961 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:35Z","lastTransitionTime":"2026-03-18T15:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:35 crc kubenswrapper[4939]: E0318 15:39:35.301159 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.306066 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.306135 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.306147 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.306167 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:35 crc kubenswrapper[4939]: I0318 15:39:35.306181 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:35Z","lastTransitionTime":"2026-03-18T15:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:35 crc kubenswrapper[4939]: E0318 15:39:35.323540 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:35 crc kubenswrapper[4939]: E0318 15:39:35.323656 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.070148 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.070627 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:40.070583149 +0000 UTC m=+204.669770810 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.132817 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.132874 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.132817 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.132891 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.133196 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.133637 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.134057 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.134199 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.150402 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.155727 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.171320 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 
15:39:36.171383 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.171437 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.171496 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171594 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171646 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171595 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171662 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171772 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:40:40.171739289 +0000 UTC m=+204.770926920 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171847 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171886 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:40:40.171857942 +0000 UTC m=+204.771045603 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171905 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171932 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.171933 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.172006 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:40:40.171992246 +0000 UTC m=+204.771179897 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.172067 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:40:40.172020367 +0000 UTC m=+204.771208028 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.174497 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.191702 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.212585 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc 
kubenswrapper[4939]: I0318 15:39:36.232384 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: E0318 15:39:36.252568 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.264105 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{
\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699d
f4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.289417 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.311236 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.329439 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.353645 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.368271 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.386070 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.414872 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac476
46684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.431740 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.445954 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.468058 4939 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.489102 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:36 crc kubenswrapper[4939]: I0318 15:39:36.503732 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:38 crc kubenswrapper[4939]: I0318 15:39:38.133115 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:38 crc kubenswrapper[4939]: I0318 15:39:38.133188 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:38 crc kubenswrapper[4939]: I0318 15:39:38.133210 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:38 crc kubenswrapper[4939]: I0318 15:39:38.133360 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:38 crc kubenswrapper[4939]: E0318 15:39:38.133346 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:38 crc kubenswrapper[4939]: E0318 15:39:38.133625 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:38 crc kubenswrapper[4939]: E0318 15:39:38.133980 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:38 crc kubenswrapper[4939]: E0318 15:39:38.134206 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.116233 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:40 crc kubenswrapper[4939]: E0318 15:39:40.116543 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:39:40 crc kubenswrapper[4939]: E0318 15:39:40.116710 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:40:12.116677569 +0000 UTC m=+176.715865400 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.132967 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.132987 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.133030 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.133109 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:40 crc kubenswrapper[4939]: E0318 15:39:40.133343 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:40 crc kubenswrapper[4939]: E0318 15:39:40.133465 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:40 crc kubenswrapper[4939]: E0318 15:39:40.133631 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:40 crc kubenswrapper[4939]: E0318 15:39:40.133754 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.939607 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/0.log" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.939708 4939 generic.go:334] "Generic (PLEG): container finished" podID="6693c593-9b18-435e-8a3a-91d3e33c3c51" containerID="428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935" exitCode=1 Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.939764 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerDied","Data":"428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935"} Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.940350 4939 scope.go:117] "RemoveContainer" containerID="428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.971359 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/h
ost/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:40 crc kubenswrapper[4939]: I0318 15:39:40.989977 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1463a2e7-d265-47a9-ae4d-ec7ceefac82d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.006494 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.025723 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.041430 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.060592 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.077636 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"2026-03-18T15:38:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448\\\\n2026-03-18T15:38:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448 to /host/opt/cni/bin/\\\\n2026-03-18T15:38:55Z [verbose] multus-daemon started\\\\n2026-03-18T15:38:55Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:39:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.099486 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.112587 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.123033 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.135539 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.148594 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.163691 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.176757 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.192628 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-cont
roller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.209203 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.224109 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.242558 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: E0318 15:39:41.255011 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.272017 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{
\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699d
f4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.945129 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/0.log" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.945219 4939 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerStarted","Data":"afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e"} Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.964245 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.976486 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:41 crc kubenswrapper[4939]: I0318 15:39:41.991166 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"2026-03-18T15:38:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448\\\\n2026-03-18T15:38:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448 to /host/opt/cni/bin/\\\\n2026-03-18T15:38:55Z [verbose] multus-daemon started\\\\n2026-03-18T15:38:55Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:39:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.009949 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.022464 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.032765 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.043998 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.058171 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.071807 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.083239 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.092924 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.103852 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.122421 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.132664 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.132707 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.132745 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:42 crc kubenswrapper[4939]: E0318 15:39:42.132885 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.132907 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:42 crc kubenswrapper[4939]: E0318 15:39:42.133017 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:42 crc kubenswrapper[4939]: E0318 15:39:42.133115 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:42 crc kubenswrapper[4939]: E0318 15:39:42.133278 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.137462 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.158638 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.175675 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.186267 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1463a2e7-d265-47a9-ae4d-ec7ceefac82d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.197904 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:42 crc kubenswrapper[4939]: I0318 15:39:42.211706 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b1
6988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:44 crc kubenswrapper[4939]: I0318 15:39:44.132539 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:44 crc kubenswrapper[4939]: I0318 15:39:44.132642 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:44 crc kubenswrapper[4939]: I0318 15:39:44.132643 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:44 crc kubenswrapper[4939]: I0318 15:39:44.132559 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:44 crc kubenswrapper[4939]: E0318 15:39:44.132748 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:44 crc kubenswrapper[4939]: E0318 15:39:44.132889 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:44 crc kubenswrapper[4939]: E0318 15:39:44.133052 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:44 crc kubenswrapper[4939]: E0318 15:39:44.133222 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.413174 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.413229 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.413240 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.413259 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.413273 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:45Z","lastTransitionTime":"2026-03-18T15:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:45 crc kubenswrapper[4939]: E0318 15:39:45.429994 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.435535 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.435621 4939 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.435643 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.435670 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.435688 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:45Z","lastTransitionTime":"2026-03-18T15:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:45 crc kubenswrapper[4939]: E0318 15:39:45.451036 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.455920 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.456011 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.456028 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.456046 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.456083 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:45Z","lastTransitionTime":"2026-03-18T15:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:45 crc kubenswrapper[4939]: E0318 15:39:45.469874 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.473920 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.474071 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.474164 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.474266 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.474359 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:45Z","lastTransitionTime":"2026-03-18T15:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:45 crc kubenswrapper[4939]: E0318 15:39:45.492478 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.500090 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.500140 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.500156 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.500183 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:45 crc kubenswrapper[4939]: I0318 15:39:45.500198 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:45Z","lastTransitionTime":"2026-03-18T15:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:45 crc kubenswrapper[4939]: E0318 15:39:45.519926 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:45Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:45 crc kubenswrapper[4939]: E0318 15:39:45.520107 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.133071 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:46 crc kubenswrapper[4939]: E0318 15:39:46.133677 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.133105 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.133198 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:46 crc kubenswrapper[4939]: E0318 15:39:46.134010 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:46 crc kubenswrapper[4939]: E0318 15:39:46.134087 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.133153 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:46 crc kubenswrapper[4939]: E0318 15:39:46.134236 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.157180 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.175748 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.191426 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.209442 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.243131 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: E0318 15:39:46.256008 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.273275 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.292328 4939 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1463a2e7-d265-47a9-ae4d-ec7ceefac82d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.312018 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19
d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.334287 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.350057 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.362154 4939 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.375851 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"2026-03-18T15:38:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448\\\\n2026-03-18T15:38:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448 to /host/opt/cni/bin/\\\\n2026-03-18T15:38:55Z [verbose] multus-daemon started\\\\n2026-03-18T15:38:55Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:39:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.402402 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.420907 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.435898 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.459633 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.478942 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.500573 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:46 crc kubenswrapper[4939]: I0318 15:39:46.516127 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:46Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:48 crc kubenswrapper[4939]: I0318 15:39:48.132876 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:48 crc kubenswrapper[4939]: I0318 15:39:48.132985 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:48 crc kubenswrapper[4939]: I0318 15:39:48.132920 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:48 crc kubenswrapper[4939]: E0318 15:39:48.133107 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:48 crc kubenswrapper[4939]: I0318 15:39:48.133178 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:48 crc kubenswrapper[4939]: E0318 15:39:48.133309 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:48 crc kubenswrapper[4939]: E0318 15:39:48.133501 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:48 crc kubenswrapper[4939]: E0318 15:39:48.133676 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:49 crc kubenswrapper[4939]: I0318 15:39:49.133153 4939 scope.go:117] "RemoveContainer" containerID="f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852" Mar 18 15:39:49 crc kubenswrapper[4939]: I0318 15:39:49.977698 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/2.log" Mar 18 15:39:49 crc kubenswrapper[4939]: I0318 15:39:49.981220 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} Mar 18 15:39:49 crc kubenswrapper[4939]: I0318 15:39:49.981745 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:39:49 crc kubenswrapper[4939]: I0318 15:39:49.994595 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1463a2e7-d265-47a9-ae4d-ec7ceefac82d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:49Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.008866 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.031233 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.056878 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.071425 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.080996 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.092909 4939 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.103545 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.115293 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.126897 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.132247 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.132283 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.132295 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.132360 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:50 crc kubenswrapper[4939]: E0318 15:39:50.132377 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:50 crc kubenswrapper[4939]: E0318 15:39:50.132533 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:50 crc kubenswrapper[4939]: E0318 15:39:50.132650 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:50 crc kubenswrapper[4939]: E0318 15:39:50.132694 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.142100 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"2026-03-18T15:38:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448\\\\n2026-03-18T15:38:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448 to /host/opt/cni/bin/\\\\n2026-03-18T15:38:55Z [verbose] multus-daemon started\\\\n2026-03-18T15:38:55Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:39:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.154899 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.167977 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.179182 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.194795 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.216156 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.233074 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.247177 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.258816 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:50Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.986020 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/3.log" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.986985 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/2.log" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.990165 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" exitCode=1 Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.990226 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.990293 4939 scope.go:117] "RemoveContainer" containerID="f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852" Mar 18 15:39:50 crc kubenswrapper[4939]: I0318 15:39:50.990830 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:39:50 crc kubenswrapper[4939]: E0318 15:39:50.990990 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with 
CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.007140 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"2026-03-18T15:38:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448\\\\n2026-03-18T15:38:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448 to /host/opt/cni/bin/\\\\n2026-03-18T15:38:55Z [verbose] multus-daemon started\\\\n2026-03-18T15:38:55Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:39:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.027166 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5992a09dcbf0e66054771214db7e4e4d98e7b8caa4aaaa1dcd4f19443e72852\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:20Z\\\",\\\"message\\\":\\\"0318 15:39:20.128572 7242 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:39:20.128616 7242 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:50Z\\\",\\\"message\\\":\\\" 7569 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 15:39:50.074252 7569 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:50.074571 7569 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 15:39:50.074584 7569 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 15:39:50.074610 7569 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:39:50.074825 7569 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:39:50.074837 7569 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:39:50.074849 7569 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 15:39:50.074851 7569 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:50.074846 7569 handler.go:208] Removed *v1.EgressIP 
event handler 8\\\\nI0318 15:39:50.074858 7569 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:50.074852 7569 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:39:50.074879 7569 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 15:39:50.074918 7569 factory.go:656] Stopping watch factory\\\\nI0318 15:39:50.074936 7569 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:50.074979 7569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:39:50.075048 7569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.043716 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.056333 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.073724 4939 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.091921 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.107391 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.122153 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.139817 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.161984 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.174617 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.190055 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.212458 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.228461 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.239681 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.251045 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: E0318 15:39:51.257791 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.265762 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1463a2e7-d265-47a9-ae4d-ec7ceefac82d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.288513 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.306183 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.995878 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/3.log" Mar 18 15:39:51 crc kubenswrapper[4939]: I0318 15:39:51.998808 4939 scope.go:117] "RemoveContainer" 
containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:39:51 crc kubenswrapper[4939]: E0318 15:39:51.998938 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.015651 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.030072 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.040943 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.054026 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"2026-03-18T15:38:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448\\\\n2026-03-18T15:38:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448 to /host/opt/cni/bin/\\\\n2026-03-18T15:38:55Z [verbose] multus-daemon started\\\\n2026-03-18T15:38:55Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:39:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.071277 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:50Z\\\",\\\"message\\\":\\\" 7569 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 15:39:50.074252 7569 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:50.074571 7569 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 15:39:50.074584 7569 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 15:39:50.074610 7569 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:39:50.074825 7569 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:39:50.074837 7569 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:39:50.074849 7569 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 15:39:50.074851 7569 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:50.074846 7569 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 15:39:50.074858 7569 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:50.074852 7569 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:39:50.074879 7569 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 15:39:50.074918 7569 factory.go:656] Stopping watch factory\\\\nI0318 15:39:50.074936 7569 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:50.074979 7569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:39:50.075048 7569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.081768 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.091312 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.102827 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.112999 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.124958 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.132625 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.132625 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:52 crc kubenswrapper[4939]: E0318 15:39:52.133012 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.132693 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:52 crc kubenswrapper[4939]: E0318 15:39:52.133326 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.132660 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:52 crc kubenswrapper[4939]: E0318 15:39:52.133096 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:52 crc kubenswrapper[4939]: E0318 15:39:52.133628 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.138568 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.158632 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 
15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.169870 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.180087 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.194063 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.217184 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.231881 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.242859 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1463a2e7-d265-47a9-ae4d-ec7ceefac82d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:52 crc kubenswrapper[4939]: I0318 15:39:52.257812 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:54 crc kubenswrapper[4939]: I0318 15:39:54.132655 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:54 crc kubenswrapper[4939]: I0318 15:39:54.132757 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:54 crc kubenswrapper[4939]: I0318 15:39:54.132793 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:54 crc kubenswrapper[4939]: I0318 15:39:54.132855 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:54 crc kubenswrapper[4939]: E0318 15:39:54.133883 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:54 crc kubenswrapper[4939]: E0318 15:39:54.134090 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:54 crc kubenswrapper[4939]: E0318 15:39:54.134174 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:54 crc kubenswrapper[4939]: E0318 15:39:54.134236 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.683011 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.683072 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.683086 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.683108 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.683123 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:55Z","lastTransitionTime":"2026-03-18T15:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:55 crc kubenswrapper[4939]: E0318 15:39:55.704621 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.708739 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.708795 4939 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.708813 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.708835 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.708853 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:55Z","lastTransitionTime":"2026-03-18T15:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:55 crc kubenswrapper[4939]: E0318 15:39:55.723394 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.727157 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.727184 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.727197 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.727213 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.727224 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:55Z","lastTransitionTime":"2026-03-18T15:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:55 crc kubenswrapper[4939]: E0318 15:39:55.739202 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.742817 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.742853 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.742862 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.742889 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.742898 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:55Z","lastTransitionTime":"2026-03-18T15:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:55 crc kubenswrapper[4939]: E0318 15:39:55.754095 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.758114 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.758156 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.758169 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.758189 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:39:55 crc kubenswrapper[4939]: I0318 15:39:55.758201 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:39:55Z","lastTransitionTime":"2026-03-18T15:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:39:55 crc kubenswrapper[4939]: E0318 15:39:55.770677 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"63560ce4-57c9-4ea0-827a-ea8b1db6e8ed\\\",\\\"systemUUID\\\":\\\"e4c08689-884e-465b-8c84-d257e1c69929\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:55Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:55 crc kubenswrapper[4939]: E0318 15:39:55.770806 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.132684 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.132691 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:56 crc kubenswrapper[4939]: E0318 15:39:56.132850 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.132948 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:56 crc kubenswrapper[4939]: E0318 15:39:56.133073 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.133179 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:56 crc kubenswrapper[4939]: E0318 15:39:56.133273 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:39:56 crc kubenswrapper[4939]: E0318 15:39:56.133389 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.146723 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1463a2e7-d265-47a9-ae4d-ec7ceefac82d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de829f3534b9801801a7f04f06a6cb433b1fd3f526d4839937eb339247698b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e2cb76ded4e546164b945d28694d7e9b05fd49e02994370d29e98d65d802ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284698f1a795d13e78a14255f8222b7f5da68295476c6568d60b52a200485e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3febf2bd2b743cc99313defd34d6fa456e0989ff46ba52f0aca6079b90b6f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.162867 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a32d41a6-8ebb-4871-b660-91407cbaa5c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26ce0c064105ddc9d1eca10f732a43f8a65dce0d81643ea50d3f919c32cadd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wljc2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6q7lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.180566 4939 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"055d05da-715f-47bd-88a3-6a93965a2f65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152466377feb71d0753605156bebe25a55543da2dfefcbea8ae818ad6f90ccc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6764a0c26585e7a17d79cbbff149fb168e5e774b7b9b16988d6123de0fe8e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c8ef9991c59e7b6debdf6eb3d603720ff2dec08524eafe0f2a90d438be19d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9974ee9fb2e7d535ae3a756c9b90a85603eac76f72b6e55172aa321fef31d444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c66dad59cc4b150322e755bc0ecc1e083eeb23e1167dda494ea9b3b6943b274\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec022fe8bd9858f089c3280f7269deca016c934f3cb263b04044f4f949d675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcd210ce0307de280c05850d7b4a44ad1f53f982cc234c37cc353f2cd5e33f70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pzh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x2ztl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.197511 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61c39a12-cab4-4d84-854c-d88f02673ff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08084778bd2b6c6d53e13d88cf74139b9cbace3449bf25470fa98276f7be4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f10fe4e972428fe7fa52c5783b29dfaa8cb3b01035b1337d22885d31a2acd7af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs9tk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v6zhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 
15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.211700 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ff58032-08dd-4156-9966-bc375df9890b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af12c73b2b16ccc2a4d0c423f862f0be1d11c38a129a43ddd66bd01ed70bbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26fca432d4df87d2e4652649ac298bd3fb4b601355b77381545a5dd451bb7442\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.225481 4939 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.246302 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4b97d4301907b5427730da483fa386069940e5e82ca21dbde8664332779a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4e35342f9f3c05b6c944db3c420aafde685cd9c76b3be47e96071d4f616faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: E0318 15:39:56.258133 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.269245 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8ba7e5b2bb3e238c7765f9b3d76b0781f29bbe3c0f1f26eba504ed82d73a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.283179 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4ptxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ddaef3-086a-4a4d-931d-c0e82663eb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41fa5bf4a00cd570d6bc8a310c8f33f106ca76476c9170d7cad00e1c412982cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88bxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4ptxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.297658 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmzwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6693c593-9b18-435e-8a3a-91d3e33c3c51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:40Z\\\",\\\"message\\\":\\\"2026-03-18T15:38:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448\\\\n2026-03-18T15:38:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4550bcde-9a69-4e2a-9641-e31ec49e6448 to /host/opt/cni/bin/\\\\n2026-03-18T15:38:55Z [verbose] multus-daemon started\\\\n2026-03-18T15:38:55Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:39:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9277t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmzwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.321406 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafcc67-568f-415b-b907-c1de4c851fa7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:39:50Z\\\",\\\"message\\\":\\\" 7569 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 15:39:50.074252 7569 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:39:50.074571 7569 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 15:39:50.074584 7569 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 15:39:50.074610 7569 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:39:50.074825 7569 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:39:50.074837 7569 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:39:50.074849 7569 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 15:39:50.074851 7569 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:39:50.074846 7569 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 15:39:50.074858 7569 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 15:39:50.074852 7569 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:39:50.074879 7569 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 15:39:50.074918 7569 factory.go:656] Stopping watch factory\\\\nI0318 15:39:50.074936 7569 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:39:50.074979 7569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:39:50.075048 7569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:39:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:38:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p47dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l79pv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.342440 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940ef118c97a04b69943701779c927c118022600b07c46b170e3ab2cc2293883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.361466 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.377679 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-49sqv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8626b206-a707-4a0f-b29f-b6fd365b1a89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://537bfaacee10b3e3af033b0f46430329734a30a803b3beba39e38bb7ba72277a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4nnf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:38:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-49sqv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.395286 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b844d2ea-dd82-47e2-b1e9-16de92e0bd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7cc4b389f380813707382a50d28fee9c58f25f8d84d07d6ae9b1f34d54c980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d38bb0a10b561012549978d127270647e38bbac61983b790bc164b8117f1c8e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:37:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:37:18.267489 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:37:18.270138 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:37:18.300897 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:37:18.305738 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:37:44.831344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:37:44.831559 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:37:44Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546d4f485e727978abc7581d2d9d749fe8dd7e9b84e7311182c93e121368b187\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5587f89f9581e5253836f3c3f265e89010ce8ecf6fa871cfb6bf730862286e53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.421917 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"044378d9-0175-40a3-8007-0c50a40d940c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7af3a558926269e0a75789edb7004486c4a0f7e0047838000613ef43f1aeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42533f824fe04c5d2c7314afbd1d2f4ca61238b5e71f427bb2a3b782bcc5e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b120ea9c3354670666c75517e0e54737173aa60ffaadd47da09d4a5c0b030140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7adf73e061df8c7605459c7e28b0a80849e24de998e3c7e80d0347c4591909e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ed8246b0d68ff7c5481e09408d664f3c8677310a1b80bcb9ea026efce31b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf8cba0536275e969d4ebe6882699df4e3986bfb0f054c8997c34f03d452080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea99be3dd62a4f0487bc89f06ecc3db07fa953216256e2e6b1b5aec00bfd9512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfcc5176da6a372855cb156a89f6d0a8d999d3d85596229ff95610026a847bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.444810 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab41abb6-5f2e-4c42-a3a2-9e24834d5298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:37:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:38:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:38:12.281352 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:38:12.281477 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:38:12.282170 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2843065120/tls.crt::/tmp/serving-cert-2843065120/tls.key\\\\\\\"\\\\nI0318 15:38:12.708552 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:38:12.712255 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:38:12.712277 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:38:12.712308 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:38:12.712315 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:38:12.720021 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0318 15:38:12.720018 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 15:38:12.720046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:38:12.720055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:38:12.720058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:38:12.720061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:38:12.720064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:38:12.723974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:38:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:37:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:37:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:37:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:37:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.465000 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:38:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:56 crc kubenswrapper[4939]: I0318 15:39:56.478356 4939 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:39:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l8b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:39:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zxrzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:39:56Z is after 2025-08-24T17:21:41Z" Mar 18 15:39:58 crc kubenswrapper[4939]: I0318 15:39:58.132931 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:39:58 crc kubenswrapper[4939]: I0318 15:39:58.132961 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:39:58 crc kubenswrapper[4939]: I0318 15:39:58.133098 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:39:58 crc kubenswrapper[4939]: E0318 15:39:58.133104 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:39:58 crc kubenswrapper[4939]: I0318 15:39:58.133147 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:39:58 crc kubenswrapper[4939]: E0318 15:39:58.133263 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:39:58 crc kubenswrapper[4939]: E0318 15:39:58.133370 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:39:58 crc kubenswrapper[4939]: E0318 15:39:58.133416 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:00 crc kubenswrapper[4939]: I0318 15:40:00.132559 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:00 crc kubenswrapper[4939]: I0318 15:40:00.132911 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:00 crc kubenswrapper[4939]: I0318 15:40:00.132948 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:00 crc kubenswrapper[4939]: E0318 15:40:00.133148 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:00 crc kubenswrapper[4939]: E0318 15:40:00.133231 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:00 crc kubenswrapper[4939]: I0318 15:40:00.133259 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:00 crc kubenswrapper[4939]: E0318 15:40:00.133303 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:00 crc kubenswrapper[4939]: E0318 15:40:00.133360 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:01 crc kubenswrapper[4939]: E0318 15:40:01.260536 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:40:02 crc kubenswrapper[4939]: I0318 15:40:02.132643 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:02 crc kubenswrapper[4939]: I0318 15:40:02.132643 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:02 crc kubenswrapper[4939]: I0318 15:40:02.132671 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:02 crc kubenswrapper[4939]: I0318 15:40:02.132828 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:02 crc kubenswrapper[4939]: E0318 15:40:02.133055 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:02 crc kubenswrapper[4939]: E0318 15:40:02.133241 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:02 crc kubenswrapper[4939]: E0318 15:40:02.133350 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:02 crc kubenswrapper[4939]: E0318 15:40:02.133419 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:04 crc kubenswrapper[4939]: I0318 15:40:04.132546 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:04 crc kubenswrapper[4939]: I0318 15:40:04.132570 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:04 crc kubenswrapper[4939]: E0318 15:40:04.132696 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:04 crc kubenswrapper[4939]: E0318 15:40:04.132825 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:04 crc kubenswrapper[4939]: I0318 15:40:04.133714 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:04 crc kubenswrapper[4939]: I0318 15:40:04.133748 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:04 crc kubenswrapper[4939]: E0318 15:40:04.133875 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:04 crc kubenswrapper[4939]: E0318 15:40:04.133921 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.101405 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.101470 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.101487 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.101542 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.101562 4939 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:40:06Z","lastTransitionTime":"2026-03-18T15:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.132138 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.132221 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:06 crc kubenswrapper[4939]: E0318 15:40:06.132258 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.132496 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:06 crc kubenswrapper[4939]: E0318 15:40:06.132584 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:06 crc kubenswrapper[4939]: E0318 15:40:06.132692 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.132975 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:40:06 crc kubenswrapper[4939]: E0318 15:40:06.133184 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.133639 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:06 crc kubenswrapper[4939]: E0318 15:40:06.133836 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.160558 4939 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.166949 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=30.166925393 podStartE2EDuration="30.166925393s" podCreationTimestamp="2026-03-18 15:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.153364628 +0000 UTC m=+170.752552279" watchObservedRunningTime="2026-03-18 15:40:06.166925393 +0000 UTC m=+170.766113014" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.167184 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr"] Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.167627 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.169840 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.170334 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.171157 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.172031 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podStartSLOduration=110.17201254 podStartE2EDuration="1m50.17201254s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.169660772 +0000 UTC m=+170.768848463" watchObservedRunningTime="2026-03-18 15:40:06.17201254 +0000 UTC m=+170.771200161" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.175080 4939 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.175818 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.214027 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x2ztl" podStartSLOduration=110.213997321 podStartE2EDuration="1m50.213997321s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.200356194 +0000 UTC m=+170.799543815" watchObservedRunningTime="2026-03-18 15:40:06.213997321 +0000 UTC m=+170.813184992" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.242074 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4ptxp" podStartSLOduration=110.242055356 podStartE2EDuration="1m50.242055356s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.225853665 +0000 UTC m=+170.825041286" watchObservedRunningTime="2026-03-18 15:40:06.242055356 +0000 UTC m=+170.841242977" Mar 18 15:40:06 crc kubenswrapper[4939]: E0318 15:40:06.262004 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.270217 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xmzwg" podStartSLOduration=110.270198594 podStartE2EDuration="1m50.270198594s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.243329803 +0000 UTC m=+170.842517464" watchObservedRunningTime="2026-03-18 15:40:06.270198594 +0000 UTC m=+170.869386215" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.280316 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v6zhh" podStartSLOduration=110.280302748 podStartE2EDuration="1m50.280302748s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.280064691 +0000 UTC m=+170.879252322" watchObservedRunningTime="2026-03-18 15:40:06.280302748 +0000 UTC m=+170.879490369" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.292763 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=35.29274713 podStartE2EDuration="35.29274713s" podCreationTimestamp="2026-03-18 15:39:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.292438341 +0000 UTC m=+170.891625952" watchObservedRunningTime="2026-03-18 15:40:06.29274713 +0000 UTC m=+170.891934751" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.301235 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/365d0573-86d9-4320-b159-8d10af9c6b7f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.301277 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365d0573-86d9-4320-b159-8d10af9c6b7f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.301312 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/365d0573-86d9-4320-b159-8d10af9c6b7f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.301612 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/365d0573-86d9-4320-b159-8d10af9c6b7f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.301678 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365d0573-86d9-4320-b159-8d10af9c6b7f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.377069 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-49sqv" podStartSLOduration=110.37705476 podStartE2EDuration="1m50.37705476s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.363711592 +0000 UTC m=+170.962899213" watchObservedRunningTime="2026-03-18 15:40:06.37705476 +0000 UTC m=+170.976242371" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.400877 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=49.400861332 podStartE2EDuration="49.400861332s" podCreationTimestamp="2026-03-18 15:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.399585315 +0000 UTC m=+170.998772956" watchObservedRunningTime="2026-03-18 15:40:06.400861332 +0000 UTC m=+171.000048953" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.402150 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/365d0573-86d9-4320-b159-8d10af9c6b7f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.402197 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365d0573-86d9-4320-b159-8d10af9c6b7f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.402231 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/365d0573-86d9-4320-b159-8d10af9c6b7f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.402257 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/365d0573-86d9-4320-b159-8d10af9c6b7f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.402275 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365d0573-86d9-4320-b159-8d10af9c6b7f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.402569 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/365d0573-86d9-4320-b159-8d10af9c6b7f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.402597 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/365d0573-86d9-4320-b159-8d10af9c6b7f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.403408 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/365d0573-86d9-4320-b159-8d10af9c6b7f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.409196 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365d0573-86d9-4320-b159-8d10af9c6b7f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.421872 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/365d0573-86d9-4320-b159-8d10af9c6b7f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4hsvr\" (UID: \"365d0573-86d9-4320-b159-8d10af9c6b7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.429067 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=82.429049851 podStartE2EDuration="1m22.429049851s" podCreationTimestamp="2026-03-18 15:38:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.427585459 +0000 UTC m=+171.026773090" watchObservedRunningTime="2026-03-18 15:40:06.429049851 +0000 UTC m=+171.028237472" Mar 18 15:40:06 crc kubenswrapper[4939]: I0318 15:40:06.440968 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=84.440951187 podStartE2EDuration="1m24.440951187s" podCreationTimestamp="2026-03-18 15:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:06.440175605 +0000 UTC m=+171.039363226" watchObservedRunningTime="2026-03-18 15:40:06.440951187 +0000 UTC m=+171.040138808" Mar 18 15:40:06 
crc kubenswrapper[4939]: I0318 15:40:06.493445 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" Mar 18 15:40:06 crc kubenswrapper[4939]: W0318 15:40:06.515240 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod365d0573_86d9_4320_b159_8d10af9c6b7f.slice/crio-e898d54d7761083c8a97e16ea84a71e7bc86823959333c0f9f38d5c57852d232 WatchSource:0}: Error finding container e898d54d7761083c8a97e16ea84a71e7bc86823959333c0f9f38d5c57852d232: Status 404 returned error can't find the container with id e898d54d7761083c8a97e16ea84a71e7bc86823959333c0f9f38d5c57852d232 Mar 18 15:40:07 crc kubenswrapper[4939]: I0318 15:40:07.048480 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" event={"ID":"365d0573-86d9-4320-b159-8d10af9c6b7f","Type":"ContainerStarted","Data":"1a1bc2de8418497aa6065b48149902fb86a9f391d97dc0fe6d8fd0871403100e"} Mar 18 15:40:07 crc kubenswrapper[4939]: I0318 15:40:07.048582 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" event={"ID":"365d0573-86d9-4320-b159-8d10af9c6b7f","Type":"ContainerStarted","Data":"e898d54d7761083c8a97e16ea84a71e7bc86823959333c0f9f38d5c57852d232"} Mar 18 15:40:07 crc kubenswrapper[4939]: I0318 15:40:07.068554 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4hsvr" podStartSLOduration=111.068532349 podStartE2EDuration="1m51.068532349s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:07.064078659 +0000 UTC m=+171.663266280" watchObservedRunningTime="2026-03-18 15:40:07.068532349 +0000 UTC m=+171.667719970" Mar 18 15:40:08 crc kubenswrapper[4939]: I0318 15:40:08.132783 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:08 crc kubenswrapper[4939]: I0318 15:40:08.132784 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:08 crc kubenswrapper[4939]: E0318 15:40:08.133181 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:08 crc kubenswrapper[4939]: I0318 15:40:08.132941 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:08 crc kubenswrapper[4939]: I0318 15:40:08.132887 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:08 crc kubenswrapper[4939]: E0318 15:40:08.133491 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:08 crc kubenswrapper[4939]: E0318 15:40:08.133617 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:08 crc kubenswrapper[4939]: E0318 15:40:08.133683 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:10 crc kubenswrapper[4939]: I0318 15:40:10.132718 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:10 crc kubenswrapper[4939]: I0318 15:40:10.132781 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:10 crc kubenswrapper[4939]: I0318 15:40:10.132717 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:10 crc kubenswrapper[4939]: E0318 15:40:10.132886 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:10 crc kubenswrapper[4939]: I0318 15:40:10.132985 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:10 crc kubenswrapper[4939]: E0318 15:40:10.133112 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:10 crc kubenswrapper[4939]: E0318 15:40:10.133202 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:10 crc kubenswrapper[4939]: E0318 15:40:10.133295 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:11 crc kubenswrapper[4939]: E0318 15:40:11.263462 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:40:12 crc kubenswrapper[4939]: I0318 15:40:12.132597 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:12 crc kubenswrapper[4939]: I0318 15:40:12.132614 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:12 crc kubenswrapper[4939]: I0318 15:40:12.132688 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:12 crc kubenswrapper[4939]: I0318 15:40:12.133010 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:12 crc kubenswrapper[4939]: E0318 15:40:12.133242 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:12 crc kubenswrapper[4939]: E0318 15:40:12.133367 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:12 crc kubenswrapper[4939]: E0318 15:40:12.133489 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:12 crc kubenswrapper[4939]: E0318 15:40:12.133729 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:12 crc kubenswrapper[4939]: I0318 15:40:12.167560 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:12 crc kubenswrapper[4939]: E0318 15:40:12.167806 4939 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:40:12 crc kubenswrapper[4939]: E0318 15:40:12.167915 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs podName:4df63d3d-7b3a-46ad-a343-a25e1986fb5e nodeName:}" failed. No retries permitted until 2026-03-18 15:41:16.167881975 +0000 UTC m=+240.767069636 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs") pod "network-metrics-daemon-zxrzw" (UID: "4df63d3d-7b3a-46ad-a343-a25e1986fb5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:40:14 crc kubenswrapper[4939]: I0318 15:40:14.133270 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:14 crc kubenswrapper[4939]: I0318 15:40:14.133326 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:14 crc kubenswrapper[4939]: E0318 15:40:14.133409 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:14 crc kubenswrapper[4939]: I0318 15:40:14.133270 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:14 crc kubenswrapper[4939]: I0318 15:40:14.133489 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:14 crc kubenswrapper[4939]: E0318 15:40:14.133561 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:14 crc kubenswrapper[4939]: E0318 15:40:14.133732 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:14 crc kubenswrapper[4939]: E0318 15:40:14.133973 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:16 crc kubenswrapper[4939]: I0318 15:40:16.132407 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:16 crc kubenswrapper[4939]: I0318 15:40:16.132411 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:16 crc kubenswrapper[4939]: I0318 15:40:16.132534 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:16 crc kubenswrapper[4939]: I0318 15:40:16.133311 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:16 crc kubenswrapper[4939]: E0318 15:40:16.133405 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:16 crc kubenswrapper[4939]: E0318 15:40:16.133682 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:16 crc kubenswrapper[4939]: E0318 15:40:16.133900 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:16 crc kubenswrapper[4939]: E0318 15:40:16.133924 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:16 crc kubenswrapper[4939]: E0318 15:40:16.264007 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:40:18 crc kubenswrapper[4939]: I0318 15:40:18.132410 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:18 crc kubenswrapper[4939]: E0318 15:40:18.132921 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:18 crc kubenswrapper[4939]: I0318 15:40:18.132715 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:18 crc kubenswrapper[4939]: E0318 15:40:18.133038 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:18 crc kubenswrapper[4939]: I0318 15:40:18.132755 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:18 crc kubenswrapper[4939]: E0318 15:40:18.133123 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:18 crc kubenswrapper[4939]: I0318 15:40:18.132540 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:18 crc kubenswrapper[4939]: E0318 15:40:18.133202 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:20 crc kubenswrapper[4939]: I0318 15:40:20.132334 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:20 crc kubenswrapper[4939]: I0318 15:40:20.132394 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:20 crc kubenswrapper[4939]: E0318 15:40:20.132497 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:20 crc kubenswrapper[4939]: I0318 15:40:20.132601 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:20 crc kubenswrapper[4939]: I0318 15:40:20.132680 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:20 crc kubenswrapper[4939]: E0318 15:40:20.132794 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:20 crc kubenswrapper[4939]: E0318 15:40:20.133307 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:20 crc kubenswrapper[4939]: E0318 15:40:20.133385 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:20 crc kubenswrapper[4939]: I0318 15:40:20.133590 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:40:20 crc kubenswrapper[4939]: E0318 15:40:20.133770 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l79pv_openshift-ovn-kubernetes(acafcc67-568f-415b-b907-c1de4c851fa7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" Mar 18 15:40:21 crc kubenswrapper[4939]: E0318 15:40:21.265389 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:40:22 crc kubenswrapper[4939]: I0318 15:40:22.132619 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:22 crc kubenswrapper[4939]: I0318 15:40:22.132668 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:22 crc kubenswrapper[4939]: I0318 15:40:22.132756 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:22 crc kubenswrapper[4939]: E0318 15:40:22.132815 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:22 crc kubenswrapper[4939]: I0318 15:40:22.132970 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:22 crc kubenswrapper[4939]: E0318 15:40:22.132959 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:22 crc kubenswrapper[4939]: E0318 15:40:22.133033 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:22 crc kubenswrapper[4939]: E0318 15:40:22.133113 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:24 crc kubenswrapper[4939]: I0318 15:40:24.132218 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:24 crc kubenswrapper[4939]: I0318 15:40:24.132430 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:24 crc kubenswrapper[4939]: I0318 15:40:24.132581 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:24 crc kubenswrapper[4939]: E0318 15:40:24.132469 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:24 crc kubenswrapper[4939]: I0318 15:40:24.132218 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:24 crc kubenswrapper[4939]: E0318 15:40:24.132695 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:24 crc kubenswrapper[4939]: E0318 15:40:24.132803 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:24 crc kubenswrapper[4939]: E0318 15:40:24.132905 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:26 crc kubenswrapper[4939]: I0318 15:40:26.132286 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:26 crc kubenswrapper[4939]: I0318 15:40:26.132366 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:26 crc kubenswrapper[4939]: E0318 15:40:26.134331 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:26 crc kubenswrapper[4939]: I0318 15:40:26.134386 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:26 crc kubenswrapper[4939]: E0318 15:40:26.134603 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:26 crc kubenswrapper[4939]: I0318 15:40:26.134675 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:26 crc kubenswrapper[4939]: E0318 15:40:26.134752 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:26 crc kubenswrapper[4939]: E0318 15:40:26.134908 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:26 crc kubenswrapper[4939]: E0318 15:40:26.266477 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
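The same two messages repeat on a two-second cycle for the same four pods for as long as the CNI config in /etc/kubernetes/cni/net.d/ is absent. A minimal Python sketch to summarize the flood, assuming the journal above has been saved one entry per line as journal.log (a hypothetical filename):

#!/usr/bin/env python3
"""Count, per pod, how many "Error syncing pod, skipping" entries
kubenswrapper emitted while the CNI config was missing."""
import re
from collections import Counter

PAT = re.compile(r'"Error syncing pod, skipping".*?pod="([^"]+)"')

counts = Counter()
with open("journal.log", encoding="utf-8") as fh:
    for line in fh:
        m = PAT.search(line)
        if m:
            counts[m.group(1)] += 1

for pod, n in counts.most_common():
    print(f"{n:5d}  {pod}")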
Mar 18 15:40:27 crc kubenswrapper[4939]: I0318 15:40:27.154200 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/1.log"
Mar 18 15:40:27 crc kubenswrapper[4939]: I0318 15:40:27.154683 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/0.log"
Mar 18 15:40:27 crc kubenswrapper[4939]: I0318 15:40:27.154727 4939 generic.go:334] "Generic (PLEG): container finished" podID="6693c593-9b18-435e-8a3a-91d3e33c3c51" containerID="afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e" exitCode=1
Mar 18 15:40:27 crc kubenswrapper[4939]: I0318 15:40:27.154787 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerDied","Data":"afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e"}
Mar 18 15:40:27 crc kubenswrapper[4939]: I0318 15:40:27.154867 4939 scope.go:117] "RemoveContainer" containerID="428e56c11c25e025d14735cb5a38b52349e6b8340fdbb2bf2a40288dd72ae935"
Mar 18 15:40:27 crc kubenswrapper[4939]: I0318 15:40:27.155472 4939 scope.go:117] "RemoveContainer" containerID="afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e"
Mar 18 15:40:27 crc kubenswrapper[4939]: E0318 15:40:27.155784 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xmzwg_openshift-multus(6693c593-9b18-435e-8a3a-91d3e33c3c51)\"" pod="openshift-multus/multus-xmzwg" podUID="6693c593-9b18-435e-8a3a-91d3e33c3c51"
Mar 18 15:40:28 crc kubenswrapper[4939]: I0318 15:40:28.132287 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:28 crc kubenswrapper[4939]: I0318 15:40:28.132380 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:28 crc kubenswrapper[4939]: I0318 15:40:28.132381 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:28 crc kubenswrapper[4939]: E0318 15:40:28.132526 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:28 crc kubenswrapper[4939]: I0318 15:40:28.132544 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:28 crc kubenswrapper[4939]: E0318 15:40:28.132642 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:28 crc kubenswrapper[4939]: E0318 15:40:28.132785 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:40:28 crc kubenswrapper[4939]: E0318 15:40:28.132957 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:40:28 crc kubenswrapper[4939]: I0318 15:40:28.161146 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/1.log"
Mar 18 15:40:30 crc kubenswrapper[4939]: I0318 15:40:30.133138 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:30 crc kubenswrapper[4939]: I0318 15:40:30.133803 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:30 crc kubenswrapper[4939]: I0318 15:40:30.134041 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:30 crc kubenswrapper[4939]: E0318 15:40:30.134034 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:30 crc kubenswrapper[4939]: I0318 15:40:30.134123 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:30 crc kubenswrapper[4939]: E0318 15:40:30.134281 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:30 crc kubenswrapper[4939]: E0318 15:40:30.134650 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:31 crc kubenswrapper[4939]: I0318 15:40:31.133796 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:40:31 crc kubenswrapper[4939]: E0318 15:40:31.268055 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.044404 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zxrzw"] Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.044531 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:32 crc kubenswrapper[4939]: E0318 15:40:32.044629 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.132467 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.132535 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:32 crc kubenswrapper[4939]: E0318 15:40:32.132680 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:32 crc kubenswrapper[4939]: E0318 15:40:32.132887 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.132920 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:32 crc kubenswrapper[4939]: E0318 15:40:32.133144 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.186747 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/3.log" Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.189497 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerStarted","Data":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.190067 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:40:32 crc kubenswrapper[4939]: I0318 15:40:32.251146 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podStartSLOduration=136.251122106 podStartE2EDuration="2m16.251122106s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:32.24678609 +0000 UTC m=+196.845973751" watchObservedRunningTime="2026-03-18 15:40:32.251122106 +0000 UTC m=+196.850309757" Mar 18 15:40:33 crc kubenswrapper[4939]: I0318 15:40:33.132119 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:40:33 crc kubenswrapper[4939]: E0318 15:40:33.132254 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e" Mar 18 15:40:34 crc kubenswrapper[4939]: I0318 15:40:34.133416 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:34 crc kubenswrapper[4939]: I0318 15:40:34.133759 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:34 crc kubenswrapper[4939]: E0318 15:40:34.133820 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:40:34 crc kubenswrapper[4939]: I0318 15:40:34.133717 4939 util.go:30] "No sandbox for pod can be found. 
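The ovnkube-controller restart at 15:40:32 succeeds, and the kubelet records the pod's startup latency as a single key=value entry (2m16s from creation at 15:38:16 to running). A hedged Python sketch to unpack that entry; `entry` is the raw journal text of the single "Observed pod startup duration" line:

import re

# Generic key=value / key="quoted value" splitter for kubelet log entries.
KV = re.compile(r'(\w+)=("[^"]*"|\S+)')

def startup_fields(entry):
    return {k: v.strip('"') for k, v in KV.findall(entry)}

# startup_fields(entry)["podStartE2EDuration"] -> '2m16.251122106s'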
Mar 18 15:40:34 crc kubenswrapper[4939]: I0318 15:40:34.133717 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:34 crc kubenswrapper[4939]: E0318 15:40:34.134046 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:34 crc kubenswrapper[4939]: E0318 15:40:34.134160 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:35 crc kubenswrapper[4939]: I0318 15:40:35.132471 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:35 crc kubenswrapper[4939]: E0318 15:40:35.132743 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:40:36 crc kubenswrapper[4939]: I0318 15:40:36.132304 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:36 crc kubenswrapper[4939]: I0318 15:40:36.132418 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:36 crc kubenswrapper[4939]: I0318 15:40:36.132412 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:36 crc kubenswrapper[4939]: E0318 15:40:36.135293 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:36 crc kubenswrapper[4939]: E0318 15:40:36.135439 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:40:36 crc kubenswrapper[4939]: E0318 15:40:36.135627 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:36 crc kubenswrapper[4939]: E0318 15:40:36.269101 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 15:40:37 crc kubenswrapper[4939]: I0318 15:40:37.132200 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:37 crc kubenswrapper[4939]: E0318 15:40:37.132407 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:40:38 crc kubenswrapper[4939]: I0318 15:40:38.133169 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:38 crc kubenswrapper[4939]: I0318 15:40:38.133226 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:38 crc kubenswrapper[4939]: I0318 15:40:38.133167 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:38 crc kubenswrapper[4939]: E0318 15:40:38.133348 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:38 crc kubenswrapper[4939]: E0318 15:40:38.133594 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:38 crc kubenswrapper[4939]: E0318 15:40:38.133809 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:40:39 crc kubenswrapper[4939]: I0318 15:40:39.132912 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:39 crc kubenswrapper[4939]: E0318 15:40:39.133075 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.133239 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.133417 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.133661 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.133920 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.134344 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.134453 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.163851 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.164039 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:42:42.163995958 +0000 UTC m=+326.763183619 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.265928 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.266029 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.266105 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:40 crc kubenswrapper[4939]: I0318 15:40:40.266169 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266293 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266340 4939 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266369 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266395 4939 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266394 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266433 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:42:42.266400945 +0000 UTC m=+326.865588606 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266438 4939 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266292 4939 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266464 4939 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266526 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:42:42.266461747 +0000 UTC m=+326.865649548 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266567 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:42:42.266543709 +0000 UTC m=+326.865731370 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:40:40 crc kubenswrapper[4939]: E0318 15:40:40.266631 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:42:42.266606991 +0000 UTC m=+326.865794662 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:40:41 crc kubenswrapper[4939]: I0318 15:40:41.132137 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:41 crc kubenswrapper[4939]: E0318 15:40:41.132391 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:40:41 crc kubenswrapper[4939]: E0318 15:40:41.271013 4939 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 15:40:42 crc kubenswrapper[4939]: I0318 15:40:42.133201 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:42 crc kubenswrapper[4939]: I0318 15:40:42.133354 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:42 crc kubenswrapper[4939]: I0318 15:40:42.133458 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:42 crc kubenswrapper[4939]: E0318 15:40:42.133694 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:42 crc kubenswrapper[4939]: E0318 15:40:42.133829 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:42 crc kubenswrapper[4939]: I0318 15:40:42.133953 4939 scope.go:117] "RemoveContainer" containerID="afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e"
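Every failed volume operation above has been pushed out to its nestedpendingoperations backoff cap: 2m2s before retry for the hostpath-provisioner PVC, nginx-conf, both kube-api-access tokens, and the console-plugin cert. A hedged Python sketch to collect that retry schedule; `lines` is any iterable of journal lines:

import re

# Pulls (volume, operation, delay, retry-deadline) from the
# "No retries permitted until ..." entries.
RETRY = re.compile(
    r'No retries permitted until (\S+ \S+) .*?\(durationBeforeRetry (\S+)\)\.'
    r' Error: (\w+\.\w+) failed for volume "([^"]+)"')

def retry_schedule(lines):
    for line in lines:
        m = RETRY.search(line)
        if m:
            until, delay, op, volume = m.groups()
            yield volume, op, delay, until

# e.g. ('nginx-conf', 'MountVolume.SetUp', '2m2s', '2026-03-18 15:42:42.266400945')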
Mar 18 15:40:42 crc kubenswrapper[4939]: E0318 15:40:42.134120 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:40:43 crc kubenswrapper[4939]: I0318 15:40:43.132974 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:43 crc kubenswrapper[4939]: E0318 15:40:43.133214 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:40:43 crc kubenswrapper[4939]: I0318 15:40:43.242348 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/1.log"
Mar 18 15:40:43 crc kubenswrapper[4939]: I0318 15:40:43.242478 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerStarted","Data":"8c43b687c4784f3290b1295a4c03e01915b1b5c21440cdee5e547995c7fc926a"}
Mar 18 15:40:44 crc kubenswrapper[4939]: I0318 15:40:44.133253 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:44 crc kubenswrapper[4939]: I0318 15:40:44.133306 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:44 crc kubenswrapper[4939]: E0318 15:40:44.133448 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:44 crc kubenswrapper[4939]: I0318 15:40:44.133465 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:44 crc kubenswrapper[4939]: E0318 15:40:44.133791 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:44 crc kubenswrapper[4939]: E0318 15:40:44.133876 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:40:45 crc kubenswrapper[4939]: I0318 15:40:45.132394 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:45 crc kubenswrapper[4939]: E0318 15:40:45.133209 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zxrzw" podUID="4df63d3d-7b3a-46ad-a343-a25e1986fb5e"
Mar 18 15:40:46 crc kubenswrapper[4939]: I0318 15:40:46.132955 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:40:46 crc kubenswrapper[4939]: I0318 15:40:46.133060 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:40:46 crc kubenswrapper[4939]: E0318 15:40:46.134853 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:40:46 crc kubenswrapper[4939]: I0318 15:40:46.134881 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:40:46 crc kubenswrapper[4939]: E0318 15:40:46.134968 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:40:46 crc kubenswrapper[4939]: E0318 15:40:46.135070 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
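The kube-multus restart cycle closes here: container afe692d7... died at 15:40:27 and its replacement 8c43b687... started at 15:40:43, i.e. after the 10s CrashLoopBackOff window plus the next sync. A hedged Python sketch that pairs the PLEG ContainerDied/ContainerStarted events per pod; `lines` is any iterable of journal lines:

import re
from collections import defaultdict

EVT = re.compile(r'^Mar 18 (\S+) .*event for pod" pod="([^"]+)" '
                 r'event=.*"Type":"(ContainerDied|ContainerStarted)"'
                 r',"Data":"([0-9a-f]+)"')

def lifecycle(lines):
    # pod -> [(timestamp, event type, short container ID), ...]
    history = defaultdict(list)
    for line in lines:
        m = EVT.search(line)
        if m:
            ts, pod, typ, cid = m.groups()
            history[pod].append((ts, typ, cid[:12]))
    return history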
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.132761 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.137626 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.137652 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.269360 4939 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.315038 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.315950 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.317573 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.318295 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z46b4"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.318947 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.319612 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.321664 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.321802 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.322309 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.323133 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.323794 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.324390 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.328269 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.328778 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.332093 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.332986 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.334147 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.335266 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.335552 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.336379 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.336635 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.336690 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.336816 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.336958 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.337320 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.337494 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.341394 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.342052 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.346213 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-txf85"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.347610 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.348133 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.348320 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.352036 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.352136 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.352321 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.352429 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.352687 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.355161 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffvz"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.355191 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.355593 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.355816 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf"]
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.356381 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf"
Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.356612 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.358017 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.358122 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-client-ca\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.358120 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.358188 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-config\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.358234 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-serving-cert\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.358139 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.358264 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvzj\" (UniqueName: \"kubernetes.io/projected/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-kube-api-access-bfvzj\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.360609 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.361003 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.361054 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.361183 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.361673 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dcfh2"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.362121 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.362557 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.367066 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.367313 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.367479 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.377988 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9pp46"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.381618 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.382235 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.387197 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.388161 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.418931 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2lc49"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.419467 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.419740 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.421562 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.422617 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.422746 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.422800 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.422855 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.422918 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.423800 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.424192 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.424408 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.424491 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.425299 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.425450 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.425655 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.425829 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.428553 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.429212 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.431642 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.432009 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.432243 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.432481 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.432650 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.432795 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.432963 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.438387 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7gmkn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.438712 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.438910 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.439247 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7gmkn" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.439355 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.439382 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vqqr2"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.439610 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.439797 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.440407 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.440416 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.440607 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.442315 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t62m5"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.442586 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.443100 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.443368 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.443548 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.444763 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.445367 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.445572 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.445934 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.451093 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.458827 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvzj\" (UniqueName: \"kubernetes.io/projected/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-kube-api-access-bfvzj\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.458862 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-serving-cert\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.458898 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-client-ca\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.458947 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-config\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.462616 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.463671 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.463860 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-config\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.464671 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-client-ca\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.465033 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.465777 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.465997 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.466184 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-serving-cert\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.465784 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.465871 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.466551 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.467010 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.468191 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.468332 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.468587 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.469359 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.470870 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.470993 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.471620 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.471794 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.471938 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472030 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472110 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472275 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472332 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472368 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: 
I0318 15:40:47.472382 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472419 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472277 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472544 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472574 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472591 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472607 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nqdwc"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472606 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472907 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472659 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.472694 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.473277 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.474476 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.474692 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.486624 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9255f"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.487962 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.491462 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.493428 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.498099 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.511854 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.513126 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.513813 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.514534 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.516064 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.516094 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.516183 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.516468 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.519233 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.520332 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v4pxp"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.521088 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.521616 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9qhf4"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.522029 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.525970 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.532518 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.532564 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.533222 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.533445 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.534037 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.534648 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.534945 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qv8l5"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.537015 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.537388 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.537482 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.538205 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.539281 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vttdd"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.540017 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.540228 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.540601 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.540781 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.540804 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.540936 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dz6vg"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.541168 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.541311 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.541534 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.541561 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.541935 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9pp46"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.541994 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.542145 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.543279 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-txf85"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.543718 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.545969 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dcfh2"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.546045 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7rvrq"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.546929 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.547014 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.548317 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffvz"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.553414 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2lc49"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.553456 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.557298 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vqqr2"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.557864 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.558940 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z46b4"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.559522 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.560562 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.562634 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.562943 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.568009 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.570424 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.572323 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nqdwc"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.575946 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.577099 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t62m5"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.578405 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.578863 4939 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.580148 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9255f"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.581282 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.583581 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.584927 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-49228"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.586033 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-49228" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.588580 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5fdwn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.590388 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.590532 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.591211 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.592381 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7gmkn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.593558 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.594563 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vttdd"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.595784 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v4pxp"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.596793 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qv8l5"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.597831 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.598122 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dz6vg"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.599603 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.601128 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-49228"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.602168 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.603967 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.605129 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5fdwn"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.606298 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.607364 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pt54n"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.608463 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pt54n"] Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.608667 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.619672 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.639416 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.658566 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.678921 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.758815 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.761416 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvzj\" (UniqueName: \"kubernetes.io/projected/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-kube-api-access-bfvzj\") pod \"route-controller-manager-6576b87f9c-8jwp6\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.778702 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.798097 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.819366 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.843199 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.858093 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.878469 4939 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.899451 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.918748 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.939707 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.954273 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.959143 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.985722 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 15:40:47 crc kubenswrapper[4939]: I0318 15:40:47.999086 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.018225 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.038488 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.058463 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.078559 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.100615 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.119225 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.132804 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.132846 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.133170 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.138992 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.158679 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.177427 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"] Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.177876 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 15:40:48 crc kubenswrapper[4939]: W0318 15:40:48.192692 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0afb8cbc_8a92_4ee3_9541_ff4d2faa0976.slice/crio-4c922e95f654a368eeae2e9797d487893d430a61e39018e8a09a7ce75a901622 WatchSource:0}: Error finding container 4c922e95f654a368eeae2e9797d487893d430a61e39018e8a09a7ce75a901622: Status 404 returned error can't find the container with id 4c922e95f654a368eeae2e9797d487893d430a61e39018e8a09a7ce75a901622 Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.199335 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.219600 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.239813 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.259127 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.262554 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" event={"ID":"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976","Type":"ContainerStarted","Data":"4c922e95f654a368eeae2e9797d487893d430a61e39018e8a09a7ce75a901622"} Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.280178 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.299193 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.319980 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.339106 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.360088 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 
18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.379991 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.399151 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.419784 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.439566 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.459447 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.479645 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.499624 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.519809 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.540324 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.557021 4939 request.go:700] Waited for 1.019639866s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-stats-default&limit=500&resourceVersion=0
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.559931 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.579373 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.599028 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.619101 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.639895 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.660179 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.678849 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.699800 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.719788 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.739340 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.759640 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.779755 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.799656 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.820671 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.849827 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.858938 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.880267 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.899837 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.919121 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.939283 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.959542 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 15:40:48 crc kubenswrapper[4939]: I0318 15:40:48.979667 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.000025 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.017848 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.038192 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.059022 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.080145 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.100325 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.119076 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.139768 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.159445 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.178483 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.199745 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.219137 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.239649 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.259209 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.267288 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" event={"ID":"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976","Type":"ContainerStarted","Data":"dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7"}
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.267654 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.279624 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.299627 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.319142 4939 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.338056 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.338389 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.360360 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.378945 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.399162 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.420107 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.479934 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485189 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485246 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4306eb36-80d5-404b-8909-dd446ee88230-config\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485284 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/134347ee-9f68-45f7-b66c-dc2493eac221-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6fdjn\" (UID: \"134347ee-9f68-45f7-b66c-dc2493eac221\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485320 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485357 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51e9250d-312d-4e6b-9a21-bfe25d4533ff-metrics-tls\") pod \"dns-operator-744455d44c-2lc49\" (UID: \"51e9250d-312d-4e6b-9a21-bfe25d4533ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-2lc49"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485408 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-machine-approver-tls\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485437 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2kq\" (UniqueName: \"kubernetes.io/projected/4306eb36-80d5-404b-8909-dd446ee88230-kube-api-access-4q2kq\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485472 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-etcd-client\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485531 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-serving-cert\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485571 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b356cad2-134f-4910-875f-71c38fda3cee-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485609 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-tls\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485641 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsq5n\" (UniqueName: \"kubernetes.io/projected/e65663eb-5c53-4a9f-81d7-6356a33dc7b7-kube-api-access-nsq5n\") pod \"downloads-7954f5f757-7gmkn\" (UID: \"e65663eb-5c53-4a9f-81d7-6356a33dc7b7\") " pod="openshift-console/downloads-7954f5f757-7gmkn"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485705 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485754 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc2961e-26df-420a-9e5b-8abd43b85b2a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485834 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dpx\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-kube-api-access-d7dpx\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485879 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485930 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b97b6e8-80c7-4467-a4d1-9e4848ace365-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.485994 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e375e2a-ef39-4b07-aa49-1a498266a487-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486026 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-audit-policies\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486064 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a286474-0f76-4707-b476-0e30a49fcc32-config\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486094 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1757e629-858a-4c03-8a61-119198d1dc83-audit-dir\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486125 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4306eb36-80d5-404b-8909-dd446ee88230-trusted-ca\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486157 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42cj\" (UniqueName: \"kubernetes.io/projected/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-kube-api-access-q42cj\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486215 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqtr\" (UniqueName: \"kubernetes.io/projected/7e375e2a-ef39-4b07-aa49-1a498266a487-kube-api-access-9jqtr\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486252 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm9bg\" (UniqueName: \"kubernetes.io/projected/3dc2961e-26df-420a-9e5b-8abd43b85b2a-kube-api-access-rm9bg\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486284 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0048288d-ec58-4cf8-a68a-b73b98db9d01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486314 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486355 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b97b6e8-80c7-4467-a4d1-9e4848ace365-config\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486388 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dc2961e-26df-420a-9e5b-8abd43b85b2a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486410 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-policies\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486456 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e375e2a-ef39-4b07-aa49-1a498266a487-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486477 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-config\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486498 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lwss\" (UniqueName: \"kubernetes.io/projected/134347ee-9f68-45f7-b66c-dc2493eac221-kube-api-access-9lwss\") pod \"control-plane-machine-set-operator-78cbb6b69f-6fdjn\" (UID: \"134347ee-9f68-45f7-b66c-dc2493eac221\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486547 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-audit\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486583 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486613 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/512c9f47-351e-4c5a-9119-f25a3500fc6e-serving-cert\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486635 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486672 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-config\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486692 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486712 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfm2\" (UniqueName: \"kubernetes.io/projected/1757e629-858a-4c03-8a61-119198d1dc83-kube-api-access-hxfm2\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486735 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-node-pullsecrets\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486790 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b356cad2-134f-4910-875f-71c38fda3cee-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.486883 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ldh4\" (UniqueName: \"kubernetes.io/projected/b356cad2-134f-4910-875f-71c38fda3cee-kube-api-access-2ldh4\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487047 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-etcd-serving-ca\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487122 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/366ffb79-635d-4f44-b22f-3fd5a77bd022-images\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487156 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487181 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487201 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-etcd-client\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487221 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0048288d-ec58-4cf8-a68a-b73b98db9d01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487240 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-trusted-ca\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487263 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-dir\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487293 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/512c9f47-351e-4c5a-9119-f25a3500fc6e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487322 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqfbb\" (UniqueName: \"kubernetes.io/projected/512c9f47-351e-4c5a-9119-f25a3500fc6e-kube-api-access-vqfbb\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487349 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-267sb\" (UniqueName: \"kubernetes.io/projected/51e9250d-312d-4e6b-9a21-bfe25d4533ff-kube-api-access-267sb\") pod \"dns-operator-744455d44c-2lc49\" (UID: \"51e9250d-312d-4e6b-9a21-bfe25d4533ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-2lc49"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487372 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b97b6e8-80c7-4467-a4d1-9e4848ace365-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487393 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpsc\" (UniqueName: \"kubernetes.io/projected/604dd149-ccc8-492a-a624-fd3088ed3bab-kube-api-access-2hpsc\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487410 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hqb\" (UniqueName: \"kubernetes.io/projected/366ffb79-635d-4f44-b22f-3fd5a77bd022-kube-api-access-f9hqb\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487442 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbr6\" (UniqueName: \"kubernetes.io/projected/330c585e-3a67-4502-b800-7401df959334-kube-api-access-9tbr6\") pod \"auto-csr-approver-29564140-vqqr2\" (UID: \"330c585e-3a67-4502-b800-7401df959334\") " pod="openshift-infra/auto-csr-approver-29564140-vqqr2"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487459 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-serving-cert\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487476 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmz4v\" (UniqueName: \"kubernetes.io/projected/f8d184fd-e8e8-43c7-8961-fb9aa266b149-kube-api-access-cmz4v\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487492 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-image-import-ca\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487547 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487571 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-bound-sa-token\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487597 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487619 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487635 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjtv\" (UniqueName: \"kubernetes.io/projected/88bed82e-aed3-4f18-9908-f94900c9e60d-kube-api-access-7mjtv\") pod \"cluster-samples-operator-665b6dd947-pvl56\" (UID: \"88bed82e-aed3-4f18-9908-f94900c9e60d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487655 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-auth-proxy-config\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487676 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a286474-0f76-4707-b476-0e30a49fcc32-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487693 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hch6x\" (UniqueName: \"kubernetes.io/projected/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-kube-api-access-hch6x\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487722 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a286474-0f76-4707-b476-0e30a49fcc32-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487737 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366ffb79-635d-4f44-b22f-3fd5a77bd022-config\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487768 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487786 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/366ffb79-635d-4f44-b22f-3fd5a77bd022-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487804 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-encryption-config\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487836 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgw4x\" (UniqueName: \"kubernetes.io/projected/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-kube-api-access-pgw4x\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487851 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d184fd-e8e8-43c7-8961-fb9aa266b149-serving-cert\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487867 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b356cad2-134f-4910-875f-71c38fda3cee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487882 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-serving-cert\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: E0318 15:40:49.487916 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:49.987901015 +0000 UTC m=+214.587088756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487976 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-certificates\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.487997 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-client-ca\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488015 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-config\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488035 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88bed82e-aed3-4f18-9908-f94900c9e60d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pvl56\" (UID: \"88bed82e-aed3-4f18-9908-f94900c9e60d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488087 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4306eb36-80d5-404b-8909-dd446ee88230-serving-cert\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488113 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488138 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-audit-dir\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488178 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488203 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488225 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-config\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.488266 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-encryption-config\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.497719 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.519203 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.538570 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589415 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589583 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45eb9355-906a-43cb-880b-b7790e7ef4f2-config\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589609 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpsc\" (UniqueName: \"kubernetes.io/projected/604dd149-ccc8-492a-a624-fd3088ed3bab-kube-api-access-2hpsc\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: E0318 15:40:49.589637 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.089606222 +0000 UTC m=+214.688793853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589684 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hqb\" (UniqueName: \"kubernetes.io/projected/366ffb79-635d-4f44-b22f-3fd5a77bd022-kube-api-access-f9hqb\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589726 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-serving-cert\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589759 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g467\" (UniqueName: \"kubernetes.io/projected/8f070ed7-44cb-4a09-9f68-0efde6f58169-kube-api-access-5g467\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589789 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589814 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-bound-sa-token\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589837 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z2cx\" (UniqueName: \"kubernetes.io/projected/45eb9355-906a-43cb-880b-b7790e7ef4f2-kube-api-access-5z2cx\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589863 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-ca\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589886 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589910 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjtv\" (UniqueName: \"kubernetes.io/projected/88bed82e-aed3-4f18-9908-f94900c9e60d-kube-api-access-7mjtv\") pod \"cluster-samples-operator-665b6dd947-pvl56\" (UID: \"88bed82e-aed3-4f18-9908-f94900c9e60d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589933 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-service-ca\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589955 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkl2f\" (UniqueName: \"kubernetes.io/projected/45453fff-856e-44f4-b05d-d00a91d00429-kube-api-access-xkl2f\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.589979 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-auth-proxy-config\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590001 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-plugins-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590025 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hch6x\" (UniqueName: \"kubernetes.io/projected/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-kube-api-access-hch6x\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590050 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a286474-0f76-4707-b476-0e30a49fcc32-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590072 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7s4\" (UniqueName: \"kubernetes.io/projected/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-kube-api-access-ml7s4\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590097 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590120 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzm4s\" (UniqueName: \"kubernetes.io/projected/03bc0ce6-d425-4864-883a-fdc9d6e2a460-kube-api-access-mzm4s\") pod \"multus-admission-controller-857f4d67dd-dz6vg\" (UID: \"03bc0ce6-d425-4864-883a-fdc9d6e2a460\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590140 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-config\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590167 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-encryption-config\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590191 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-metrics-tls\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590217 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d184fd-e8e8-43c7-8961-fb9aa266b149-serving-cert\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590241 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b356cad2-134f-4910-875f-71c38fda3cee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590262 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-serving-cert\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590285 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590308 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgw4x\" (UniqueName: \"kubernetes.io/projected/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-kube-api-access-pgw4x\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590331 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlfv\" (UniqueName: \"kubernetes.io/projected/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-kube-api-access-7tlfv\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590356 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c"
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590382 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/54ea90ec-9918-4a13-ae59-b986b0f06c66-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590405 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-client-ca\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590429 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4306eb36-80d5-404b-8909-dd446ee88230-serving-cert\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590451 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590477 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665daba4-aaad-4285-9bfb-c58983b35d2a-service-ca-bundle\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590523 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590549 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: E0318 15:40:49.590576 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.0905635 +0000 UTC m=+214.689751121 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590598 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590621 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68wbh\" (UniqueName: \"kubernetes.io/projected/2adddc7a-6b85-45dc-abf2-611a810581ad-kube-api-access-68wbh\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590640 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4306eb36-80d5-404b-8909-dd446ee88230-config\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590656 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590672 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51e9250d-312d-4e6b-9a21-bfe25d4533ff-metrics-tls\") pod \"dns-operator-744455d44c-2lc49\" (UID: \"51e9250d-312d-4e6b-9a21-bfe25d4533ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590690 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef08b3ba-1294-43dd-af5b-98550ede648f-proxy-tls\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590710 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2kq\" (UniqueName: \"kubernetes.io/projected/4306eb36-80d5-404b-8909-dd446ee88230-kube-api-access-4q2kq\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590726 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b356cad2-134f-4910-875f-71c38fda3cee-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590742 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-serving-cert\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590757 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67487f25-053f-451f-bbcb-a9487d58802e-certs\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590771 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe66d52c-4339-4661-ac23-8ab2e68e22cc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590786 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2599\" (UniqueName: \"kubernetes.io/projected/665daba4-aaad-4285-9bfb-c58983b35d2a-kube-api-access-n2599\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590802 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-tls\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590818 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc2961e-26df-420a-9e5b-8abd43b85b2a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590850 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590866 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-audit-policies\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590884 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f070ed7-44cb-4a09-9f68-0efde6f58169-apiservice-cert\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590901 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67487f25-053f-451f-bbcb-a9487d58802e-node-bootstrap-token\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590946 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dpx\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-kube-api-access-d7dpx\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590968 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1757e629-858a-4c03-8a61-119198d1dc83-audit-dir\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.590988 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jqtr\" (UniqueName: \"kubernetes.io/projected/7e375e2a-ef39-4b07-aa49-1a498266a487-kube-api-access-9jqtr\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591007 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591025 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k599h\" (UniqueName: \"kubernetes.io/projected/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-kube-api-access-k599h\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591041 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0048288d-ec58-4cf8-a68a-b73b98db9d01-installation-pull-secrets\") 
pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591056 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lwss\" (UniqueName: \"kubernetes.io/projected/134347ee-9f68-45f7-b66c-dc2493eac221-kube-api-access-9lwss\") pod \"control-plane-machine-set-operator-78cbb6b69f-6fdjn\" (UID: \"134347ee-9f68-45f7-b66c-dc2493eac221\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591071 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-mountpoint-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591086 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-policies\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591104 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591129 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-config\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591145 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-node-pullsecrets\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591163 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gttnn\" (UniqueName: \"kubernetes.io/projected/6afcb621-c97d-4bc2-a310-3eb43f3508b0-kube-api-access-gttnn\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591179 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b356cad2-134f-4910-875f-71c38fda3cee-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591198 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-etcd-serving-ca\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591214 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f070ed7-44cb-4a09-9f68-0efde6f58169-tmpfs\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-oauth-serving-cert\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591256 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/366ffb79-635d-4f44-b22f-3fd5a77bd022-images\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591272 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591286 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-etcd-client\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591303 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/512c9f47-351e-4c5a-9119-f25a3500fc6e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591320 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqfbb\" (UniqueName: \"kubernetes.io/projected/512c9f47-351e-4c5a-9119-f25a3500fc6e-kube-api-access-vqfbb\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591363 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591466 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-dir\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591490 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8gqv\" (UniqueName: \"kubernetes.io/projected/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-kube-api-access-c8gqv\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591525 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-csi-data-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591547 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tbr6\" (UniqueName: \"kubernetes.io/projected/330c585e-3a67-4502-b800-7401df959334-kube-api-access-9tbr6\") pod \"auto-csr-approver-29564140-vqqr2\" (UID: \"330c585e-3a67-4502-b800-7401df959334\") " pod="openshift-infra/auto-csr-approver-29564140-vqqr2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591565 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmz4v\" (UniqueName: \"kubernetes.io/projected/f8d184fd-e8e8-43c7-8961-fb9aa266b149-kube-api-access-cmz4v\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591579 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-image-import-ca\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591601 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prvcb\" (UniqueName: \"kubernetes.io/projected/54ea90ec-9918-4a13-ae59-b986b0f06c66-kube-api-access-prvcb\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591624 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591646 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-client\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591664 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdq24\" (UniqueName: \"kubernetes.io/projected/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-kube-api-access-cdq24\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591682 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a286474-0f76-4707-b476-0e30a49fcc32-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591698 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45eb9355-906a-43cb-880b-b7790e7ef4f2-serving-cert\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591714 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366ffb79-635d-4f44-b22f-3fd5a77bd022-config\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591729 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef08b3ba-1294-43dd-af5b-98550ede648f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591744 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cpzz\" (UniqueName: \"kubernetes.io/projected/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-kube-api-access-8cpzz\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591759 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-serving-cert\") pod 
\"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591787 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/366ffb79-635d-4f44-b22f-3fd5a77bd022-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591803 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-trusted-ca\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591819 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-default-certificate\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591834 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-secret-volume\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591849 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt7dh\" (UniqueName: \"kubernetes.io/projected/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-kube-api-access-dt7dh\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591865 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591886 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54ea90ec-9918-4a13-ae59-b986b0f06c66-images\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591908 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54ea90ec-9918-4a13-ae59-b986b0f06c66-proxy-tls\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: 
\"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591926 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88bed82e-aed3-4f18-9908-f94900c9e60d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pvl56\" (UID: \"88bed82e-aed3-4f18-9908-f94900c9e60d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591942 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45453fff-856e-44f4-b05d-d00a91d00429-config-volume\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591956 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-service-ca\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591973 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-certificates\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.591989 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-config\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.592006 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-audit-dir\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.592021 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-config\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.592023 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-node-pullsecrets\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.592037 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-encryption-config\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.592054 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/134347ee-9f68-45f7-b66c-dc2493eac221-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6fdjn\" (UID: \"134347ee-9f68-45f7-b66c-dc2493eac221\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.592112 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f070ed7-44cb-4a09-9f68-0efde6f58169-webhook-cert\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.593046 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-config\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.593165 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-policies\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.593357 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.593543 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-audit-policies\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.593642 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1757e629-858a-4c03-8a61-119198d1dc83-audit-dir\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.593672 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.594282 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-auth-proxy-config\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.594621 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-certificates\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.595412 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366ffb79-635d-4f44-b22f-3fd5a77bd022-config\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596119 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596177 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnp8x\" (UniqueName: \"kubernetes.io/projected/34955821-9f88-452b-b5c9-f86a93e3f427-kube-api-access-lnp8x\") pod \"migrator-59844c95c7-sv5w4\" (UID: \"34955821-9f88-452b-b5c9-f86a93e3f427\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596234 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596272 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-machine-approver-tls\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596305 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-etcd-client\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596344 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-etcd-serving-ca\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596357 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-serving-cert\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596391 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1e7ea15-da6a-42c4-b136-34291813a34c-cert\") pod \"ingress-canary-pt54n\" (UID: \"b1e7ea15-da6a-42c4-b136-34291813a34c\") " pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596636 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b356cad2-134f-4910-875f-71c38fda3cee-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.596824 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-config\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.597437 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1757e629-858a-4c03-8a61-119198d1dc83-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.597874 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4306eb36-80d5-404b-8909-dd446ee88230-config\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.598361 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc2961e-26df-420a-9e5b-8abd43b85b2a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.598662 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/366ffb79-635d-4f44-b22f-3fd5a77bd022-images\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.598935 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-client-ca\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.598990 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-audit-dir\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.599587 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600629 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.599973 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-dir\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600126 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a286474-0f76-4707-b476-0e30a49fcc32-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600682 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600271 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-image-import-ca\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600380 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0048288d-ec58-4cf8-a68a-b73b98db9d01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600603 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-config\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600724 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsq5n\" (UniqueName: \"kubernetes.io/projected/e65663eb-5c53-4a9f-81d7-6356a33dc7b7-kube-api-access-nsq5n\") pod \"downloads-7954f5f757-7gmkn\" (UID: \"e65663eb-5c53-4a9f-81d7-6356a33dc7b7\") " pod="openshift-console/downloads-7954f5f757-7gmkn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600295 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600762 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.599867 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/134347ee-9f68-45f7-b66c-dc2493eac221-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6fdjn\" (UID: \"134347ee-9f68-45f7-b66c-dc2493eac221\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600906 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-signing-cabundle\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.600952 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-trusted-ca-bundle\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601030 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e375e2a-ef39-4b07-aa49-1a498266a487-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:49 crc 
kubenswrapper[4939]: I0318 15:40:49.601090 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-config\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601089 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/512c9f47-351e-4c5a-9119-f25a3500fc6e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601135 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtlbc\" (UniqueName: \"kubernetes.io/projected/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-kube-api-access-vtlbc\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601201 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b97b6e8-80c7-4467-a4d1-9e4848ace365-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601363 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a286474-0f76-4707-b476-0e30a49fcc32-config\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601416 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-socket-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601442 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601523 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q42cj\" (UniqueName: \"kubernetes.io/projected/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-kube-api-access-q42cj\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601559 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0399b5b-fd72-481f-b94b-ee3e3ab0db6d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2chlb\" (UID: \"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601622 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-srv-cert\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601652 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4306eb36-80d5-404b-8909-dd446ee88230-trusted-ca\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601679 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm9bg\" (UniqueName: \"kubernetes.io/projected/3dc2961e-26df-420a-9e5b-8abd43b85b2a-kube-api-access-rm9bg\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601736 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-registration-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601761 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe66d52c-4339-4661-ac23-8ab2e68e22cc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601790 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dc2961e-26df-420a-9e5b-8abd43b85b2a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601814 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-signing-key\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601839 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-metrics-certs\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601848 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e375e2a-ef39-4b07-aa49-1a498266a487-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601894 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45453fff-856e-44f4-b05d-d00a91d00429-metrics-tls\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601909 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601933 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe66d52c-4339-4661-ac23-8ab2e68e22cc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.601916 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a286474-0f76-4707-b476-0e30a49fcc32-config\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602027 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b97b6e8-80c7-4467-a4d1-9e4848ace365-config\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602108 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e375e2a-ef39-4b07-aa49-1a498266a487-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602171 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-config\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602268 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-audit\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602358 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ndgp\" (UniqueName: \"kubernetes.io/projected/ef08b3ba-1294-43dd-af5b-98550ede648f-kube-api-access-2ndgp\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602405 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602485 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b97b6e8-80c7-4467-a4d1-9e4848ace365-config\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602470 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/512c9f47-351e-4c5a-9119-f25a3500fc6e-serving-cert\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602666 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l98g5\" (UniqueName: \"kubernetes.io/projected/b1e7ea15-da6a-42c4-b136-34291813a34c-kube-api-access-l98g5\") pod \"ingress-canary-pt54n\" (UID: \"b1e7ea15-da6a-42c4-b136-34291813a34c\") " pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602712 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-stats-auth\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602675 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4306eb36-80d5-404b-8909-dd446ee88230-trusted-ca\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " 
pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602809 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p8dd\" (UniqueName: \"kubernetes.io/projected/67487f25-053f-451f-bbcb-a9487d58802e-kube-api-access-2p8dd\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602845 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.602934 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxfm2\" (UniqueName: \"kubernetes.io/projected/1757e629-858a-4c03-8a61-119198d1dc83-kube-api-access-hxfm2\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603096 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603143 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ldh4\" (UniqueName: \"kubernetes.io/projected/b356cad2-134f-4910-875f-71c38fda3cee-kube-api-access-2ldh4\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603180 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-config\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603184 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603317 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-config-volume\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603359 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6afcb621-c97d-4bc2-a310-3eb43f3508b0-srv-cert\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603920 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/03bc0ce6-d425-4864-883a-fdc9d6e2a460-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dz6vg\" (UID: \"03bc0ce6-d425-4864-883a-fdc9d6e2a460\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.603973 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604010 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-oauth-config\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604050 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0048288d-ec58-4cf8-a68a-b73b98db9d01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604114 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-trusted-ca\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604141 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b97b6e8-80c7-4467-a4d1-9e4848ace365-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604702 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-audit\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604733 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0048288d-ec58-4cf8-a68a-b73b98db9d01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604907 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.604957 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-267sb\" (UniqueName: \"kubernetes.io/projected/51e9250d-312d-4e6b-9a21-bfe25d4533ff-kube-api-access-267sb\") pod \"dns-operator-744455d44c-2lc49\" (UID: \"51e9250d-312d-4e6b-9a21-bfe25d4533ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.605130 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6afcb621-c97d-4bc2-a310-3eb43f3508b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.605185 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngnsc\" (UniqueName: \"kubernetes.io/projected/b0399b5b-fd72-481f-b94b-ee3e3ab0db6d-kube-api-access-ngnsc\") pod \"package-server-manager-789f6589d5-2chlb\" (UID: \"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.605598 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.606262 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-trusted-ca\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.607148 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.607208 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/366ffb79-635d-4f44-b22f-3fd5a77bd022-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.607154 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/512c9f47-351e-4c5a-9119-f25a3500fc6e-serving-cert\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.607209 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.607676 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-serving-cert\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.607755 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4306eb36-80d5-404b-8909-dd446ee88230-serving-cert\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.607822 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e375e2a-ef39-4b07-aa49-1a498266a487-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608201 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608223 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b356cad2-134f-4910-875f-71c38fda3cee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608305 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-serving-cert\") pod \"authentication-operator-69f744f599-jffvz\" (UID: 
\"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608384 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51e9250d-312d-4e6b-9a21-bfe25d4533ff-metrics-tls\") pod \"dns-operator-744455d44c-2lc49\" (UID: \"51e9250d-312d-4e6b-9a21-bfe25d4533ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608781 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-tls\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608799 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d184fd-e8e8-43c7-8961-fb9aa266b149-serving-cert\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608849 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-etcd-client\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.608870 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-encryption-config\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.609527 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-serving-cert\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.609747 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dc2961e-26df-420a-9e5b-8abd43b85b2a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.609960 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88bed82e-aed3-4f18-9908-f94900c9e60d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pvl56\" (UID: \"88bed82e-aed3-4f18-9908-f94900c9e60d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.610603 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1757e629-858a-4c03-8a61-119198d1dc83-encryption-config\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.610670 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-machine-approver-tls\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.610729 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-etcd-client\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.613489 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.614392 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.621206 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b97b6e8-80c7-4467-a4d1-9e4848ace365-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.637487 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpsc\" (UniqueName: \"kubernetes.io/projected/604dd149-ccc8-492a-a624-fd3088ed3bab-kube-api-access-2hpsc\") pod \"oauth-openshift-558db77b4-9pp46\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.658046 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hqb\" (UniqueName: \"kubernetes.io/projected/366ffb79-635d-4f44-b22f-3fd5a77bd022-kube-api-access-f9hqb\") pod \"machine-api-operator-5694c8668f-5lh2v\" (UID: \"366ffb79-635d-4f44-b22f-3fd5a77bd022\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.678776 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hch6x\" (UniqueName: \"kubernetes.io/projected/d56ba309-f93a-4b35-8f2b-a0f7fe561fc8-kube-api-access-hch6x\") pod \"apiserver-76f77b778f-txf85\" (UID: \"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.681632 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.699064 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a286474-0f76-4707-b476-0e30a49fcc32-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gjhzf\" (UID: \"6a286474-0f76-4707-b476-0e30a49fcc32\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706286 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706401 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef08b3ba-1294-43dd-af5b-98550ede648f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706424 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cpzz\" (UniqueName: \"kubernetes.io/projected/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-kube-api-access-8cpzz\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706447 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-serving-cert\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706469 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-trusted-ca\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706486 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-default-certificate\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706518 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-secret-volume\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706532 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt7dh\" (UniqueName: \"kubernetes.io/projected/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-kube-api-access-dt7dh\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706549 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54ea90ec-9918-4a13-ae59-b986b0f06c66-images\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706572 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54ea90ec-9918-4a13-ae59-b986b0f06c66-proxy-tls\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706587 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45453fff-856e-44f4-b05d-d00a91d00429-config-volume\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706601 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-service-ca\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706621 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f070ed7-44cb-4a09-9f68-0efde6f58169-webhook-cert\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706639 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnp8x\" (UniqueName: \"kubernetes.io/projected/34955821-9f88-452b-b5c9-f86a93e3f427-kube-api-access-lnp8x\") pod \"migrator-59844c95c7-sv5w4\" (UID: \"34955821-9f88-452b-b5c9-f86a93e3f427\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706664 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1e7ea15-da6a-42c4-b136-34291813a34c-cert\") pod \"ingress-canary-pt54n\" (UID: \"b1e7ea15-da6a-42c4-b136-34291813a34c\") " pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706688 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-signing-cabundle\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706703 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-trusted-ca-bundle\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706720 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-config\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706735 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtlbc\" (UniqueName: \"kubernetes.io/projected/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-kube-api-access-vtlbc\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706755 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-socket-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706771 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706794 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0399b5b-fd72-481f-b94b-ee3e3ab0db6d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2chlb\" (UID: \"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706809 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-srv-cert\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706828 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-registration-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: 
I0318 15:40:49.706842 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe66d52c-4339-4661-ac23-8ab2e68e22cc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706859 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-signing-key\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706873 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-metrics-certs\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706891 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45453fff-856e-44f4-b05d-d00a91d00429-metrics-tls\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706907 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe66d52c-4339-4661-ac23-8ab2e68e22cc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706922 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ndgp\" (UniqueName: \"kubernetes.io/projected/ef08b3ba-1294-43dd-af5b-98550ede648f-kube-api-access-2ndgp\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706939 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l98g5\" (UniqueName: \"kubernetes.io/projected/b1e7ea15-da6a-42c4-b136-34291813a34c-kube-api-access-l98g5\") pod \"ingress-canary-pt54n\" (UID: \"b1e7ea15-da6a-42c4-b136-34291813a34c\") " pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706954 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-stats-auth\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.706972 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p8dd\" (UniqueName: \"kubernetes.io/projected/67487f25-053f-451f-bbcb-a9487d58802e-kube-api-access-2p8dd\") pod 
\"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707008 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707045 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707060 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-config-volume\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707076 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6afcb621-c97d-4bc2-a310-3eb43f3508b0-srv-cert\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707097 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/03bc0ce6-d425-4864-883a-fdc9d6e2a460-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dz6vg\" (UID: \"03bc0ce6-d425-4864-883a-fdc9d6e2a460\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707112 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-oauth-config\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707131 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6afcb621-c97d-4bc2-a310-3eb43f3508b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707145 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngnsc\" (UniqueName: \"kubernetes.io/projected/b0399b5b-fd72-481f-b94b-ee3e3ab0db6d-kube-api-access-ngnsc\") pod \"package-server-manager-789f6589d5-2chlb\" (UID: \"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707160 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45eb9355-906a-43cb-880b-b7790e7ef4f2-config\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707176 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g467\" (UniqueName: \"kubernetes.io/projected/8f070ed7-44cb-4a09-9f68-0efde6f58169-kube-api-access-5g467\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707203 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z2cx\" (UniqueName: \"kubernetes.io/projected/45eb9355-906a-43cb-880b-b7790e7ef4f2-kube-api-access-5z2cx\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707219 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-ca\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707240 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-service-ca\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707255 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkl2f\" (UniqueName: \"kubernetes.io/projected/45453fff-856e-44f4-b05d-d00a91d00429-kube-api-access-xkl2f\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707271 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-plugins-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707289 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7s4\" (UniqueName: \"kubernetes.io/projected/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-kube-api-access-ml7s4\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707305 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzm4s\" (UniqueName: 
\"kubernetes.io/projected/03bc0ce6-d425-4864-883a-fdc9d6e2a460-kube-api-access-mzm4s\") pod \"multus-admission-controller-857f4d67dd-dz6vg\" (UID: \"03bc0ce6-d425-4864-883a-fdc9d6e2a460\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707321 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-config\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707338 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-metrics-tls\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707356 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707379 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlfv\" (UniqueName: \"kubernetes.io/projected/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-kube-api-access-7tlfv\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707395 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707410 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54ea90ec-9918-4a13-ae59-b986b0f06c66-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707426 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665daba4-aaad-4285-9bfb-c58983b35d2a-service-ca-bundle\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707441 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707457 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68wbh\" (UniqueName: \"kubernetes.io/projected/2adddc7a-6b85-45dc-abf2-611a810581ad-kube-api-access-68wbh\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707473 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef08b3ba-1294-43dd-af5b-98550ede648f-proxy-tls\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707495 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-serving-cert\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707524 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67487f25-053f-451f-bbcb-a9487d58802e-certs\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707538 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe66d52c-4339-4661-ac23-8ab2e68e22cc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707555 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2599\" (UniqueName: \"kubernetes.io/projected/665daba4-aaad-4285-9bfb-c58983b35d2a-kube-api-access-n2599\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707574 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f070ed7-44cb-4a09-9f68-0efde6f58169-apiservice-cert\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707592 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67487f25-053f-451f-bbcb-a9487d58802e-node-bootstrap-token\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707621 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k599h\" (UniqueName: \"kubernetes.io/projected/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-kube-api-access-k599h\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707643 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-mountpoint-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707664 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gttnn\" (UniqueName: \"kubernetes.io/projected/6afcb621-c97d-4bc2-a310-3eb43f3508b0-kube-api-access-gttnn\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707686 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f070ed7-44cb-4a09-9f68-0efde6f58169-tmpfs\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707701 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-oauth-serving-cert\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707733 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gqv\" (UniqueName: \"kubernetes.io/projected/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-kube-api-access-c8gqv\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707749 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-csi-data-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707776 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prvcb\" (UniqueName: \"kubernetes.io/projected/54ea90ec-9918-4a13-ae59-b986b0f06c66-kube-api-access-prvcb\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707792 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-client\") pod \"etcd-operator-b45778765-9255f\" (UID: 
\"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707808 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdq24\" (UniqueName: \"kubernetes.io/projected/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-kube-api-access-cdq24\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707823 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45eb9355-906a-43cb-880b-b7790e7ef4f2-serving-cert\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.707990 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-socket-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.708388 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:49 crc kubenswrapper[4939]: E0318 15:40:49.710355 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.21031322 +0000 UTC m=+214.809501011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.710700 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45eb9355-906a-43cb-880b-b7790e7ef4f2-serving-cert\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.711302 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-config-volume\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.711414 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0399b5b-fd72-481f-b94b-ee3e3ab0db6d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2chlb\" (UID: \"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.712750 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-default-certificate\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.713450 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45453fff-856e-44f4-b05d-d00a91d00429-config-volume\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.713890 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665daba4-aaad-4285-9bfb-c58983b35d2a-service-ca-bundle\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.714058 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-trusted-ca\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.714866 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ef08b3ba-1294-43dd-af5b-98550ede648f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.715775 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6afcb621-c97d-4bc2-a310-3eb43f3508b0-srv-cert\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.715785 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54ea90ec-9918-4a13-ae59-b986b0f06c66-proxy-tls\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.716075 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-config\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.716967 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-srv-cert\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.717011 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.717204 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-signing-cabundle\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.718532 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-trusted-ca-bundle\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.718646 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-registration-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.718725 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-mountpoint-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.719274 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.719582 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef08b3ba-1294-43dd-af5b-98550ede648f-proxy-tls\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.719602 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjtv\" (UniqueName: \"kubernetes.io/projected/88bed82e-aed3-4f18-9908-f94900c9e60d-kube-api-access-7mjtv\") pod \"cluster-samples-operator-665b6dd947-pvl56\" (UID: \"88bed82e-aed3-4f18-9908-f94900c9e60d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.719919 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-metrics-certs\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.720056 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f070ed7-44cb-4a09-9f68-0efde6f58169-tmpfs\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.720706 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-service-ca\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.721142 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-oauth-serving-cert\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.721362 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-csi-data-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 
crc kubenswrapper[4939]: I0318 15:40:49.721431 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-secret-volume\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.721886 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1e7ea15-da6a-42c4-b136-34291813a34c-cert\") pod \"ingress-canary-pt54n\" (UID: \"b1e7ea15-da6a-42c4-b136-34291813a34c\") " pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.722051 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2adddc7a-6b85-45dc-abf2-611a810581ad-plugins-dir\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.722434 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f070ed7-44cb-4a09-9f68-0efde6f58169-webhook-cert\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.722496 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54ea90ec-9918-4a13-ae59-b986b0f06c66-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.722950 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe66d52c-4339-4661-ac23-8ab2e68e22cc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.723263 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/665daba4-aaad-4285-9bfb-c58983b35d2a-stats-auth\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.723644 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-client\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.723683 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-config\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.723710 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-serving-cert\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.724023 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45453fff-856e-44f4-b05d-d00a91d00429-metrics-tls\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.724360 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe66d52c-4339-4661-ac23-8ab2e68e22cc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.724778 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-serving-cert\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.725053 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45eb9355-906a-43cb-880b-b7790e7ef4f2-config\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.725236 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.725787 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-signing-key\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.725817 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.725938 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.726186 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-etcd-ca\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.726623 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-service-ca\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.726650 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54ea90ec-9918-4a13-ae59-b986b0f06c66-images\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.726683 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67487f25-053f-451f-bbcb-a9487d58802e-certs\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.726761 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-metrics-tls\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.727808 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f070ed7-44cb-4a09-9f68-0efde6f58169-apiservice-cert\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.728153 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6afcb621-c97d-4bc2-a310-3eb43f3508b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.728857 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67487f25-053f-451f-bbcb-a9487d58802e-node-bootstrap-token\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.732080 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/03bc0ce6-d425-4864-883a-fdc9d6e2a460-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dz6vg\" (UID: \"03bc0ce6-d425-4864-883a-fdc9d6e2a460\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.732372 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-oauth-config\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.735923 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-bound-sa-token\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.763691 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqfbb\" (UniqueName: \"kubernetes.io/projected/512c9f47-351e-4c5a-9119-f25a3500fc6e-kube-api-access-vqfbb\") pod \"openshift-config-operator-7777fb866f-jd2sf\" (UID: \"512c9f47-351e-4c5a-9119-f25a3500fc6e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.779963 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dpx\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-kube-api-access-d7dpx\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.802480 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lwss\" (UniqueName: \"kubernetes.io/projected/134347ee-9f68-45f7-b66c-dc2493eac221-kube-api-access-9lwss\") pod \"control-plane-machine-set-operator-78cbb6b69f-6fdjn\" (UID: \"134347ee-9f68-45f7-b66c-dc2493eac221\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.812717 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:49 crc kubenswrapper[4939]: E0318 15:40:49.813140 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.313121699 +0000 UTC m=+214.912309390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.818176 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2kq\" (UniqueName: \"kubernetes.io/projected/4306eb36-80d5-404b-8909-dd446ee88230-kube-api-access-4q2kq\") pod \"console-operator-58897d9998-dcfh2\" (UID: \"4306eb36-80d5-404b-8909-dd446ee88230\") " pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.833932 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.836978 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqtr\" (UniqueName: \"kubernetes.io/projected/7e375e2a-ef39-4b07-aa49-1a498266a487-kube-api-access-9jqtr\") pod \"openshift-controller-manager-operator-756b6f6bc6-957qk\" (UID: \"7e375e2a-ef39-4b07-aa49-1a498266a487\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.853524 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b356cad2-134f-4910-875f-71c38fda3cee-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.866482 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9pp46"] Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.877172 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tbr6\" (UniqueName: \"kubernetes.io/projected/330c585e-3a67-4502-b800-7401df959334-kube-api-access-9tbr6\") pod \"auto-csr-approver-29564140-vqqr2\" (UID: \"330c585e-3a67-4502-b800-7401df959334\") " pod="openshift-infra/auto-csr-approver-29564140-vqqr2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.890977 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgw4x\" (UniqueName: \"kubernetes.io/projected/2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140-kube-api-access-pgw4x\") pod \"machine-approver-56656f9798-f7r46\" (UID: \"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" Mar 18 15:40:49 crc kubenswrapper[4939]: W0318 15:40:49.891661 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod604dd149_ccc8_492a_a624_fd3088ed3bab.slice/crio-51a71d8d9fb1736539716d873b82aa5675de17794f330fd01e41bd19b93c373f WatchSource:0}: Error finding container 51a71d8d9fb1736539716d873b82aa5675de17794f330fd01e41bd19b93c373f: Status 404 returned error can't find the container with id 51a71d8d9fb1736539716d873b82aa5675de17794f330fd01e41bd19b93c373f 
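The failures repeating through this stretch of the log all have one cause: the kubelet does not yet have kubevirt.io.hostpath-provisioner in its list of registered CSI drivers, so every MountDevice attempt for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 (for image-registry-697d97f7c8-t62m5) and every TearDown attempt (for pod 8f668bae-612b-4b75-9490-919e737c6a3b) fails immediately and is re-queued with a 500ms backoff. Registration can only happen after the csi-hostpathplugin-5fdwn pod, whose own volume mounts are interleaved above, starts and announces the driver over the kubelet's plugin-registration socket. As a minimal sketch of how one could watch for that registration from outside the node with client-go (the node name "crc" and the driver name are taken from this log; the kubeconfig path and polling loop are illustrative assumptions, not anything the cluster ships):

    // csiwait: hypothetical helper that polls the node's CSINode object until a
    // given CSI driver name appears in spec.drivers, i.e. until the kubelet has
    // registered it and the volume retries in this log can succeed.
    package main

    import (
    	"context"
    	"fmt"
    	"os"
    	"path/filepath"
    	"time"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Assumption: a kubeconfig at the usual location; inside a pod one
    	// would use rest.InClusterConfig() instead.
    	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
    	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}

    	const node, driver = "crc", "kubevirt.io.hostpath-provisioner"
    	for {
    		csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), node, metav1.GetOptions{})
    		if err == nil {
    			for _, d := range csiNode.Spec.Drivers {
    				if d.Name == driver {
    					fmt.Println("driver registered; volume operations should stop failing")
    					return
    				}
    			}
    		}
    		// Mirror the kubelet's own 500ms durationBeforeRetry seen above.
    		time.Sleep(500 * time.Millisecond)
    	}
    }

The same information is visible interactively with kubectl get csinode crc -o yaml.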
Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.912417 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmz4v\" (UniqueName: \"kubernetes.io/projected/f8d184fd-e8e8-43c7-8961-fb9aa266b149-kube-api-access-cmz4v\") pod \"controller-manager-879f6c89f-z46b4\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.913183 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:49 crc kubenswrapper[4939]: E0318 15:40:49.913643 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.413631451 +0000 UTC m=+215.012819072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.923201 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.933463 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf"] Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.936960 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsq5n\" (UniqueName: \"kubernetes.io/projected/e65663eb-5c53-4a9f-81d7-6356a33dc7b7-kube-api-access-nsq5n\") pod \"downloads-7954f5f757-7gmkn\" (UID: \"e65663eb-5c53-4a9f-81d7-6356a33dc7b7\") " pod="openshift-console/downloads-7954f5f757-7gmkn" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.937215 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:49 crc kubenswrapper[4939]: W0318 15:40:49.949085 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a286474_0f76_4707_b476_0e30a49fcc32.slice/crio-61fae04547dadc48b45e3f13fc61a67dd3e57d6477818c6e0ab789b4862d4d41 WatchSource:0}: Error finding container 61fae04547dadc48b45e3f13fc61a67dd3e57d6477818c6e0ab789b4862d4d41: Status 404 returned error can't find the container with id 61fae04547dadc48b45e3f13fc61a67dd3e57d6477818c6e0ab789b4862d4d41 Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.961027 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.975048 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b97b6e8-80c7-4467-a4d1-9e4848ace365-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sjcgk\" (UID: \"1b97b6e8-80c7-4467-a4d1-9e4848ace365\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.993596 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm9bg\" (UniqueName: \"kubernetes.io/projected/3dc2961e-26df-420a-9e5b-8abd43b85b2a-kube-api-access-rm9bg\") pod \"openshift-apiserver-operator-796bbdcf4f-7mf68\" (UID: \"3dc2961e-26df-420a-9e5b-8abd43b85b2a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" Mar 18 15:40:49 crc kubenswrapper[4939]: I0318 15:40:49.996489 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.006729 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.009264 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5lh2v"] Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.011085 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.013692 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42cj\" (UniqueName: \"kubernetes.io/projected/b6b327b7-cf92-4bf3-9047-6c36ecf787b6-kube-api-access-q42cj\") pod \"authentication-operator-69f744f599-jffvz\" (UID: \"b6b327b7-cf92-4bf3-9047-6c36ecf787b6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.014531 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.014870 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.514853514 +0000 UTC m=+215.114041305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.018404 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7gmkn" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.032059 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.033781 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxfm2\" (UniqueName: \"kubernetes.io/projected/1757e629-858a-4c03-8a61-119198d1dc83-kube-api-access-hxfm2\") pod \"apiserver-7bbb656c7d-dbrdq\" (UID: \"1757e629-858a-4c03-8a61-119198d1dc83\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.055907 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.064936 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ldh4\" (UniqueName: \"kubernetes.io/projected/b356cad2-134f-4910-875f-71c38fda3cee-kube-api-access-2ldh4\") pod \"cluster-image-registry-operator-dc59b4c8b-dkkbz\" (UID: \"b356cad2-134f-4910-875f-71c38fda3cee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.074117 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-267sb\" (UniqueName: \"kubernetes.io/projected/51e9250d-312d-4e6b-9a21-bfe25d4533ff-kube-api-access-267sb\") pod \"dns-operator-744455d44c-2lc49\" (UID: \"51e9250d-312d-4e6b-9a21-bfe25d4533ff\") " pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.083884 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.111139 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.119141 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.120022 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.620006471 +0000 UTC m=+215.219194092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.124448 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cpzz\" (UniqueName: \"kubernetes.io/projected/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-kube-api-access-8cpzz\") pod \"console-f9d7485db-nqdwc\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.145389 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnp8x\" (UniqueName: \"kubernetes.io/projected/34955821-9f88-452b-b5c9-f86a93e3f427-kube-api-access-lnp8x\") pod \"migrator-59844c95c7-sv5w4\" (UID: \"34955821-9f88-452b-b5c9-f86a93e3f427\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.151287 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.154163 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt7dh\" (UniqueName: \"kubernetes.io/projected/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-kube-api-access-dt7dh\") pod \"collect-profiles-29564130-nt62d\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.164931 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.167469 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-txf85"] Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.173820 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkl2f\" (UniqueName: \"kubernetes.io/projected/45453fff-856e-44f4-b05d-d00a91d00429-kube-api-access-xkl2f\") pod \"dns-default-49228\" (UID: \"45453fff-856e-44f4-b05d-d00a91d00429\") " pod="openshift-dns/dns-default-49228" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.186757 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.195127 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtlbc\" (UniqueName: \"kubernetes.io/projected/672c98cc-4cc7-4827-b7e2-be9dee6ba3de-kube-api-access-vtlbc\") pod \"olm-operator-6b444d44fb-2x4rk\" (UID: \"672c98cc-4cc7-4827-b7e2-be9dee6ba3de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.214453 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.225383 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.232382 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.233695 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.733674535 +0000 UTC m=+215.332862156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.235315 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf"] Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.243015 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68wbh\" (UniqueName: \"kubernetes.io/projected/2adddc7a-6b85-45dc-abf2-611a810581ad-kube-api-access-68wbh\") pod \"csi-hostpathplugin-5fdwn\" (UID: \"2adddc7a-6b85-45dc-abf2-611a810581ad\") " pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.254425 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.268322 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ndgp\" (UniqueName: \"kubernetes.io/projected/ef08b3ba-1294-43dd-af5b-98550ede648f-kube-api-access-2ndgp\") pod \"machine-config-controller-84d6567774-xw29t\" (UID: \"ef08b3ba-1294-43dd-af5b-98550ede648f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.272315 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dcfh2"] Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.274726 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.274736 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.283229 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l98g5\" (UniqueName: \"kubernetes.io/projected/b1e7ea15-da6a-42c4-b136-34291813a34c-kube-api-access-l98g5\") pod \"ingress-canary-pt54n\" (UID: \"b1e7ea15-da6a-42c4-b136-34291813a34c\") " pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.288624 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.289852 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" event={"ID":"512c9f47-351e-4c5a-9119-f25a3500fc6e","Type":"ContainerStarted","Data":"b00d1a643fe532a0055db2cd20d7284659010737b7dd308c34f2ae57a2b40601"} Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.294913 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlfv\" (UniqueName: \"kubernetes.io/projected/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-kube-api-access-7tlfv\") pod \"marketplace-operator-79b997595-qv8l5\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.298684 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-49228" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.301130 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" event={"ID":"6a286474-0f76-4707-b476-0e30a49fcc32","Type":"ContainerStarted","Data":"61fae04547dadc48b45e3f13fc61a67dd3e57d6477818c6e0ab789b4862d4d41"} Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.303921 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gttnn\" (UniqueName: \"kubernetes.io/projected/6afcb621-c97d-4bc2-a310-3eb43f3508b0-kube-api-access-gttnn\") pod \"catalog-operator-68c6474976-8zp5q\" (UID: \"6afcb621-c97d-4bc2-a310-3eb43f3508b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.307568 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.312753 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pt54n" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.317668 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" event={"ID":"604dd149-ccc8-492a-a624-fd3088ed3bab","Type":"ContainerStarted","Data":"51a71d8d9fb1736539716d873b82aa5675de17794f330fd01e41bd19b93c373f"} Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.323011 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txf85" event={"ID":"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8","Type":"ContainerStarted","Data":"2f3cdb8069e5058fd419af445084bc26833287e5e534ebf54d302be2dea79342"} Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.329139 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.329642 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:40:50.829618605 +0000 UTC m=+215.428806226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.344939 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8gqv\" (UniqueName: \"kubernetes.io/projected/3e7741bb-cf1b-422c-8cd6-2c8b41d3d831-kube-api-access-c8gqv\") pod \"service-ca-9c57cc56f-v4pxp\" (UID: \"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831\") " pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.346319 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prvcb\" (UniqueName: \"kubernetes.io/projected/54ea90ec-9918-4a13-ae59-b986b0f06c66-kube-api-access-prvcb\") pod \"machine-config-operator-74547568cd-z4fwm\" (UID: \"54ea90ec-9918-4a13-ae59-b986b0f06c66\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.350536 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" event={"ID":"366ffb79-635d-4f44-b22f-3fd5a77bd022","Type":"ContainerStarted","Data":"cae2005bab7583e88fa48b96b140f54165e93fb821abfb28888791e4123c9d62"} Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.376140 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdq24\" (UniqueName: \"kubernetes.io/projected/da29fa7d-f88e-4b1c-8aeb-e8f4116798ae-kube-api-access-cdq24\") pod \"kube-storage-version-migrator-operator-b67b599dd-fj57c\" (UID: \"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.381756 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.382908 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7s4\" (UniqueName: \"kubernetes.io/projected/d8ef3924-86f8-4cbf-90f8-c2d364efa0b6-kube-api-access-ml7s4\") pod \"ingress-operator-5b745b69d9-nw2wt\" (UID: \"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.397753 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzm4s\" (UniqueName: \"kubernetes.io/projected/03bc0ce6-d425-4864-883a-fdc9d6e2a460-kube-api-access-mzm4s\") pod \"multus-admission-controller-857f4d67dd-dz6vg\" (UID: \"03bc0ce6-d425-4864-883a-fdc9d6e2a460\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.411537 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p8dd\" (UniqueName: \"kubernetes.io/projected/67487f25-053f-451f-bbcb-a9487d58802e-kube-api-access-2p8dd\") pod \"machine-config-server-7rvrq\" (UID: \"67487f25-053f-451f-bbcb-a9487d58802e\") " pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.427782 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn"] Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.431166 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.432648 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:50.93262621 +0000 UTC m=+215.531813901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.433137 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe66d52c-4339-4661-ac23-8ab2e68e22cc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mw5nn\" (UID: \"fe66d52c-4339-4661-ac23-8ab2e68e22cc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.448157 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.456047 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2599\" (UniqueName: \"kubernetes.io/projected/665daba4-aaad-4285-9bfb-c58983b35d2a-kube-api-access-n2599\") pod \"router-default-5444994796-9qhf4\" (UID: \"665daba4-aaad-4285-9bfb-c58983b35d2a\") " pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.475164 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.482018 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k599h\" (UniqueName: \"kubernetes.io/projected/3efaa2ef-c8d6-4ea3-bed8-407478c48db8-kube-api-access-k599h\") pod \"etcd-operator-b45778765-9255f\" (UID: \"3efaa2ef-c8d6-4ea3-bed8-407478c48db8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.489170 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.517603 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngnsc\" (UniqueName: \"kubernetes.io/projected/b0399b5b-fd72-481f-b94b-ee3e3ab0db6d-kube-api-access-ngnsc\") pod \"package-server-manager-789f6589d5-2chlb\" (UID: \"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.532322 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.532763 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.032731861 +0000 UTC m=+215.631919482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.535050 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z2cx\" (UniqueName: \"kubernetes.io/projected/45eb9355-906a-43cb-880b-b7790e7ef4f2-kube-api-access-5z2cx\") pod \"service-ca-operator-777779d784-vttdd\" (UID: \"45eb9355-906a-43cb-880b-b7790e7ef4f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.542480 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g467\" (UniqueName: \"kubernetes.io/projected/8f070ed7-44cb-4a09-9f68-0efde6f58169-kube-api-access-5g467\") pod \"packageserver-d55dfcdfc-nmbcl\" (UID: \"8f070ed7-44cb-4a09-9f68-0efde6f58169\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.544187 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.553327 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.565171 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.584061 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.592323 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7rvrq" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.634990 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.635341 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.135324474 +0000 UTC m=+215.734512095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.674870 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.687755 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.692483 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.698787 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.705156 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.736634 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.736845 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.236818325 +0000 UTC m=+215.836005946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.737013 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.737398 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:40:51.237391552 +0000 UTC m=+215.836579173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.758268 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.834309 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.837621 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.837766 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.337743619 +0000 UTC m=+215.936931240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.837798 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.838140 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.338129191 +0000 UTC m=+215.937316812 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:50 crc kubenswrapper[4939]: I0318 15:40:50.939034 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:50 crc kubenswrapper[4939]: E0318 15:40:50.939448 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.439432696 +0000 UTC m=+216.038620317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.041844 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.042287 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.542270426 +0000 UTC m=+216.141458047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.145742 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.146892 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.646872867 +0000 UTC m=+216.246060488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.228452 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk"] Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.230322 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z46b4"] Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.232210 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56"] Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.248694 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.249059 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.749043248 +0000 UTC m=+216.348230869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.350117 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.350662 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.850640552 +0000 UTC m=+216.449828173 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.356357 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" event={"ID":"4306eb36-80d5-404b-8909-dd446ee88230","Type":"ContainerStarted","Data":"c5bc8946e2b0fbc6ecfced6a61d40545f8b263b63fa8563ae57243f895348efb"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.356409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" event={"ID":"4306eb36-80d5-404b-8909-dd446ee88230","Type":"ContainerStarted","Data":"2239213c6c6a7cd0dc669e3841448d7202ffa180eb48f973efa68a0df7eebeda"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.357743 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.367131 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" event={"ID":"604dd149-ccc8-492a-a624-fd3088ed3bab","Type":"ContainerStarted","Data":"b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.368233 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.399855 4939 patch_prober.go:28] interesting pod/console-operator-58897d9998-dcfh2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.400276 4939 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" podUID="4306eb36-80d5-404b-8909-dd446ee88230" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.420458 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" event={"ID":"134347ee-9f68-45f7-b66c-dc2493eac221","Type":"ContainerStarted","Data":"fdf2f78b5ff3fcdbd50fcfbdde80ab8a959fa4b62567bd16208da1538238e33b"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.420660 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" event={"ID":"134347ee-9f68-45f7-b66c-dc2493eac221","Type":"ContainerStarted","Data":"c488fa014978d1a4d4187690e4779fdd32ba1172137d0f9fb72d26a149d4ec2a"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.426308 4939 generic.go:334] "Generic (PLEG): container finished" podID="d56ba309-f93a-4b35-8f2b-a0f7fe561fc8" containerID="8f3509fec575665355e730f8182fc2dd64496d2d07d92d244b633f445e7fcee2" exitCode=0 Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.426409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txf85" event={"ID":"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8","Type":"ContainerDied","Data":"8f3509fec575665355e730f8182fc2dd64496d2d07d92d244b633f445e7fcee2"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.433107 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" event={"ID":"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140","Type":"ContainerStarted","Data":"63e9443548152ac45ac1f127715f20cfcc072f8459c3ade6800297d53403df53"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.433151 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" event={"ID":"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140","Type":"ContainerStarted","Data":"996df74bf65075865c0e0a9290915e64d3a50797b302aa536339e3fa1a7afb47"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.434030 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" event={"ID":"6a286474-0f76-4707-b476-0e30a49fcc32","Type":"ContainerStarted","Data":"cb2b15408c86e4e8adb5172d304e4048db681a9da4fb8f2f6ad0b514b026016b"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.442408 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9qhf4" event={"ID":"665daba4-aaad-4285-9bfb-c58983b35d2a","Type":"ContainerStarted","Data":"e6f5fa18611ae86fbb9842abdd00cc0f0337ba71efe0b74ff04d24941cde474e"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.442448 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9qhf4" event={"ID":"665daba4-aaad-4285-9bfb-c58983b35d2a","Type":"ContainerStarted","Data":"65f4d7a7e356a2d766585b3251c2b4e8e05e29dcf2236c436220fd79e82fb9a9"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.444201 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7rvrq" 
event={"ID":"67487f25-053f-451f-bbcb-a9487d58802e","Type":"ContainerStarted","Data":"292027dd5f15f3a7550db1b2f52c0b12e5d51fc064611d2154556cabf3e4a57b"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.445788 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" event={"ID":"366ffb79-635d-4f44-b22f-3fd5a77bd022","Type":"ContainerStarted","Data":"23730f0ccd1dcd04b717cdffbdb92875cdfb68003e0e2de7890f674a010bd3f2"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.445816 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" event={"ID":"366ffb79-635d-4f44-b22f-3fd5a77bd022","Type":"ContainerStarted","Data":"f6567206d3e12e9810cde1f4e88addc0b30a62e47eab5dca2ee6e0105c5df3f8"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.449253 4939 generic.go:334] "Generic (PLEG): container finished" podID="512c9f47-351e-4c5a-9119-f25a3500fc6e" containerID="f332c59bc699478dfe887bcca02d837a70c0079e1b65d024fb0d6a90ef2d0775" exitCode=0 Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.450305 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" event={"ID":"512c9f47-351e-4c5a-9119-f25a3500fc6e","Type":"ContainerDied","Data":"f332c59bc699478dfe887bcca02d837a70c0079e1b65d024fb0d6a90ef2d0775"} Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.452871 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.454103 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:51.954085769 +0000 UTC m=+216.553273380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.557249 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.558537 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.058477824 +0000 UTC m=+216.657665445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.587439 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" podStartSLOduration=155.587412463 podStartE2EDuration="2m35.587412463s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:51.557092214 +0000 UTC m=+216.156279835" watchObservedRunningTime="2026-03-18 15:40:51.587412463 +0000 UTC m=+216.186600084" Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.639869 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vqqr2"] Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.660222 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.660832 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.16080997 +0000 UTC m=+216.759997771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.674815 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk"] Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.702240 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.759549 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.761513 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.761756 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.261730964 +0000 UTC m=+216.860918585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.761854 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.762161 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.262150986 +0000 UTC m=+216.861338607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.841559 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.841624 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 15:40:51 crc kubenswrapper[4939]: W0318 15:40:51.842860 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod330c585e_3a67_4502_b800_7401df959334.slice/crio-519e39fed9b4a66764ac9c2229df0502d2742b3cbdb487c5a10a76cdfffaa86d WatchSource:0}: Error finding container 519e39fed9b4a66764ac9c2229df0502d2742b3cbdb487c5a10a76cdfffaa86d: Status 404 returned error can't find the container with id 519e39fed9b4a66764ac9c2229df0502d2742b3cbdb487c5a10a76cdfffaa86d Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.845227 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:40:51 crc kubenswrapper[4939]: W0318 15:40:51.845592 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b97b6e8_80c7_4467_a4d1_9e4848ace365.slice/crio-e90c766af97d7448fc0c4567ba1281c0fbb92ab61b01c12cba7216a24ffbbc37 WatchSource:0}: Error finding container e90c766af97d7448fc0c4567ba1281c0fbb92ab61b01c12cba7216a24ffbbc37: Status 404 returned error can't find the container with id e90c766af97d7448fc0c4567ba1281c0fbb92ab61b01c12cba7216a24ffbbc37 Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.864225 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.865165 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.365141671 +0000 UTC m=+216.964329292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:51 crc kubenswrapper[4939]: I0318 15:40:51.967076 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:51 crc kubenswrapper[4939]: E0318 15:40:51.974395 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.474352035 +0000 UTC m=+217.073539656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.079315 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.080276 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.580251264 +0000 UTC m=+217.179438875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.176485 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9qhf4" podStartSLOduration=156.176448062 podStartE2EDuration="2m36.176448062s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.172703013 +0000 UTC m=+216.771890634" watchObservedRunningTime="2026-03-18 15:40:52.176448062 +0000 UTC m=+216.775635683" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.181458 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.181800 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.681786876 +0000 UTC m=+217.280974497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.210470 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" podStartSLOduration=156.210454447 podStartE2EDuration="2m36.210454447s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.207847241 +0000 UTC m=+216.807034862" watchObservedRunningTime="2026-03-18 15:40:52.210454447 +0000 UTC m=+216.809642058" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.256569 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6fdjn" podStartSLOduration=156.256547773 podStartE2EDuration="2m36.256547773s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.255633136 +0000 UTC m=+216.854820767" watchObservedRunningTime="2026-03-18 15:40:52.256547773 +0000 UTC m=+216.855735394" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.282275 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.282600 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.782585287 +0000 UTC m=+217.381772908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.308610 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gjhzf" podStartSLOduration=156.308589941 podStartE2EDuration="2m36.308589941s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.295910153 +0000 UTC m=+216.895097784" watchObservedRunningTime="2026-03-18 15:40:52.308589941 +0000 UTC m=+216.907777562" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.311196 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.328730 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.333103 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7gmkn"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.345553 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.350456 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.354015 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.356873 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-49228"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.380365 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4"] Mar 18 15:40:52 crc kubenswrapper[4939]: W0318 15:40:52.382636 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc2961e_26df_420a_9e5b_8abd43b85b2a.slice/crio-66ba67ae4910b8efeff357de159592e6f403f399beded29ba3d9e488f4762ff2 WatchSource:0}: Error finding container 66ba67ae4910b8efeff357de159592e6f403f399beded29ba3d9e488f4762ff2: Status 404 returned error can't find the container with id 66ba67ae4910b8efeff357de159592e6f403f399beded29ba3d9e488f4762ff2 Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.383079 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.383326 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.883316925 +0000 UTC m=+217.482504536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.393311 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.449780 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v4pxp"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.471782 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dz6vg"] Mar 18 15:40:52 crc kubenswrapper[4939]: W0318 15:40:52.479358 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2adddc7a_6b85_45dc_abf2_611a810581ad.slice/crio-63bb66f925eb384e74589c6b34c070988713d9f86a076bb02dcccfb18bfb2b13 WatchSource:0}: Error finding container 63bb66f925eb384e74589c6b34c070988713d9f86a076bb02dcccfb18bfb2b13: Status 404 returned error can't find the container with id 63bb66f925eb384e74589c6b34c070988713d9f86a076bb02dcccfb18bfb2b13 Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.489559 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.490406 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:52.990386378 +0000 UTC m=+217.589573999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.495491 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" event={"ID":"b356cad2-134f-4910-875f-71c38fda3cee","Type":"ContainerStarted","Data":"90249adfd1e82f5850e3b60d3862f3608f15da72fede112e7ae5cb6667e0977f"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.497366 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5fdwn"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.509959 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2lc49"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.520410 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vttdd"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.529625 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffvz"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.530218 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7gmkn" event={"ID":"e65663eb-5c53-4a9f-81d7-6356a33dc7b7","Type":"ContainerStarted","Data":"d01c90345cd2687dd9bca85bec6b47fcc4c99e0b54b994512dbb8b25f107259a"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.534472 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.536658 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" podStartSLOduration=156.536629028 podStartE2EDuration="2m36.536629028s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.450836142 +0000 UTC m=+217.050023763" watchObservedRunningTime="2026-03-18 15:40:52.536629028 +0000 UTC m=+217.135816649" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.546681 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" event={"ID":"3dc2961e-26df-420a-9e5b-8abd43b85b2a","Type":"ContainerStarted","Data":"66ba67ae4910b8efeff357de159592e6f403f399beded29ba3d9e488f4762ff2"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.547406 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.550723 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" event={"ID":"34955821-9f88-452b-b5c9-f86a93e3f427","Type":"ContainerStarted","Data":"0751156e9e09f164f3c2fce682e6cfb4cdec8b1b040eac74ad0c2bdfd5292580"} Mar 18 
15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.551718 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5lh2v" podStartSLOduration=156.551703214 podStartE2EDuration="2m36.551703214s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.479968726 +0000 UTC m=+217.079156367" watchObservedRunningTime="2026-03-18 15:40:52.551703214 +0000 UTC m=+217.150890835" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.562892 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" event={"ID":"512c9f47-351e-4c5a-9119-f25a3500fc6e","Type":"ContainerStarted","Data":"acb43c7601abfaf3c3c9aff8eda8a4377907896d6e57e494cfd11ea2027a12ba"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.562955 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.564396 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" event={"ID":"54ea90ec-9918-4a13-ae59-b986b0f06c66","Type":"ContainerStarted","Data":"0e456a786b26f61918b62d09ff66c95d16709a6ef02e25eb2cb32d0787c83ae4"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.566195 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" event={"ID":"330c585e-3a67-4502-b800-7401df959334","Type":"ContainerStarted","Data":"519e39fed9b4a66764ac9c2229df0502d2742b3cbdb487c5a10a76cdfffaa86d"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.567213 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.571037 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pt54n"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.573739 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.578906 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" event={"ID":"f8d184fd-e8e8-43c7-8961-fb9aa266b149","Type":"ContainerStarted","Data":"29c5895ba8cc7b3376bb2b56308cd5fb97e3b3b11aa3c4cee3a6c591ae5dcf11"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.578941 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" event={"ID":"f8d184fd-e8e8-43c7-8961-fb9aa266b149","Type":"ContainerStarted","Data":"cc9fa7232c2448865dfff229f0bd5942bccd02a0cf423da449ebddf895d3b961"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.579715 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.581999 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txf85" 
event={"ID":"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8","Type":"ContainerStarted","Data":"95b6da0c90a69d652ff4307bfff8dd4b17ddd6ce73755a1247bfc0c1f43b0c66"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.582171 4939 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z46b4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.582223 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.584920 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" event={"ID":"88bed82e-aed3-4f18-9908-f94900c9e60d","Type":"ContainerStarted","Data":"e561ae67cd1011c34bbf76b97a637d7432830d52c7f15d2acc56c381b6489bb1"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.584963 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" event={"ID":"88bed82e-aed3-4f18-9908-f94900c9e60d","Type":"ContainerStarted","Data":"bc85eb90bbf50abc242220ca406b1e685543506128dfa3c4484f47a87136f6b7"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.584977 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" event={"ID":"88bed82e-aed3-4f18-9908-f94900c9e60d","Type":"ContainerStarted","Data":"b96d7c2a2bf52820a92f2c6e752e65a6ef18d301d3f1a2d291b0a9850d32ac57"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.587537 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.590813 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" podStartSLOduration=156.590795727 podStartE2EDuration="2m36.590795727s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.590111287 +0000 UTC m=+217.189298898" watchObservedRunningTime="2026-03-18 15:40:52.590795727 +0000 UTC m=+217.189983348" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.596814 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.597245 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:40:53.097228934 +0000 UTC m=+217.696416555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.598359 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7rvrq" event={"ID":"67487f25-053f-451f-bbcb-a9487d58802e","Type":"ContainerStarted","Data":"dbf156fe6a4d0d3cdddc85fc248169d4a8e859ef3f9e23318549078741e4b5d4"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.604131 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-49228" event={"ID":"45453fff-856e-44f4-b05d-d00a91d00429","Type":"ContainerStarted","Data":"59c5bac5880c0be0d55e186511c3249465aeab8e68404378cce74469b1d435f1"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.606135 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" event={"ID":"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de","Type":"ContainerStarted","Data":"4ca9c2962ee99c02e3e4d691242562435a82743d49c11f3e4e267a8c3e702631"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.613195 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" event={"ID":"1b97b6e8-80c7-4467-a4d1-9e4848ace365","Type":"ContainerStarted","Data":"955adcf87f6d76362983801964b8a9dd12ab8148d585c4f890d8c1c677d2b598"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.613234 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" event={"ID":"1b97b6e8-80c7-4467-a4d1-9e4848ace365","Type":"ContainerStarted","Data":"e90c766af97d7448fc0c4567ba1281c0fbb92ab61b01c12cba7216a24ffbbc37"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.615516 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" event={"ID":"672c98cc-4cc7-4827-b7e2-be9dee6ba3de","Type":"ContainerStarted","Data":"e75812795d8eb7d95740073495d7d179e871a7073516a0d94697dd1b9c0a197e"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.622618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" event={"ID":"2e3b1a81-ac3c-4f98-b0f4-04a68ef2a140","Type":"ContainerStarted","Data":"6e6eb5b25092fb93b45b7fa6538a5554436d5f07b40ff04b5fe3d4c3e2cf8e05"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.625951 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pvl56" podStartSLOduration=156.625663588 podStartE2EDuration="2m36.625663588s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.619404596 +0000 UTC m=+217.218592247" watchObservedRunningTime="2026-03-18 15:40:52.625663588 +0000 UTC 
m=+217.224851219" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.651386 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" event={"ID":"7e375e2a-ef39-4b07-aa49-1a498266a487","Type":"ContainerStarted","Data":"4ac5b62dc4bdb23dd1881512c74ddad93ad5a48c6966d9ac0abc87747b5edcbd"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.651950 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" event={"ID":"7e375e2a-ef39-4b07-aa49-1a498266a487","Type":"ContainerStarted","Data":"72212ac39e1d8ff26a0a8581dba1c95fec19904c93df14108d45a5a0b09ceb11"} Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.666641 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" podStartSLOduration=156.666626415 podStartE2EDuration="2m36.666626415s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.644175344 +0000 UTC m=+217.243362975" watchObservedRunningTime="2026-03-18 15:40:52.666626415 +0000 UTC m=+217.265814036" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.690250 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7rvrq" podStartSLOduration=5.690232489 podStartE2EDuration="5.690232489s" podCreationTimestamp="2026-03-18 15:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.690032643 +0000 UTC m=+217.289220264" watchObservedRunningTime="2026-03-18 15:40:52.690232489 +0000 UTC m=+217.289420110" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.695090 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjcgk" podStartSLOduration=156.695079089 podStartE2EDuration="2m36.695079089s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.667452089 +0000 UTC m=+217.266639720" watchObservedRunningTime="2026-03-18 15:40:52.695079089 +0000 UTC m=+217.294266710" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.704525 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.705260 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.205245344 +0000 UTC m=+217.804432965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.705559 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.707061 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.207053136 +0000 UTC m=+217.806240757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.717666 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.743565 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f7r46" podStartSLOduration=156.743542043 podStartE2EDuration="2m36.743542043s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.722079691 +0000 UTC m=+217.321267302" watchObservedRunningTime="2026-03-18 15:40:52.743542043 +0000 UTC m=+217.342729664" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.744360 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-957qk" podStartSLOduration=156.744355457 podStartE2EDuration="2m36.744355457s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:52.742303487 +0000 UTC m=+217.341491108" watchObservedRunningTime="2026-03-18 15:40:52.744355457 +0000 UTC m=+217.343543078" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.752361 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9255f"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.754994 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" Mar 18 15:40:52 crc 
kubenswrapper[4939]: I0318 15:40:52.767996 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:40:52 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:40:52 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:40:52 crc kubenswrapper[4939]: healthz check failed Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.768545 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.775041 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nqdwc"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.791789 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qv8l5"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.810313 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.811498 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.311481082 +0000 UTC m=+217.910668703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.813911 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.814293 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.314285353 +0000 UTC m=+217.913472974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.817085 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t"] Mar 18 15:40:52 crc kubenswrapper[4939]: I0318 15:40:52.915111 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:52 crc kubenswrapper[4939]: E0318 15:40:52.916357 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.41632956 +0000 UTC m=+218.015517181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.017048 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.017894 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.517878743 +0000 UTC m=+218.117066364 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.118432 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.119523 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.619462217 +0000 UTC m=+218.218649838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.119802 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.120260 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.620253099 +0000 UTC m=+218.219440720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.220672 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.221058 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.72103861 +0000 UTC m=+218.320226231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.321856 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.322145 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.822134169 +0000 UTC m=+218.421321790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.423118 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.423290 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.92325744 +0000 UTC m=+218.522445061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.423408 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.423874 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:53.923863507 +0000 UTC m=+218.523051228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.503026 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58420: no serving certificate available for the kubelet" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.524579 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.524746 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.02471662 +0000 UTC m=+218.623904251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.524877 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.525192 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.025177233 +0000 UTC m=+218.624364864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.605658 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58426: no serving certificate available for the kubelet" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.625650 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.625779 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.125757158 +0000 UTC m=+218.724944779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.626065 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.626456 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.126440767 +0000 UTC m=+218.725628388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.634458 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58438: no serving certificate available for the kubelet" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.662465 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58450: no serving certificate available for the kubelet" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.671489 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" event={"ID":"8f070ed7-44cb-4a09-9f68-0efde6f58169","Type":"ContainerStarted","Data":"2ecfb824d92fed7dfd48dd36ad4dbfb0dc3e4c500ad8eba65a031d4b5b5626ab"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.673306 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" event={"ID":"b6b327b7-cf92-4bf3-9047-6c36ecf787b6","Type":"ContainerStarted","Data":"491e0b0f02b66b3023203e3c8d50dbeed10eb32737f48db0cd40166ca96aecd2"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.687851 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.687899 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.695820 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58454: no serving certificate available for the kubelet" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.696269 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-txf85" event={"ID":"d56ba309-f93a-4b35-8f2b-a0f7fe561fc8","Type":"ContainerStarted","Data":"984840f5bfbc8838708378237c9d121bf8cbfaf551cbdc3e11bf68384a6d4f96"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.720206 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-txf85" podStartSLOduration=157.720190824 podStartE2EDuration="2m37.720190824s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:53.718145745 +0000 UTC m=+218.317333366" watchObservedRunningTime="2026-03-18 15:40:53.720190824 +0000 UTC m=+218.319378435" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.725016 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" 
event={"ID":"ef08b3ba-1294-43dd-af5b-98550ede648f","Type":"ContainerStarted","Data":"051adcbcc2774e95c36f98646ceaccd68767c56a1da271e0b75c420df149ef57"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.726817 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.726955 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.226934559 +0000 UTC m=+218.826122180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.727040 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.727378 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.227370122 +0000 UTC m=+218.826557823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.729901 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" event={"ID":"1757e629-858a-4c03-8a61-119198d1dc83","Type":"ContainerStarted","Data":"8d69fbbf10b34b7c3e9b22eb05d4e377946eb1fd64434150aa3b51ea4a56f1a1"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.731635 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-49228" event={"ID":"45453fff-856e-44f4-b05d-d00a91d00429","Type":"ContainerStarted","Data":"c211369b75ab3dd5aa45a202a9a5c6fc9c77a88eeb7a5fe77512625c2816ab65"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.733483 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" event={"ID":"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de","Type":"ContainerStarted","Data":"60447d0394151a5ef4832f2882a562382589b6ce445fefd29fb1b412eba7df48"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.735275 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pt54n" event={"ID":"b1e7ea15-da6a-42c4-b136-34291813a34c","Type":"ContainerStarted","Data":"261cc7e22b71c986000267287643db4f2bd291ee12e63a56ecafae213fa693ef"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.735349 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58470: no serving certificate available for the kubelet" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.736235 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" event={"ID":"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831","Type":"ContainerStarted","Data":"bbfc03968129020adf41d94993269f75f88212e82c6cd88d552f881b51b6a2d4"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.741053 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" event={"ID":"51e9250d-312d-4e6b-9a21-bfe25d4533ff","Type":"ContainerStarted","Data":"3c4b1274b6fd39347c0908bb0870159cd742ffaae7fb8f1190378d64b2dbf4a3"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.751672 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" event={"ID":"fe66d52c-4339-4661-ac23-8ab2e68e22cc","Type":"ContainerStarted","Data":"feb219090ac57022122e5048cde92ecef66f81f402ac116b9e341a9622863ab1"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.755086 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" event={"ID":"03bc0ce6-d425-4864-883a-fdc9d6e2a460","Type":"ContainerStarted","Data":"5fa998e329b018d822cf22b6a0d438f6fa482caecbd9781e70593216b78f3fd7"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.756517 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" 
event={"ID":"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6","Type":"ContainerStarted","Data":"2d11f8409b7d375821113f2f9e511fac6b9046b5896ee96b507451db48d6141c"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.759462 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" event={"ID":"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb","Type":"ContainerStarted","Data":"20150ca69b9c0ccb699e1f8b44e5a322aecb7f29049f4584f40f083961deb443"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.760905 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" event={"ID":"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae","Type":"ContainerStarted","Data":"c7b4d01bebc2f1c47e17def4d55f59980094262cdc1d6e8b81cf726a5a103e10"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.761919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nqdwc" event={"ID":"7a2bfef5-fef2-4e27-9749-53ea69f13c0f","Type":"ContainerStarted","Data":"f83fa5e0600882622f51c8118aad18f38b86beb089f8424f40b48b71e7ce0d15"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.762674 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:40:53 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:40:53 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:40:53 crc kubenswrapper[4939]: healthz check failed Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.762739 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.774321 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" event={"ID":"45eb9355-906a-43cb-880b-b7790e7ef4f2","Type":"ContainerStarted","Data":"9e8cff36844067bec122b39911bbfe513c0a7753116239acd9d4b984d6bbc8b2"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.778290 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" event={"ID":"3dc2961e-26df-420a-9e5b-8abd43b85b2a","Type":"ContainerStarted","Data":"e73e54e44420be444a71e9fed64e71295a873fb90d22c2668d975f1e822ec881"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.782587 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" event={"ID":"3efaa2ef-c8d6-4ea3-bed8-407478c48db8","Type":"ContainerStarted","Data":"12fb30ffedbd020668091f78da8898f556953857180c980d11ec5a13928b996a"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.783605 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" event={"ID":"2adddc7a-6b85-45dc-abf2-611a810581ad","Type":"ContainerStarted","Data":"63bb66f925eb384e74589c6b34c070988713d9f86a076bb02dcccfb18bfb2b13"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.784696 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" event={"ID":"6afcb621-c97d-4bc2-a310-3eb43f3508b0","Type":"ContainerStarted","Data":"dee9d458017918e9efa38fa93dbe100570184154a85455e1ecd90f113575970f"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.786209 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" event={"ID":"54ea90ec-9918-4a13-ae59-b986b0f06c66","Type":"ContainerStarted","Data":"887cd428c8e382fb29afa4572361451d5c25f4bcc9cfcc762ca372b315193b2b"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.787747 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" event={"ID":"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d","Type":"ContainerStarted","Data":"844aa48c80abacdee8a535060eb17506c6feb0fb552d2e22bc9070c89aee4981"} Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.790109 4939 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z46b4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.790149 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.793283 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7mf68" podStartSLOduration=157.793261971 podStartE2EDuration="2m37.793261971s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:53.792992054 +0000 UTC m=+218.392179675" watchObservedRunningTime="2026-03-18 15:40:53.793261971 +0000 UTC m=+218.392449592" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.794135 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" podStartSLOduration=157.794125166 podStartE2EDuration="2m37.794125166s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:53.752794099 +0000 UTC m=+218.351981720" watchObservedRunningTime="2026-03-18 15:40:53.794125166 +0000 UTC m=+218.393312787" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.828720 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.829576 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.329554593 +0000 UTC m=+218.928742224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.922711 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58480: no serving certificate available for the kubelet" Mar 18 15:40:53 crc kubenswrapper[4939]: I0318 15:40:53.931915 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:53 crc kubenswrapper[4939]: E0318 15:40:53.932288 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.43227244 +0000 UTC m=+219.031460061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.033512 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.033740 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.533708699 +0000 UTC m=+219.132896340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.135113 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.135534 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.635488438 +0000 UTC m=+219.234676079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.236060 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.236220 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.736193746 +0000 UTC m=+219.335381367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.236725 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.237026 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.7370191 +0000 UTC m=+219.336206721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.269951 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58488: no serving certificate available for the kubelet" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.337620 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.338099 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.838079439 +0000 UTC m=+219.437267070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.439995 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.440605 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:54.940582089 +0000 UTC m=+219.539769710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.542079 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.542351 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.042318517 +0000 UTC m=+219.641506188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.542723 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.543043 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.043033158 +0000 UTC m=+219.642220779 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.559029 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.643406 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.644323 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.144290182 +0000 UTC m=+219.743477803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.745695 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.746379 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.24636815 +0000 UTC m=+219.845555771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.766381 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:40:54 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:40:54 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:40:54 crc kubenswrapper[4939]: healthz check failed Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.766448 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.796878 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" event={"ID":"3efaa2ef-c8d6-4ea3-bed8-407478c48db8","Type":"ContainerStarted","Data":"92448a7cb39a12df33ec381c3d64a1877e123bc5457e681f44619bb700dd613f"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.799154 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" event={"ID":"03bc0ce6-d425-4864-883a-fdc9d6e2a460","Type":"ContainerStarted","Data":"27362cd3e19a658332b9eb1050119c7a4df766af6ae3eee5fe8e0c6c6a79dfb7"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.799185 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" event={"ID":"03bc0ce6-d425-4864-883a-fdc9d6e2a460","Type":"ContainerStarted","Data":"905fabd81ea4ae4d4e4e97906963878b155e4aa6b936d7e588153a73bbe807b2"} 
Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.802372 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" event={"ID":"6afcb621-c97d-4bc2-a310-3eb43f3508b0","Type":"ContainerStarted","Data":"213e488fc283de6ee11305c69a223f0c9f1659007c7da56ce9305200f894a9ab"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.802926 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.804656 4939 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8zp5q container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.804700 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" podUID="6afcb621-c97d-4bc2-a310-3eb43f3508b0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.805252 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" event={"ID":"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb","Type":"ContainerStarted","Data":"d37c5e68a2dcf25b1365721fc0cb97b084ebf2ef0a20dacf24da23584cc09ba1"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.805620 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.806576 4939 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qv8l5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.806635 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.808399 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" event={"ID":"34955821-9f88-452b-b5c9-f86a93e3f427","Type":"ContainerStarted","Data":"4ac3691414e5bc39d14a05aa0a61224af6ef65dbefbf1d7761caf38d0d363680"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.808426 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" event={"ID":"34955821-9f88-452b-b5c9-f86a93e3f427","Type":"ContainerStarted","Data":"065788c6641502a00f88e935ada7857bb8729303838238d9903f42c62f92a749"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.811483 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" 
event={"ID":"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d","Type":"ContainerStarted","Data":"43538602a22c0285d4046edd5515b316a946dadd6237906fc3acc13467f5c53c"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.811539 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" event={"ID":"b0399b5b-fd72-481f-b94b-ee3e3ab0db6d","Type":"ContainerStarted","Data":"4e442a291b590d7d69966e39fffad22e3947dff4d230c2aad7a16dd7674b94dc"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.811583 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.812876 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7gmkn" event={"ID":"e65663eb-5c53-4a9f-81d7-6356a33dc7b7","Type":"ContainerStarted","Data":"cfb50daa7e1a1e6ccc3238744bdf0b4d4039b915a1acfb2c250e4183b9f9f088"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.813784 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7gmkn" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.816145 4939 patch_prober.go:28] interesting pod/downloads-7954f5f757-7gmkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.816204 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7gmkn" podUID="e65663eb-5c53-4a9f-81d7-6356a33dc7b7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.820444 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" event={"ID":"b6b327b7-cf92-4bf3-9047-6c36ecf787b6","Type":"ContainerStarted","Data":"f689a3be4a075f3d790a425a43de55880dd4365df15666c364cdcca661c1d752"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.831981 4939 generic.go:334] "Generic (PLEG): container finished" podID="1757e629-858a-4c03-8a61-119198d1dc83" containerID="cf873598a6f1d4782d0f210f45acea0b47a3c24196776f39313d6df23a29d0f1" exitCode=0 Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.832365 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" event={"ID":"1757e629-858a-4c03-8a61-119198d1dc83","Type":"ContainerDied","Data":"cf873598a6f1d4782d0f210f45acea0b47a3c24196776f39313d6df23a29d0f1"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.834828 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-49228" event={"ID":"45453fff-856e-44f4-b05d-d00a91d00429","Type":"ContainerStarted","Data":"d61f5b67185d08503156055e176a21b9c86e65ae0a5f0b2c59db608d334f9d4b"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.835269 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-49228" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.836855 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" 
event={"ID":"8f070ed7-44cb-4a09-9f68-0efde6f58169","Type":"ContainerStarted","Data":"f7aae7647802bf533722ec7058f8121a5286a192d32acf5bc938d66e08318fce"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.837556 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.839271 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" event={"ID":"b356cad2-134f-4910-875f-71c38fda3cee","Type":"ContainerStarted","Data":"1ddf328e9b5d6e0da07017cf48a0da30b4b1ee909ebd1410908316e2e63d3b65"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.841848 4939 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nmbcl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:5443/healthz\": dial tcp 10.217.0.20:5443: connect: connection refused" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.841908 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" podUID="8f070ed7-44cb-4a09-9f68-0efde6f58169" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.20:5443/healthz\": dial tcp 10.217.0.20:5443: connect: connection refused" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.842773 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9255f" podStartSLOduration=158.842763973 podStartE2EDuration="2m38.842763973s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:54.84231728 +0000 UTC m=+219.441504891" watchObservedRunningTime="2026-03-18 15:40:54.842763973 +0000 UTC m=+219.441951594" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.847107 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.847534 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.347519621 +0000 UTC m=+219.946707242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.853882 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" event={"ID":"51e9250d-312d-4e6b-9a21-bfe25d4533ff","Type":"ContainerStarted","Data":"a2622f14df02929ba2241990b9b11f9ad0bbf656f43015b97097480ad044307c"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.853989 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" event={"ID":"51e9250d-312d-4e6b-9a21-bfe25d4533ff","Type":"ContainerStarted","Data":"b187d1253bfec4b46d27f114e89c48474536c62e5adc5838b0ba04c73f014ea1"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.855957 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" event={"ID":"3e7741bb-cf1b-422c-8cd6-2c8b41d3d831","Type":"ContainerStarted","Data":"174307e4c976f1c1bfda69185170932f3af2fe45c9720717a31ad59e05186fd9"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.858169 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" event={"ID":"ef08b3ba-1294-43dd-af5b-98550ede648f","Type":"ContainerStarted","Data":"6e571e19292251ba727dcbe9cccd60a95abbbb412927661f152f4d863fe914f9"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.858223 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" event={"ID":"ef08b3ba-1294-43dd-af5b-98550ede648f","Type":"ContainerStarted","Data":"10cd5f04e8e7f4bf12171ef9c5a27612290aaa35fb6dd73995908fef0fd3a96b"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.860082 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" podStartSLOduration=158.860071175 podStartE2EDuration="2m38.860071175s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:54.859325193 +0000 UTC m=+219.458512814" watchObservedRunningTime="2026-03-18 15:40:54.860071175 +0000 UTC m=+219.459258806" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.870433 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" event={"ID":"54ea90ec-9918-4a13-ae59-b986b0f06c66","Type":"ContainerStarted","Data":"18c6919f2db8bbb0c605d3f7301f02b50d76c83166c9468db67b737c9a52d026"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.873640 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pt54n" event={"ID":"b1e7ea15-da6a-42c4-b136-34291813a34c","Type":"ContainerStarted","Data":"70025c30331524c56c2477a9ff93357d64c29ac66f7b1c8c3bf41bdbec442dd1"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.875028 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-nqdwc" event={"ID":"7a2bfef5-fef2-4e27-9749-53ea69f13c0f","Type":"ContainerStarted","Data":"4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.877661 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" event={"ID":"45eb9355-906a-43cb-880b-b7790e7ef4f2","Type":"ContainerStarted","Data":"dce9e63fdd6a25b8016bcc35e4278c298804a7e4b59df66217195a2ceb9c129f"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.880262 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffvz" podStartSLOduration=158.880250899 podStartE2EDuration="2m38.880250899s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:54.878869059 +0000 UTC m=+219.478056680" watchObservedRunningTime="2026-03-18 15:40:54.880250899 +0000 UTC m=+219.479438520" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.888998 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" event={"ID":"fe66d52c-4339-4661-ac23-8ab2e68e22cc","Type":"ContainerStarted","Data":"5793b7316307006712521eceb2a691ebd964954e7b9104a9746546e357f900ff"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.900757 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-49228" podStartSLOduration=7.900739873 podStartE2EDuration="7.900739873s" podCreationTimestamp="2026-03-18 15:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:54.898342334 +0000 UTC m=+219.497529975" watchObservedRunningTime="2026-03-18 15:40:54.900739873 +0000 UTC m=+219.499927494" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.904962 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" event={"ID":"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6","Type":"ContainerStarted","Data":"94a3e434b16214f296e6ca0e038cce59f03fb7027ce26996e895a815f9112204"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.904992 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" event={"ID":"d8ef3924-86f8-4cbf-90f8-c2d364efa0b6","Type":"ContainerStarted","Data":"62a51c132b1bbf110cb57afe59d5297ea757f4e35c2de61b8326c15d7b91e3b2"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.921881 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" event={"ID":"672c98cc-4cc7-4827-b7e2-be9dee6ba3de","Type":"ContainerStarted","Data":"42c8684cd26ad768d71ed998e8462b8ba27e4c3d8cf06357d70b37482dc1383a"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.922669 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.923554 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sv5w4" podStartSLOduration=158.923544834 
podStartE2EDuration="2m38.923544834s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:54.923174243 +0000 UTC m=+219.522361864" watchObservedRunningTime="2026-03-18 15:40:54.923544834 +0000 UTC m=+219.522732445" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.923713 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.923768 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-txf85" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.928678 4939 patch_prober.go:28] interesting pod/apiserver-76f77b778f-txf85 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.928752 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-txf85" podUID="d56ba309-f93a-4b35-8f2b-a0f7fe561fc8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.942858 4939 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2x4rk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.942901 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" podUID="672c98cc-4cc7-4827-b7e2-be9dee6ba3de" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.949200 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:54 crc kubenswrapper[4939]: E0318 15:40:54.953347 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.453333867 +0000 UTC m=+220.052521478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.956426 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" event={"ID":"da29fa7d-f88e-4b1c-8aeb-e8f4116798ae","Type":"ContainerStarted","Data":"8469df2297981ff78e2e6d270a899c05707aa3e13d9bc46973d78e297da6e1c9"} Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.957543 4939 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z46b4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.957590 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.961758 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58490: no serving certificate available for the kubelet" Mar 18 15:40:54 crc kubenswrapper[4939]: I0318 15:40:54.975522 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dz6vg" podStartSLOduration=158.975491239 podStartE2EDuration="2m38.975491239s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:54.974884552 +0000 UTC m=+219.574072163" watchObservedRunningTime="2026-03-18 15:40:54.975491239 +0000 UTC m=+219.574678850" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.001474 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7gmkn" podStartSLOduration=159.001456592 podStartE2EDuration="2m39.001456592s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.000745271 +0000 UTC m=+219.599932902" watchObservedRunningTime="2026-03-18 15:40:55.001456592 +0000 UTC m=+219.600644213" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.026869 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" podStartSLOduration=159.026850537 podStartE2EDuration="2m39.026850537s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.02590493 +0000 UTC m=+219.625092551" watchObservedRunningTime="2026-03-18 15:40:55.026850537 +0000 UTC m=+219.626038168" Mar 
18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.055057 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.058078 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.558033151 +0000 UTC m=+220.157220782 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.082653 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkkbz" podStartSLOduration=159.082633164 podStartE2EDuration="2m39.082633164s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.064039415 +0000 UTC m=+219.663227046" watchObservedRunningTime="2026-03-18 15:40:55.082633164 +0000 UTC m=+219.681820785" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.084733 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" podStartSLOduration=159.084726755 podStartE2EDuration="2m39.084726755s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.08215513 +0000 UTC m=+219.681342751" watchObservedRunningTime="2026-03-18 15:40:55.084726755 +0000 UTC m=+219.683914376" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.108384 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" podStartSLOduration=159.108361289 podStartE2EDuration="2m39.108361289s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.098636888 +0000 UTC m=+219.697824509" watchObservedRunningTime="2026-03-18 15:40:55.108361289 +0000 UTC m=+219.707548910" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.114870 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mw5nn" podStartSLOduration=159.114847887 podStartE2EDuration="2m39.114847887s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
15:40:55.113255441 +0000 UTC m=+219.712443062" watchObservedRunningTime="2026-03-18 15:40:55.114847887 +0000 UTC m=+219.714035508" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.160966 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2lc49" podStartSLOduration=159.160950533 podStartE2EDuration="2m39.160950533s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.16049101 +0000 UTC m=+219.759678631" watchObservedRunningTime="2026-03-18 15:40:55.160950533 +0000 UTC m=+219.760138154" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.161147 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.161416 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fj57c" podStartSLOduration=159.161409857 podStartE2EDuration="2m39.161409857s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.135366212 +0000 UTC m=+219.734553833" watchObservedRunningTime="2026-03-18 15:40:55.161409857 +0000 UTC m=+219.760597478" Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.161432 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.661420727 +0000 UTC m=+220.260608348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.190851 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-nw2wt" podStartSLOduration=159.190832599 podStartE2EDuration="2m39.190832599s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.18981279 +0000 UTC m=+219.789000411" watchObservedRunningTime="2026-03-18 15:40:55.190832599 +0000 UTC m=+219.790020220" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.229222 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pt54n" podStartSLOduration=8.229204341 podStartE2EDuration="8.229204341s" podCreationTimestamp="2026-03-18 15:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.21569388 +0000 UTC m=+219.814881501" watchObservedRunningTime="2026-03-18 15:40:55.229204341 +0000 UTC m=+219.828391962" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.232300 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vttdd" podStartSLOduration=159.23228476 podStartE2EDuration="2m39.23228476s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.229275833 +0000 UTC m=+219.828463444" watchObservedRunningTime="2026-03-18 15:40:55.23228476 +0000 UTC m=+219.831472381" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.255529 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" podStartSLOduration=159.255491403 podStartE2EDuration="2m39.255491403s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.248400277 +0000 UTC m=+219.847587898" watchObservedRunningTime="2026-03-18 15:40:55.255491403 +0000 UTC m=+219.854679024" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.264142 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.264366 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:40:55.764345619 +0000 UTC m=+220.363533240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.264549 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.264856 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.764843394 +0000 UTC m=+220.364031015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.264978 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xw29t" podStartSLOduration=159.264964427 podStartE2EDuration="2m39.264964427s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.264122403 +0000 UTC m=+219.863310024" watchObservedRunningTime="2026-03-18 15:40:55.264964427 +0000 UTC m=+219.864152048" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.298430 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nqdwc" podStartSLOduration=159.298412607 podStartE2EDuration="2m39.298412607s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.296881832 +0000 UTC m=+219.896069443" watchObservedRunningTime="2026-03-18 15:40:55.298412607 +0000 UTC m=+219.897600228" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.299853 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z4fwm" podStartSLOduration=159.299848388 podStartE2EDuration="2m39.299848388s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.277773769 +0000 UTC m=+219.876961390" 
watchObservedRunningTime="2026-03-18 15:40:55.299848388 +0000 UTC m=+219.899036009" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.312614 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-v4pxp" podStartSLOduration=159.312599468 podStartE2EDuration="2m39.312599468s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:55.311565128 +0000 UTC m=+219.910752749" watchObservedRunningTime="2026-03-18 15:40:55.312599468 +0000 UTC m=+219.911787089" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.365383 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.365558 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.865498541 +0000 UTC m=+220.464686172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.365786 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.366054 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.866046636 +0000 UTC m=+220.465234257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.466790 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.466990 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.966962091 +0000 UTC m=+220.566149712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.467061 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.467385 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:55.967378163 +0000 UTC m=+220.566565784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.568815 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.568995 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.068953716 +0000 UTC m=+220.668141337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.569219 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.569713 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.069705338 +0000 UTC m=+220.668892959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.670101 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.670482 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.170446837 +0000 UTC m=+220.769634458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.766198 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:40:55 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:40:55 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:40:55 crc kubenswrapper[4939]: healthz check failed Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.766545 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.772307 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.772702 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.27268577 +0000 UTC m=+220.871873391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.873235 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.873456 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.373420859 +0000 UTC m=+220.972608480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.873708 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.874105 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.374090218 +0000 UTC m=+220.973277839 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.936844 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.937411 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.940889 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.941100 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.950245 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.954715 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jd2sf" Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.974493 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:55 crc kubenswrapper[4939]: E0318 15:40:55.974767 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.474752324 +0000 UTC m=+221.073939945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:55 crc kubenswrapper[4939]: I0318 15:40:55.985090 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" event={"ID":"1757e629-858a-4c03-8a61-119198d1dc83","Type":"ContainerStarted","Data":"5eebec976a8de7c299444c83ff7a12a75ed6cc9426c1e5ca73b7e5a3304b3991"} Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:55.999399 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" event={"ID":"2adddc7a-6b85-45dc-abf2-611a810581ad","Type":"ContainerStarted","Data":"b700b45716e112b0e1eaec81aca53d59344d8287aeaf3d43019fba2fc9e86982"} Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.009285 4939 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nmbcl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:5443/healthz\": dial tcp 10.217.0.20:5443: connect: connection refused" start-of-body= Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.009429 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" podUID="8f070ed7-44cb-4a09-9f68-0efde6f58169" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.20:5443/healthz\": dial tcp 10.217.0.20:5443: connect: connection refused" Mar 18 15:40:56 crc kubenswrapper[4939]: 
I0318 15:40:56.009289 4939 patch_prober.go:28] interesting pod/downloads-7954f5f757-7gmkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.009898 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7gmkn" podUID="e65663eb-5c53-4a9f-81d7-6356a33dc7b7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.009308 4939 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8zp5q container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.010037 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" podUID="6afcb621-c97d-4bc2-a310-3eb43f3508b0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.010531 4939 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2x4rk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.010611 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" podUID="672c98cc-4cc7-4827-b7e2-be9dee6ba3de" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.010732 4939 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qv8l5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.010819 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.066748 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq" podStartSLOduration=160.066729769 podStartE2EDuration="2m40.066729769s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:56.065918746 +0000 UTC m=+220.665106367" watchObservedRunningTime="2026-03-18 15:40:56.066729769 +0000 UTC m=+220.665917390" Mar 18 15:40:56 crc 
kubenswrapper[4939]: I0318 15:40:56.080669 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.080875 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.081063 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.081586 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.581573849 +0000 UTC m=+221.180761470 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.172711 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z46b4"] Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.172957 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerName="controller-manager" containerID="cri-o://29c5895ba8cc7b3376bb2b56308cd5fb97e3b3b11aa3c4cee3a6c591ae5dcf11" gracePeriod=30 Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.184538 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.184857 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.184978 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.185964 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.685941774 +0000 UTC m=+221.285129395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.187356 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.187696 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.263339 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.278427 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"] Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.278722 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" podUID="0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" containerName="route-controller-manager" containerID="cri-o://dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7" gracePeriod=30 Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.286281 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.286665 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.786652892 +0000 UTC m=+221.385840513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.310912 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58500: no serving certificate available for the kubelet" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.386949 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.387130 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.887104513 +0000 UTC m=+221.486292134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.387243 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.387564 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.887553126 +0000 UTC m=+221.486740747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.488648 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.488944 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:56.988928763 +0000 UTC m=+221.588116384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.555381 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.590524 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.590883 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.090867357 +0000 UTC m=+221.690054978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.691982 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.692413 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.192396519 +0000 UTC m=+221.791584140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.788548 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:40:56 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:40:56 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:40:56 crc kubenswrapper[4939]: healthz check failed Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.788624 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.799004 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.799365 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.299353219 +0000 UTC m=+221.898540840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.902074 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.902481 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.402427736 +0000 UTC m=+222.001615377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.902648 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:56 crc kubenswrapper[4939]: E0318 15:40:56.902951 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.40293904 +0000 UTC m=+222.002126661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:56 crc kubenswrapper[4939]: I0318 15:40:56.934961 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.005446 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.006003 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.505987317 +0000 UTC m=+222.105174938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.014169 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.064803 4939 generic.go:334] "Generic (PLEG): container finished" podID="0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" containerID="dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7" exitCode=0 Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.064871 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" event={"ID":"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976","Type":"ContainerDied","Data":"dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7"} Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.064899 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" event={"ID":"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976","Type":"ContainerDied","Data":"4c922e95f654a368eeae2e9797d487893d430a61e39018e8a09a7ce75a901622"} Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.064919 4939 scope.go:117] "RemoveContainer" containerID="dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.065040 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.076646 4939 generic.go:334] "Generic (PLEG): container finished" podID="b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" containerID="60447d0394151a5ef4832f2882a562382589b6ce445fefd29fb1b412eba7df48" exitCode=0 Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.076715 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" event={"ID":"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de","Type":"ContainerDied","Data":"60447d0394151a5ef4832f2882a562382589b6ce445fefd29fb1b412eba7df48"} Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.086587 4939 generic.go:334] "Generic (PLEG): container finished" podID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerID="29c5895ba8cc7b3376bb2b56308cd5fb97e3b3b11aa3c4cee3a6c591ae5dcf11" exitCode=0 Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.089733 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" event={"ID":"f8d184fd-e8e8-43c7-8961-fb9aa266b149","Type":"ContainerDied","Data":"29c5895ba8cc7b3376bb2b56308cd5fb97e3b3b11aa3c4cee3a6c591ae5dcf11"} Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.141219 4939 scope.go:117] "RemoveContainer" containerID="dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.146647 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7\": container with ID starting with dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7 not found: ID does not exist" containerID="dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.146704 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7"} err="failed to get container status \"dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7\": rpc error: code = NotFound desc = could not find container \"dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7\": container with ID starting with dca549a50301be0f5837fe8141cd829e92bf3432a310f97d85fc545aa5deb4c7 not found: ID does not exist" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.147380 4939 patch_prober.go:28] interesting pod/downloads-7954f5f757-7gmkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.147432 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7gmkn" podUID="e65663eb-5c53-4a9f-81d7-6356a33dc7b7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.148773 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-client-ca\") pod \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\" (UID: 
\"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.148856 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-serving-cert\") pod \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.149326 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfvzj\" (UniqueName: \"kubernetes.io/projected/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-kube-api-access-bfvzj\") pod \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.149427 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-config\") pod \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\" (UID: \"0afb8cbc-8a92-4ee3-9541-ff4d2faa0976\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.149754 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.150355 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-client-ca" (OuterVolumeSpecName: "client-ca") pod "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" (UID: "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.150563 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-config" (OuterVolumeSpecName: "config") pod "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" (UID: "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.151400 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.65138593 +0000 UTC m=+222.250573551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.159880 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2x4rk" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.161515 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-kube-api-access-bfvzj" (OuterVolumeSpecName: "kube-api-access-bfvzj") pod "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" (UID: "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976"). InnerVolumeSpecName "kube-api-access-bfvzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.193773 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zp5q" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.205896 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" (UID: "0afb8cbc-8a92-4ee3-9541-ff4d2faa0976"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.240050 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.265623 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmz4v\" (UniqueName: \"kubernetes.io/projected/f8d184fd-e8e8-43c7-8961-fb9aa266b149-kube-api-access-cmz4v\") pod \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.265888 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.265986 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-proxy-ca-bundles\") pod \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.266318 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-config\") pod \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.266355 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-client-ca\") pod \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.266378 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d184fd-e8e8-43c7-8961-fb9aa266b149-serving-cert\") pod \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\" (UID: \"f8d184fd-e8e8-43c7-8961-fb9aa266b149\") " Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.268891 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfvzj\" (UniqueName: \"kubernetes.io/projected/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-kube-api-access-bfvzj\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.268928 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.268942 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.268955 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.275422 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-config" (OuterVolumeSpecName: "config") pod 
"f8d184fd-e8e8-43c7-8961-fb9aa266b149" (UID: "f8d184fd-e8e8-43c7-8961-fb9aa266b149"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.278786 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d184fd-e8e8-43c7-8961-fb9aa266b149-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f8d184fd-e8e8-43c7-8961-fb9aa266b149" (UID: "f8d184fd-e8e8-43c7-8961-fb9aa266b149"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.280024 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f8d184fd-e8e8-43c7-8961-fb9aa266b149" (UID: "f8d184fd-e8e8-43c7-8961-fb9aa266b149"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.280155 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.780135071 +0000 UTC m=+222.379322692 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.281796 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-client-ca" (OuterVolumeSpecName: "client-ca") pod "f8d184fd-e8e8-43c7-8961-fb9aa266b149" (UID: "f8d184fd-e8e8-43c7-8961-fb9aa266b149"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.313681 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d184fd-e8e8-43c7-8961-fb9aa266b149-kube-api-access-cmz4v" (OuterVolumeSpecName: "kube-api-access-cmz4v") pod "f8d184fd-e8e8-43c7-8961-fb9aa266b149" (UID: "f8d184fd-e8e8-43c7-8961-fb9aa266b149"). InnerVolumeSpecName "kube-api-access-cmz4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.370634 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.370776 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmz4v\" (UniqueName: \"kubernetes.io/projected/f8d184fd-e8e8-43c7-8961-fb9aa266b149-kube-api-access-cmz4v\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.370790 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.370800 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.370810 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8d184fd-e8e8-43c7-8961-fb9aa266b149-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.370818 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d184fd-e8e8-43c7-8961-fb9aa266b149-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.371107 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.871095416 +0000 UTC m=+222.470283037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.397374 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"] Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.400968 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8jwp6"] Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.472026 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.472163 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.972133484 +0000 UTC m=+222.571321115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.472371 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.472713 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:57.972701351 +0000 UTC m=+222.571888972 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.573673 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.574142 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.07412101 +0000 UTC m=+222.673308631 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.574492 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.574892 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.074870621 +0000 UTC m=+222.674058242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.675300 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.675796 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.175777305 +0000 UTC m=+222.774964926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.766014 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:40:57 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:40:57 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:40:57 crc kubenswrapper[4939]: healthz check failed Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.766070 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.767894 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nmbcl" Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.777563 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.777943 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.277929125 +0000 UTC m=+222.877116746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.878897 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.879144 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.379124148 +0000 UTC m=+222.978311769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.879241 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.879626 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.379615082 +0000 UTC m=+222.978802703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.980297 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.980467 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.480450044 +0000 UTC m=+223.079637665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:57 crc kubenswrapper[4939]: I0318 15:40:57.980607 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:57 crc kubenswrapper[4939]: E0318 15:40:57.980869 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.480862406 +0000 UTC m=+223.080050027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.081523 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.081665 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.581635376 +0000 UTC m=+223.180823017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.081905 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.082200 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.582188612 +0000 UTC m=+223.181376223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.120563 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" event={"ID":"f8d184fd-e8e8-43c7-8961-fb9aa266b149","Type":"ContainerDied","Data":"cc9fa7232c2448865dfff229f0bd5942bccd02a0cf423da449ebddf895d3b961"} Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.120626 4939 scope.go:117] "RemoveContainer" containerID="29c5895ba8cc7b3376bb2b56308cd5fb97e3b3b11aa3c4cee3a6c591ae5dcf11" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.120722 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z46b4" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.128046 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa2cde87-c2c4-409a-a823-9884cd51ea9a","Type":"ContainerStarted","Data":"c59bfe1417cc860626a165ebb485560d242e4df980f4fbfe3a0da0327507fe58"} Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.128163 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa2cde87-c2c4-409a-a823-9884cd51ea9a","Type":"ContainerStarted","Data":"a970366022b027faa88c09a68f524b29e000499fa4e6dbbcf3603bc93e6ceb6d"} Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.140578 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" path="/var/lib/kubelet/pods/0afb8cbc-8a92-4ee3-9541-ff4d2faa0976/volumes" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.170000 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.169981366 podStartE2EDuration="3.169981366s" podCreationTimestamp="2026-03-18 15:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:58.155827876 +0000 UTC m=+222.755015497" watchObservedRunningTime="2026-03-18 15:40:58.169981366 +0000 UTC m=+222.769168987" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.172868 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z46b4"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.184845 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.186617 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:40:58.686590067 +0000 UTC m=+223.285777698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.193380 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z46b4"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.288634 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.289086 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.789070027 +0000 UTC m=+223.388257648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.393434 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.393789 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.893774251 +0000 UTC m=+223.492961872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.393807 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4cq6w"] Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.393975 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" containerName="route-controller-manager" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.393986 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" containerName="route-controller-manager" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.393998 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerName="controller-manager" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.394004 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerName="controller-manager" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.394076 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" containerName="controller-manager" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.394093 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afb8cbc-8a92-4ee3-9541-ff4d2faa0976" containerName="route-controller-manager" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.394747 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.398403 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.405849 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cq6w"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.441991 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.442960 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.450403 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.450748 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.451191 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.451927 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.453553 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.454843 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.462534 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.469825 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.469990 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.476482 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.476805 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.476855 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.476943 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.477074 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.477388 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.477565 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.479931 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.488891 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.496848 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.496950 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-utilities\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.496993 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-config\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.497016 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-serving-cert\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.497033 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-catalog-content\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.497127 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-proxy-ca-bundles\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.497154 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-client-ca\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.497205 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vnps\" (UniqueName: \"kubernetes.io/projected/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-kube-api-access-6vnps\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.497318 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv4ps\" (UniqueName: \"kubernetes.io/projected/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-kube-api-access-qv4ps\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.498072 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:58.998031742 +0000 UTC m=+223.597219363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.604951 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-secret-volume\") pod \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605369 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-config-volume\") pod \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605539 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605580 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt7dh\" (UniqueName: \"kubernetes.io/projected/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-kube-api-access-dt7dh\") pod \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\" (UID: \"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de\") " Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605782 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv4ps\" (UniqueName: \"kubernetes.io/projected/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-kube-api-access-qv4ps\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605890 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-client-ca\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605918 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-utilities\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605936 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-config\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " 
pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605954 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-serving-cert\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605969 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-catalog-content\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.605999 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-proxy-ca-bundles\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.606014 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-serving-cert\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.606033 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-client-ca\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.606048 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vnps\" (UniqueName: \"kubernetes.io/projected/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-kube-api-access-6vnps\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.606072 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvml9\" (UniqueName: \"kubernetes.io/projected/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-kube-api-access-xvml9\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.606091 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-config\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 
15:40:58.606198 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.106180096 +0000 UTC m=+223.705367717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.606468 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" (UID: "b5f6faf0-bf2d-4679-bd53-5a1f529ad2de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.607884 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-catalog-content\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.608459 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-utilities\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.609780 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-config\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.610972 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-client-ca\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.614310 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" (UID: "b5f6faf0-bf2d-4679-bd53-5a1f529ad2de"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.614978 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-proxy-ca-bundles\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.618459 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-serving-cert\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.623286 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-kube-api-access-dt7dh" (OuterVolumeSpecName: "kube-api-access-dt7dh") pod "b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" (UID: "b5f6faf0-bf2d-4679-bd53-5a1f529ad2de"). InnerVolumeSpecName "kube-api-access-dt7dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.642993 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv4ps\" (UniqueName: \"kubernetes.io/projected/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-kube-api-access-qv4ps\") pod \"controller-manager-6c5bffdf46-psbxd\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.644310 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vnps\" (UniqueName: \"kubernetes.io/projected/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-kube-api-access-6vnps\") pod \"certified-operators-4cq6w\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707141 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-client-ca\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707206 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-serving-cert\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707239 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvml9\" (UniqueName: \"kubernetes.io/projected/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-kube-api-access-xvml9\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707257 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-config\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707314 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707357 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707366 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt7dh\" (UniqueName: \"kubernetes.io/projected/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-kube-api-access-dt7dh\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.707376 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.707620 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.207610425 +0000 UTC m=+223.806798046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.708435 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-client-ca\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.709273 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-config\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.714389 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-serving-cert\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.732392 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvml9\" (UniqueName: \"kubernetes.io/projected/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-kube-api-access-xvml9\") pod \"route-controller-manager-78bd474b69-mjlgb\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.743386 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.761801 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:40:58 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:40:58 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:40:58 crc kubenswrapper[4939]: healthz check failed Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.761862 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.779177 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4rkpb"] Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.779413 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" containerName="collect-profiles" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.779431 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" containerName="collect-profiles" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.779564 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" containerName="collect-profiles" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.780288 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.785051 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.805984 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.807750 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.807938 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.307912481 +0000 UTC m=+223.907100102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.808011 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.808328 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.308316463 +0000 UTC m=+223.907504084 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.845230 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rkpb"] Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.908956 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.909112 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.409081673 +0000 UTC m=+224.008269294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.909236 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.909300 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-utilities\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.909335 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-catalog-content\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.909364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbcf\" (UniqueName: \"kubernetes.io/projected/9a58c5aa-9e72-41c1-832c-6fba2efc3766-kube-api-access-tgbcf\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:58 crc kubenswrapper[4939]: E0318 15:40:58.909586 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.409575607 +0000 UTC m=+224.008763288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.922518 4939 ???:1] "http: TLS handshake error from 192.168.126.11:58514: no serving certificate available for the kubelet" Mar 18 15:40:58 crc kubenswrapper[4939]: I0318 15:40:58.995199 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-thb6s"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.010460 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.012167 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.512066907 +0000 UTC m=+224.111254528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.013360 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.013454 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-utilities\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.013517 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-catalog-content\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.013563 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbcf\" (UniqueName: \"kubernetes.io/projected/9a58c5aa-9e72-41c1-832c-6fba2efc3766-kube-api-access-tgbcf\") 
pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.014399 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-utilities\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.014983 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.514971841 +0000 UTC m=+224.114159462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.015087 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-thb6s"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.015125 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-catalog-content\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.016056 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.018706 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.048467 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbcf\" (UniqueName: \"kubernetes.io/projected/9a58c5aa-9e72-41c1-832c-6fba2efc3766-kube-api-access-tgbcf\") pod \"certified-operators-4rkpb\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.095950 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.096738 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.099012 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.101259 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.105018 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.105556 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.114904 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.115247 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-utilities\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.115457 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psf4w\" (UniqueName: \"kubernetes.io/projected/c9a596c4-2674-4c46-ab00-c8167b950bc9-kube-api-access-psf4w\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.115566 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-catalog-content\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.115732 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.615714121 +0000 UTC m=+224.214901742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.145210 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" event={"ID":"2adddc7a-6b85-45dc-abf2-611a810581ad","Type":"ContainerStarted","Data":"b9c057a88c245b44bd22bb9a9c0ffba2a3818d7478edc9836a14e87d48b185bf"} Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.146959 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" event={"ID":"b5f6faf0-bf2d-4679-bd53-5a1f529ad2de","Type":"ContainerDied","Data":"4ca9c2962ee99c02e3e4d691242562435a82743d49c11f3e4e267a8c3e702631"} Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.147008 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca9c2962ee99c02e3e4d691242562435a82743d49c11f3e4e267a8c3e702631" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.147095 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.172471 4939 generic.go:334] "Generic (PLEG): container finished" podID="fa2cde87-c2c4-409a-a823-9884cd51ea9a" containerID="c59bfe1417cc860626a165ebb485560d242e4df980f4fbfe3a0da0327507fe58" exitCode=0 Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.172558 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa2cde87-c2c4-409a-a823-9884cd51ea9a","Type":"ContainerDied","Data":"c59bfe1417cc860626a165ebb485560d242e4df980f4fbfe3a0da0327507fe58"} Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.176477 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8pxb8"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.177780 4939 util.go:30] "No sandbox for pod can be found. 
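The nestedpendingoperations errors above repeat roughly every 500ms because kubelet arms a per-volume retry deadline ("No retries permitted until ...") after each failed MountDevice or TearDown; the operations cannot succeed until the hostpath CSI driver finishes registering. A minimal Go sketch of that arm-and-check backoff pattern, assuming a 500ms initial delay that doubles per consecutive failure (illustrative names only, not kubelet's actual types):

    package main

    import (
    	"errors"
    	"fmt"
    	"time"
    )

    // exponentialBackoff tracks a retry deadline for one volume operation,
    // mirroring the "no retries permitted until ..." lines in the log.
    type exponentialBackoff struct {
    	lastError time.Time
    	duration  time.Duration // durationBeforeRetry in the log
    }

    const (
    	initialBackoff = 500 * time.Millisecond
    	maxBackoff     = 2 * time.Minute // cap is an assumption for this sketch
    )

    // update arms (or doubles) the wait after a failure, capped at maxBackoff.
    func (b *exponentialBackoff) update(now time.Time) {
    	if b.duration == 0 {
    		b.duration = initialBackoff
    	} else if d := b.duration * 2; d <= maxBackoff {
    		b.duration = d
    	}
    	b.lastError = now
    }

    // safeToRetry reports an error while the deadline has not yet passed.
    func (b *exponentialBackoff) safeToRetry(now time.Time) error {
    	if deadline := b.lastError.Add(b.duration); now.Before(deadline) {
    		return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)",
    			deadline.Format(time.RFC3339Nano), b.duration)
    	}
    	return nil
    }

    func main() {
    	var b exponentialBackoff
    	mount := func() error {
    		return errors.New("driver not found in the list of registered CSI drivers")
    	}
    	for attempt := 1; attempt <= 3; attempt++ {
    		if err := b.safeToRetry(time.Now()); err != nil {
    			fmt.Println("reconciler skips:", err)
    			time.Sleep(b.duration) // wait out the deadline, as the next reconciler pass would
    		}
    		if err := mount(); err != nil {
    			b.update(time.Now())
    			fmt.Printf("attempt %d failed: %v (next durationBeforeRetry %s)\n", attempt, err, b.duration)
    		}
    	}
    }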
Need to start a new one" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.197748 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8pxb8"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.220618 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1805f6b-135f-47d3-829d-e6e5020c9e22-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.220684 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-utilities\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.221490 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psf4w\" (UniqueName: \"kubernetes.io/projected/c9a596c4-2674-4c46-ab00-c8167b950bc9-kube-api-access-psf4w\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.221605 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1805f6b-135f-47d3-829d-e6e5020c9e22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.221663 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.222383 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-utilities\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.222858 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-catalog-content\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.223103 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.723087452 +0000 UTC m=+224.322275283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.223320 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-catalog-content\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.247896 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psf4w\" (UniqueName: \"kubernetes.io/projected/c9a596c4-2674-4c46-ab00-c8167b950bc9-kube-api-access-psf4w\") pod \"community-operators-thb6s\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.324241 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.324435 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-utilities\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.324478 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1805f6b-135f-47d3-829d-e6e5020c9e22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.324557 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtsh2\" (UniqueName: \"kubernetes.io/projected/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-kube-api-access-xtsh2\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.324580 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-catalog-content\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.324654 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1805f6b-135f-47d3-829d-e6e5020c9e22-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.324746 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1805f6b-135f-47d3-829d-e6e5020c9e22-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.325744 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.825706526 +0000 UTC m=+224.424894147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.346636 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1805f6b-135f-47d3-829d-e6e5020c9e22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.362205 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.371988 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.414984 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cq6w"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.426738 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.438202 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.438859 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-utilities\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.439020 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.439122 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtsh2\" (UniqueName: \"kubernetes.io/projected/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-kube-api-access-xtsh2\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.439188 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-catalog-content\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.439475 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-utilities\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.439861 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:40:59.939842863 +0000 UTC m=+224.539030624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.439863 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-catalog-content\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.461242 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtsh2\" (UniqueName: \"kubernetes.io/projected/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-kube-api-access-xtsh2\") pod \"community-operators-8pxb8\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: W0318 15:40:59.466028 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302ccb97_c656_4fa9_926c_8ec5ee3ec25d.slice/crio-9d9141099bf74c12e817343ac3f33a89993dcf16900b2f73af1cb80bab32800a WatchSource:0}: Error finding container 9d9141099bf74c12e817343ac3f33a89993dcf16900b2f73af1cb80bab32800a: Status 404 returned error can't find the container with id 9d9141099bf74c12e817343ac3f33a89993dcf16900b2f73af1cb80bab32800a Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.475700 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rkpb"] Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.513280 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.545458 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.545599 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.045568926 +0000 UTC m=+224.644756547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.546097 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.546731 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.046719999 +0000 UTC m=+224.645907620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.553832 4939 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.656438 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.657408 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.157382186 +0000 UTC m=+224.756569807 (durationBeforeRetry 500ms). 
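The plugin_watcher.go record above is the turning point: the hostpath driver's registration socket has just appeared under /var/lib/kubelet/plugins_registry/, which is how kubelet discovers CSI plugins. A simplified Go sketch of that directory-watch discovery step using fsnotify; the real kubelet then performs a GetInfo/NotifyRegistrationStatus gRPC handshake over the socket, which is omitted here:

    package main

    import (
    	"log"
    	"path/filepath"
    	"strings"

    	"github.com/fsnotify/fsnotify"
    )

    func main() {
    	const registryDir = "/var/lib/kubelet/plugins_registry"

    	w, err := fsnotify.NewWatcher()
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer w.Close()

    	if err := w.Add(registryDir); err != nil {
    		log.Fatal(err)
    	}

    	// Every *-reg.sock that appears is a plugin announcing itself; kubelet
    	// would now dial the socket and run the registration handshake.
    	for {
    		select {
    		case ev := <-w.Events:
    			if ev.Op&fsnotify.Create != 0 && strings.HasSuffix(ev.Name, "-reg.sock") {
    				log.Printf("Adding socket path to desired state cache: %s",
    					filepath.Base(ev.Name))
    			}
    		case err := <-w.Errors:
    			log.Println("watch error:", err)
    		}
    	}
    }

Until that handshake completes, every mount and unmount that needs this driver fails with "not found in the list of registered CSI drivers", as the surrounding records show.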
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.670329 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-thb6s"]
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.760680 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.761097 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.261079491 +0000 UTC m=+224.860267112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.765262 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 15:40:59 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld
Mar 18 15:40:59 crc kubenswrapper[4939]: [+]process-running ok
Mar 18 15:40:59 crc kubenswrapper[4939]: healthz check failed
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.765341 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.812546 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.875692 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.876007 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.375945259 +0000 UTC m=+224.975132880 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.879334 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8pxb8"]
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.879833 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.880767 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.380749538 +0000 UTC m=+224.979937159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:40:59 crc kubenswrapper[4939]: W0318 15:40:59.882551 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2302988d_0f6e_4e8e_ab46_90eac3a8dffa.slice/crio-1cd501be9810ba8c0dca3554518483655815d0fc21b1a20fb8b730631937e06c WatchSource:0}: Error finding container 1cd501be9810ba8c0dca3554518483655815d0fc21b1a20fb8b730631937e06c: Status 404 returned error can't find the container with id 1cd501be9810ba8c0dca3554518483655815d0fc21b1a20fb8b730631937e06c
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.930148 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.943076 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-txf85"
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.982479 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.982796 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.482759964 +0000 UTC m=+225.081947585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:40:59 crc kubenswrapper[4939]: I0318 15:40:59.983163 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:40:59 crc kubenswrapper[4939]: E0318 15:40:59.983558 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.483540107 +0000 UTC m=+225.082727728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.020557 4939 patch_prober.go:28] interesting pod/downloads-7954f5f757-7gmkn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.020635 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7gmkn" podUID="e65663eb-5c53-4a9f-81d7-6356a33dc7b7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.021163 4939 patch_prober.go:28] interesting pod/downloads-7954f5f757-7gmkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.021188 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7gmkn" podUID="e65663eb-5c53-4a9f-81d7-6356a33dc7b7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.084891 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:41:00 crc kubenswrapper[4939]: E0318 15:41:00.085044 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.585019328 +0000 UTC m=+225.184206949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.085345 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:41:00 crc kubenswrapper[4939]: E0318 15:41:00.085958 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.585950155 +0000 UTC m=+225.185137776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.148556 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d184fd-e8e8-43c7-8961-fb9aa266b149" path="/var/lib/kubelet/pods/f8d184fd-e8e8-43c7-8961-fb9aa266b149/volumes"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.157141 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.157229 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.183480 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.186945 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:41:00 crc kubenswrapper[4939]: E0318 15:41:00.187359 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.687343423 +0000 UTC m=+225.286531044 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
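The router probe output above follows the Kubernetes healthz convention: one [+]/[-] line per named sub-check (backend-http, has-synced, process-running), reasons withheld from unauthorized callers, and an aggregate HTTP 500 when any check fails. A hedged Go sketch of a handler emitting that shape; this is illustrative, not the router's actual code:

    package main

    import (
    	"fmt"
    	"log"
    	"net/http"
    )

    // check is one named sub-check of the aggregated /healthz endpoint.
    type check struct {
    	name string
    	run  func() error
    }

    func healthz(checks []check) http.HandlerFunc {
    	return func(w http.ResponseWriter, r *http.Request) {
    		failed := false
    		body := ""
    		for _, c := range checks {
    			if err := c.run(); err != nil {
    				// Mirror the log's "reason withheld" behaviour for
    				// unauthenticated callers.
    				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
    				failed = true
    			} else {
    				body += fmt.Sprintf("[+]%s ok\n", c.name)
    			}
    		}
    		if failed {
    			w.WriteHeader(http.StatusInternalServerError) // the probe sees "statuscode: 500"
    			fmt.Fprint(w, body+"healthz check failed\n")
    			return
    		}
    		fmt.Fprint(w, body+"ok\n")
    	}
    }

    func main() {
    	http.HandleFunc("/healthz", healthz([]check{
    		{"backend-http", func() error { return fmt.Errorf("backends not loaded") }},
    		{"has-synced", func() error { return fmt.Errorf("initial sync pending") }},
    		{"process-running", func() error { return nil }},
    	}))
    	log.Fatal(http.ListenAndServe(":8080", nil))
    }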
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.194845 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxb8" event={"ID":"2302988d-0f6e-4e8e-ab46-90eac3a8dffa","Type":"ContainerStarted","Data":"1cd501be9810ba8c0dca3554518483655815d0fc21b1a20fb8b730631937e06c"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.204119 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" event={"ID":"2adddc7a-6b85-45dc-abf2-611a810581ad","Type":"ContainerStarted","Data":"defea366fec8ac989c592129df94d0d71cf1d169fe148fdc74f58d098ce9dd91"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.206289 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thb6s" event={"ID":"c9a596c4-2674-4c46-ab00-c8167b950bc9","Type":"ContainerStarted","Data":"c30addb0291ddbb3d26f067f476d1b5fee21cc02e8d0cbc60387394b90f76bea"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.208396 4939 generic.go:334] "Generic (PLEG): container finished" podID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerID="1669e2c9dc115753a190004a95c96f00ac872620162a8a91903bd1c8d0217696" exitCode=0
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.208439 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkpb" event={"ID":"9a58c5aa-9e72-41c1-832c-6fba2efc3766","Type":"ContainerDied","Data":"1669e2c9dc115753a190004a95c96f00ac872620162a8a91903bd1c8d0217696"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.208457 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkpb" event={"ID":"9a58c5aa-9e72-41c1-832c-6fba2efc3766","Type":"ContainerStarted","Data":"0164245b72cca98186e2ab94ba39c189a4673d48378c4b8bb7e63f5f28c94476"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.222041 4939 generic.go:334] "Generic (PLEG): container finished" podID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerID="f8b7508186e9d83bbea7c614274b0fae9803d08b92eab35edc2baac9a4771ad6" exitCode=0
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.222123 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cq6w" event={"ID":"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464","Type":"ContainerDied","Data":"f8b7508186e9d83bbea7c614274b0fae9803d08b92eab35edc2baac9a4771ad6"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.222166 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cq6w" event={"ID":"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464","Type":"ContainerStarted","Data":"b55a5310b7b6c708a39b2804933f27cd4e133cdc8fc42b35d0d09021ea05e894"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.234652 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1805f6b-135f-47d3-829d-e6e5020c9e22","Type":"ContainerStarted","Data":"e13eb82268862284f7b52a1a97d350b17ade06338097492ba553190fa277510f"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.250779 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" event={"ID":"302ccb97-c656-4fa9-926c-8ec5ee3ec25d","Type":"ContainerStarted","Data":"5b94279c1b072b1ddfa8439fb117c7ce8462cc6a090c605522fd40f28673012d"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.250846 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" event={"ID":"302ccb97-c656-4fa9-926c-8ec5ee3ec25d","Type":"ContainerStarted","Data":"9d9141099bf74c12e817343ac3f33a89993dcf16900b2f73af1cb80bab32800a"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.252280 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.272150 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" event={"ID":"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b","Type":"ContainerStarted","Data":"06efce6fa9c64582d1685608722d669f0db52f0784cdc54e5feaad15018d242c"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.272188 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" event={"ID":"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b","Type":"ContainerStarted","Data":"8ee3bb25dea55406478c646fcad21d4cc0da985003faa716473d21706982ab17"}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.272203 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.304717 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:41:00 crc kubenswrapper[4939]: E0318 15:41:00.305759 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:41:00.805743504 +0000 UTC m=+225.404931125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t62m5" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.316108 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.316533 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dbrdq"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.367629 4939 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T15:40:59.553857496Z","Handler":null,"Name":""}
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.382752 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nqdwc"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.383156 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nqdwc"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.384311 4939 patch_prober.go:28] interesting pod/console-f9d7485db-nqdwc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.384348 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nqdwc" podUID="7a2bfef5-fef2-4e27-9749-53ea69f13c0f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.391956 4939 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.392000 4939 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.413959 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" podStartSLOduration=4.413939689 podStartE2EDuration="4.413939689s" podCreationTimestamp="2026-03-18 15:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:00.375925607 +0000 UTC m=+224.975113228" watchObservedRunningTime="2026-03-18 15:41:00.413939689 +0000 UTC m=+225.013127310"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.415472 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.431008 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.451112 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.517975 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.520393 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" podStartSLOduration=4.520377723 podStartE2EDuration="4.520377723s" podCreationTimestamp="2026-03-18 15:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:00.518943322 +0000 UTC m=+225.118130943" watchObservedRunningTime="2026-03-18 15:41:00.520377723 +0000 UTC m=+225.119565344"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.529144 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.529195 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.553816 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5"
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.591965 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2qrxf"]
Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.609873 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qrxf"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.610929 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qrxf"] Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.613068 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.649174 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t62m5\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.725125 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-utilities\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.725261 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hv7v\" (UniqueName: \"kubernetes.io/projected/18db329a-84bc-4bb2-94a4-00053cc542e7-kube-api-access-6hv7v\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.725294 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-catalog-content\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.734465 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.759260 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.768716 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:41:00 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:41:00 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:41:00 crc kubenswrapper[4939]: healthz check failed Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.768844 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.826785 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hv7v\" (UniqueName: \"kubernetes.io/projected/18db329a-84bc-4bb2-94a4-00053cc542e7-kube-api-access-6hv7v\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.826866 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-catalog-content\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.826942 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-utilities\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.827469 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-catalog-content\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.827566 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-utilities\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.864309 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.881466 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hv7v\" (UniqueName: \"kubernetes.io/projected/18db329a-84bc-4bb2-94a4-00053cc542e7-kube-api-access-6hv7v\") pod \"redhat-marketplace-2qrxf\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.927825 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kubelet-dir\") pod \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.927924 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kube-api-access\") pod \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\" (UID: \"fa2cde87-c2c4-409a-a823-9884cd51ea9a\") " Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.929386 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fa2cde87-c2c4-409a-a823-9884cd51ea9a" (UID: "fa2cde87-c2c4-409a-a823-9884cd51ea9a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.936800 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fa2cde87-c2c4-409a-a823-9884cd51ea9a" (UID: "fa2cde87-c2c4-409a-a823-9884cd51ea9a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.948810 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.968066 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kmh9v"] Mar 18 15:41:00 crc kubenswrapper[4939]: E0318 15:41:00.968267 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2cde87-c2c4-409a-a823-9884cd51ea9a" containerName="pruner" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.968279 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2cde87-c2c4-409a-a823-9884cd51ea9a" containerName="pruner" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.968399 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2cde87-c2c4-409a-a823-9884cd51ea9a" containerName="pruner" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.969164 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:00 crc kubenswrapper[4939]: I0318 15:41:00.993800 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmh9v"] Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.029496 4939 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.029542 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa2cde87-c2c4-409a-a823-9884cd51ea9a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.131061 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-catalog-content\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.131191 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-utilities\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.131267 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n275g\" (UniqueName: \"kubernetes.io/projected/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-kube-api-access-n275g\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.233438 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-utilities\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.233946 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n275g\" (UniqueName: \"kubernetes.io/projected/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-kube-api-access-n275g\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.234035 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-catalog-content\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.234679 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-catalog-content\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " 
pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.234923 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-utilities\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.263546 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n275g\" (UniqueName: \"kubernetes.io/projected/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-kube-api-access-n275g\") pod \"redhat-marketplace-kmh9v\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.288835 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.291610 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa2cde87-c2c4-409a-a823-9884cd51ea9a","Type":"ContainerDied","Data":"a970366022b027faa88c09a68f524b29e000499fa4e6dbbcf3603bc93e6ceb6d"} Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.291645 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a970366022b027faa88c09a68f524b29e000499fa4e6dbbcf3603bc93e6ceb6d" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.291712 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.301706 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1805f6b-135f-47d3-829d-e6e5020c9e22","Type":"ContainerStarted","Data":"09facbb84359b43748f3c36f064112eb618e04623f24df533d17708b6110af69"} Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.303833 4939 generic.go:334] "Generic (PLEG): container finished" podID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerID="0a8b368c37f3c983ed89744cf6a2d54905184ccea2fac283c404c8e2de77e104" exitCode=0 Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.303876 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxb8" event={"ID":"2302988d-0f6e-4e8e-ab46-90eac3a8dffa","Type":"ContainerDied","Data":"0a8b368c37f3c983ed89744cf6a2d54905184ccea2fac283c404c8e2de77e104"} Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.309663 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" event={"ID":"2adddc7a-6b85-45dc-abf2-611a810581ad","Type":"ContainerStarted","Data":"3bb5549d4e834816fba1c078efc16d0da76dbd0732e6078aeaa539404674e2c7"} Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.319267 4939 generic.go:334] "Generic (PLEG): container finished" podID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerID="cfd232f01b3b09566807d15d6078096dbdd5837c9630b7a2e14a3bb726912579" exitCode=0 Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.319562 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thb6s" 
event={"ID":"c9a596c4-2674-4c46-ab00-c8167b950bc9","Type":"ContainerDied","Data":"cfd232f01b3b09566807d15d6078096dbdd5837c9630b7a2e14a3bb726912579"} Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.321745 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t62m5"] Mar 18 15:41:01 crc kubenswrapper[4939]: W0318 15:41:01.364557 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0048288d_ec58_4cf8_a68a_b73b98db9d01.slice/crio-9bfef898c511e896be4628e78b58fe79b36a8faaf782f80593b690a2d69e8d67 WatchSource:0}: Error finding container 9bfef898c511e896be4628e78b58fe79b36a8faaf782f80593b690a2d69e8d67: Status 404 returned error can't find the container with id 9bfef898c511e896be4628e78b58fe79b36a8faaf782f80593b690a2d69e8d67 Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.370081 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.370066865 podStartE2EDuration="2.370066865s" podCreationTimestamp="2026-03-18 15:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:01.332047783 +0000 UTC m=+225.931235394" watchObservedRunningTime="2026-03-18 15:41:01.370066865 +0000 UTC m=+225.969254486" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.395118 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qrxf"] Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.452697 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5fdwn" podStartSLOduration=14.452674548 podStartE2EDuration="14.452674548s" podCreationTimestamp="2026-03-18 15:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:01.44859413 +0000 UTC m=+226.047781741" watchObservedRunningTime="2026-03-18 15:41:01.452674548 +0000 UTC m=+226.051862169" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.775541 4939 patch_prober.go:28] interesting pod/router-default-5444994796-9qhf4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:41:01 crc kubenswrapper[4939]: [-]has-synced failed: reason withheld Mar 18 15:41:01 crc kubenswrapper[4939]: [+]process-running ok Mar 18 15:41:01 crc kubenswrapper[4939]: healthz check failed Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.776429 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9qhf4" podUID="665daba4-aaad-4285-9bfb-c58983b35d2a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.793272 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ldms8"] Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.802235 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.809954 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.813958 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldms8"] Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.888222 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmh9v"] Mar 18 15:41:01 crc kubenswrapper[4939]: W0318 15:41:01.934790 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c05afb5_9509_4e28_b0f8_ffe522fd31d3.slice/crio-83cc5ae52dae717233192a180c6865837fe45458a00e642bd782ace60d7c70c2 WatchSource:0}: Error finding container 83cc5ae52dae717233192a180c6865837fe45458a00e642bd782ace60d7c70c2: Status 404 returned error can't find the container with id 83cc5ae52dae717233192a180c6865837fe45458a00e642bd782ace60d7c70c2 Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.949898 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cktwx\" (UniqueName: \"kubernetes.io/projected/e48cb944-8d0d-4169-aa48-947c2654df5a-kube-api-access-cktwx\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.950020 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-utilities\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:01 crc kubenswrapper[4939]: I0318 15:41:01.950051 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-catalog-content\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.051781 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-utilities\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.052100 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-catalog-content\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.052199 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cktwx\" (UniqueName: \"kubernetes.io/projected/e48cb944-8d0d-4169-aa48-947c2654df5a-kube-api-access-cktwx\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 
18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.052662 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-utilities\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.052885 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-catalog-content\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.084776 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cktwx\" (UniqueName: \"kubernetes.io/projected/e48cb944-8d0d-4169-aa48-947c2654df5a-kube-api-access-cktwx\") pod \"redhat-operators-ldms8\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.139999 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.144993 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.169615 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7sxvx"] Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.171735 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.187897 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sxvx"] Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.302705 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-49228" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.340823 4939 generic.go:334] "Generic (PLEG): container finished" podID="a1805f6b-135f-47d3-829d-e6e5020c9e22" containerID="09facbb84359b43748f3c36f064112eb618e04623f24df533d17708b6110af69" exitCode=0 Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.340893 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1805f6b-135f-47d3-829d-e6e5020c9e22","Type":"ContainerDied","Data":"09facbb84359b43748f3c36f064112eb618e04623f24df533d17708b6110af69"} Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.355421 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" event={"ID":"0048288d-ec58-4cf8-a68a-b73b98db9d01","Type":"ContainerStarted","Data":"c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827"} Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.355493 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" event={"ID":"0048288d-ec58-4cf8-a68a-b73b98db9d01","Type":"ContainerStarted","Data":"9bfef898c511e896be4628e78b58fe79b36a8faaf782f80593b690a2d69e8d67"} Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.356333 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.361458 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-catalog-content\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.363270 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-utilities\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.363365 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqt9l\" (UniqueName: \"kubernetes.io/projected/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-kube-api-access-nqt9l\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.376012 4939 generic.go:334] "Generic (PLEG): container finished" podID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerID="7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da" exitCode=0 Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.376169 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmh9v" 
event={"ID":"4c05afb5-9509-4e28-b0f8-ffe522fd31d3","Type":"ContainerDied","Data":"7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da"} Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.376227 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmh9v" event={"ID":"4c05afb5-9509-4e28-b0f8-ffe522fd31d3","Type":"ContainerStarted","Data":"83cc5ae52dae717233192a180c6865837fe45458a00e642bd782ace60d7c70c2"} Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.404251 4939 generic.go:334] "Generic (PLEG): container finished" podID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerID="0e781de8f192733cf4a8681bb266763ea069074c3740d97245f4f629b3b003ba" exitCode=0 Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.404793 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qrxf" event={"ID":"18db329a-84bc-4bb2-94a4-00053cc542e7","Type":"ContainerDied","Data":"0e781de8f192733cf4a8681bb266763ea069074c3740d97245f4f629b3b003ba"} Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.404875 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qrxf" event={"ID":"18db329a-84bc-4bb2-94a4-00053cc542e7","Type":"ContainerStarted","Data":"3592f60084dc912f5e9d91de141ec648ae05360d9f9fdcfea7caf0debf6c0b33"} Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.415495 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" podStartSLOduration=166.415470848 podStartE2EDuration="2m46.415470848s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:02.406104066 +0000 UTC m=+227.005291707" watchObservedRunningTime="2026-03-18 15:41:02.415470848 +0000 UTC m=+227.014658469" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.470476 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-catalog-content\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.470601 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-utilities\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.470686 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqt9l\" (UniqueName: \"kubernetes.io/projected/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-kube-api-access-nqt9l\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.471118 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-catalog-content\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc 
kubenswrapper[4939]: I0318 15:41:02.479518 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-utilities\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.535274 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqt9l\" (UniqueName: \"kubernetes.io/projected/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-kube-api-access-nqt9l\") pod \"redhat-operators-7sxvx\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.593558 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldms8"] Mar 18 15:41:02 crc kubenswrapper[4939]: W0318 15:41:02.621265 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48cb944_8d0d_4169_aa48_947c2654df5a.slice/crio-4a5b80c8de5a70f4ae3f18b2984b7a9c48be07636a76e19472624ad5dd969d9c WatchSource:0}: Error finding container 4a5b80c8de5a70f4ae3f18b2984b7a9c48be07636a76e19472624ad5dd969d9c: Status 404 returned error can't find the container with id 4a5b80c8de5a70f4ae3f18b2984b7a9c48be07636a76e19472624ad5dd969d9c Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.764332 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.768651 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9qhf4" Mar 18 15:41:02 crc kubenswrapper[4939]: I0318 15:41:02.829377 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:03 crc kubenswrapper[4939]: I0318 15:41:03.317739 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sxvx"] Mar 18 15:41:03 crc kubenswrapper[4939]: W0318 15:41:03.338890 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68ecdb5_45f3_4f17_8258_16c252e1cd7f.slice/crio-6daf1abbb9b04eea4701672ada3f243757a8526529510f1edca030b01217adf0 WatchSource:0}: Error finding container 6daf1abbb9b04eea4701672ada3f243757a8526529510f1edca030b01217adf0: Status 404 returned error can't find the container with id 6daf1abbb9b04eea4701672ada3f243757a8526529510f1edca030b01217adf0 Mar 18 15:41:03 crc kubenswrapper[4939]: I0318 15:41:03.411544 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sxvx" event={"ID":"c68ecdb5-45f3-4f17-8258-16c252e1cd7f","Type":"ContainerStarted","Data":"6daf1abbb9b04eea4701672ada3f243757a8526529510f1edca030b01217adf0"} Mar 18 15:41:03 crc kubenswrapper[4939]: I0318 15:41:03.414174 4939 generic.go:334] "Generic (PLEG): container finished" podID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerID="0f5377b1391c25f9593e81559613cd1811d1776d73020ac9ae33e99f0154a19e" exitCode=0 Mar 18 15:41:03 crc kubenswrapper[4939]: I0318 15:41:03.414219 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldms8" event={"ID":"e48cb944-8d0d-4169-aa48-947c2654df5a","Type":"ContainerDied","Data":"0f5377b1391c25f9593e81559613cd1811d1776d73020ac9ae33e99f0154a19e"} Mar 18 15:41:03 crc kubenswrapper[4939]: I0318 15:41:03.414372 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldms8" event={"ID":"e48cb944-8d0d-4169-aa48-947c2654df5a","Type":"ContainerStarted","Data":"4a5b80c8de5a70f4ae3f18b2984b7a9c48be07636a76e19472624ad5dd969d9c"} Mar 18 15:41:03 crc kubenswrapper[4939]: I0318 15:41:03.741706 4939 ???:1] "http: TLS handshake error from 192.168.126.11:33188: no serving certificate available for the kubelet" Mar 18 15:41:03 crc kubenswrapper[4939]: I0318 15:41:03.864868 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.019278 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1805f6b-135f-47d3-829d-e6e5020c9e22-kubelet-dir\") pod \"a1805f6b-135f-47d3-829d-e6e5020c9e22\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.019892 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1805f6b-135f-47d3-829d-e6e5020c9e22-kube-api-access\") pod \"a1805f6b-135f-47d3-829d-e6e5020c9e22\" (UID: \"a1805f6b-135f-47d3-829d-e6e5020c9e22\") " Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.019396 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1805f6b-135f-47d3-829d-e6e5020c9e22-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a1805f6b-135f-47d3-829d-e6e5020c9e22" (UID: "a1805f6b-135f-47d3-829d-e6e5020c9e22"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.043549 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1805f6b-135f-47d3-829d-e6e5020c9e22-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a1805f6b-135f-47d3-829d-e6e5020c9e22" (UID: "a1805f6b-135f-47d3-829d-e6e5020c9e22"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.067919 4939 ???:1] "http: TLS handshake error from 192.168.126.11:33192: no serving certificate available for the kubelet" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.124016 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1805f6b-135f-47d3-829d-e6e5020c9e22-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.124059 4939 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1805f6b-135f-47d3-829d-e6e5020c9e22-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.429936 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1805f6b-135f-47d3-829d-e6e5020c9e22","Type":"ContainerDied","Data":"e13eb82268862284f7b52a1a97d350b17ade06338097492ba553190fa277510f"} Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.429970 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13eb82268862284f7b52a1a97d350b17ade06338097492ba553190fa277510f" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.430032 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.457315 4939 generic.go:334] "Generic (PLEG): container finished" podID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerID="2ebfff8168ac5efccf638f36a00ef0d03e2001455ccd58c56cdf095d7b153bb3" exitCode=0 Mar 18 15:41:04 crc kubenswrapper[4939]: I0318 15:41:04.457439 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sxvx" event={"ID":"c68ecdb5-45f3-4f17-8258-16c252e1cd7f","Type":"ContainerDied","Data":"2ebfff8168ac5efccf638f36a00ef0d03e2001455ccd58c56cdf095d7b153bb3"} Mar 18 15:41:10 crc kubenswrapper[4939]: I0318 15:41:10.019409 4939 patch_prober.go:28] interesting pod/downloads-7954f5f757-7gmkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 18 15:41:10 crc kubenswrapper[4939]: I0318 15:41:10.020381 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7gmkn" podUID="e65663eb-5c53-4a9f-81d7-6356a33dc7b7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 18 15:41:10 crc kubenswrapper[4939]: I0318 15:41:10.019484 4939 patch_prober.go:28] interesting pod/downloads-7954f5f757-7gmkn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 18 15:41:10 crc kubenswrapper[4939]: I0318 15:41:10.020465 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7gmkn" podUID="e65663eb-5c53-4a9f-81d7-6356a33dc7b7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.34:8080/\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 18 15:41:10 crc kubenswrapper[4939]: I0318 15:41:10.388890 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:41:10 crc kubenswrapper[4939]: I0318 15:41:10.392600 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:41:15 crc kubenswrapper[4939]: I0318 15:41:15.814807 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"] Mar 18 15:41:15 crc kubenswrapper[4939]: I0318 15:41:15.815327 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" podUID="7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" containerName="controller-manager" containerID="cri-o://06efce6fa9c64582d1685608722d669f0db52f0784cdc54e5feaad15018d242c" gracePeriod=30 Mar 18 15:41:15 crc kubenswrapper[4939]: I0318 15:41:15.835103 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"] Mar 18 15:41:15 crc kubenswrapper[4939]: I0318 15:41:15.835356 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" podUID="302ccb97-c656-4fa9-926c-8ec5ee3ec25d" containerName="route-controller-manager" 
containerID="cri-o://5b94279c1b072b1ddfa8439fb117c7ce8462cc6a090c605522fd40f28673012d" gracePeriod=30 Mar 18 15:41:16 crc kubenswrapper[4939]: I0318 15:41:16.218597 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:41:16 crc kubenswrapper[4939]: I0318 15:41:16.220646 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 15:41:16 crc kubenswrapper[4939]: I0318 15:41:16.238155 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4df63d3d-7b3a-46ad-a343-a25e1986fb5e-metrics-certs\") pod \"network-metrics-daemon-zxrzw\" (UID: \"4df63d3d-7b3a-46ad-a343-a25e1986fb5e\") " pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:41:16 crc kubenswrapper[4939]: I0318 15:41:16.259427 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 15:41:16 crc kubenswrapper[4939]: I0318 15:41:16.267391 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zxrzw" Mar 18 15:41:17 crc kubenswrapper[4939]: I0318 15:41:17.571849 4939 generic.go:334] "Generic (PLEG): container finished" podID="7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" containerID="06efce6fa9c64582d1685608722d669f0db52f0784cdc54e5feaad15018d242c" exitCode=0 Mar 18 15:41:17 crc kubenswrapper[4939]: I0318 15:41:17.571932 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" event={"ID":"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b","Type":"ContainerDied","Data":"06efce6fa9c64582d1685608722d669f0db52f0784cdc54e5feaad15018d242c"} Mar 18 15:41:18 crc kubenswrapper[4939]: I0318 15:41:18.787138 4939 patch_prober.go:28] interesting pod/controller-manager-6c5bffdf46-psbxd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Mar 18 15:41:18 crc kubenswrapper[4939]: I0318 15:41:18.787584 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" podUID="7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 18 15:41:18 crc kubenswrapper[4939]: I0318 15:41:18.807182 4939 patch_prober.go:28] interesting pod/route-controller-manager-78bd474b69-mjlgb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Mar 18 15:41:18 crc kubenswrapper[4939]: I0318 15:41:18.807285 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" podUID="302ccb97-c656-4fa9-926c-8ec5ee3ec25d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: 
connect: connection refused" Mar 18 15:41:19 crc kubenswrapper[4939]: I0318 15:41:19.583958 4939 generic.go:334] "Generic (PLEG): container finished" podID="302ccb97-c656-4fa9-926c-8ec5ee3ec25d" containerID="5b94279c1b072b1ddfa8439fb117c7ce8462cc6a090c605522fd40f28673012d" exitCode=0 Mar 18 15:41:19 crc kubenswrapper[4939]: I0318 15:41:19.584005 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" event={"ID":"302ccb97-c656-4fa9-926c-8ec5ee3ec25d","Type":"ContainerDied","Data":"5b94279c1b072b1ddfa8439fb117c7ce8462cc6a090c605522fd40f28673012d"} Mar 18 15:41:20 crc kubenswrapper[4939]: I0318 15:41:20.027664 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7gmkn" Mar 18 15:41:20 crc kubenswrapper[4939]: I0318 15:41:20.870920 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:41:22 crc kubenswrapper[4939]: E0318 15:41:22.352034 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 15:41:22 crc kubenswrapper[4939]: E0318 15:41:22.352718 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n275g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kmh9v_openshift-marketplace(4c05afb5-9509-4e28-b0f8-ffe522fd31d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get 
\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\": context canceled" logger="UnhandledError" Mar 18 15:41:22 crc kubenswrapper[4939]: E0318 15:41:22.353956 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-kmh9v" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" Mar 18 15:41:23 crc kubenswrapper[4939]: I0318 15:41:23.687372 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:41:23 crc kubenswrapper[4939]: I0318 15:41:23.687865 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:41:24 crc kubenswrapper[4939]: I0318 15:41:24.567950 4939 ???:1] "http: TLS handshake error from 192.168.126.11:34528: no serving certificate available for the kubelet" Mar 18 15:41:26 crc kubenswrapper[4939]: E0318 15:41:26.236525 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kmh9v" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" Mar 18 15:41:27 crc kubenswrapper[4939]: E0318 15:41:27.291702 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 15:41:27 crc kubenswrapper[4939]: E0318 15:41:27.293618 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:41:27 crc kubenswrapper[4939]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 15:41:27 crc kubenswrapper[4939]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9tbr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564140-vqqr2_openshift-infra(330c585e-3a67-4502-b800-7401df959334): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 15:41:27 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 15:41:27 crc kubenswrapper[4939]: E0318 15:41:27.294793 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" podUID="330c585e-3a67-4502-b800-7401df959334" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.317181 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.320328 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.360534 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76886d6c6c-gnrr6"] Mar 18 15:41:27 crc kubenswrapper[4939]: E0318 15:41:27.360781 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302ccb97-c656-4fa9-926c-8ec5ee3ec25d" containerName="route-controller-manager" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.360791 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="302ccb97-c656-4fa9-926c-8ec5ee3ec25d" containerName="route-controller-manager" Mar 18 15:41:27 crc kubenswrapper[4939]: E0318 15:41:27.360804 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" containerName="controller-manager" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.360810 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" containerName="controller-manager" Mar 18 15:41:27 crc kubenswrapper[4939]: E0318 15:41:27.360817 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1805f6b-135f-47d3-829d-e6e5020c9e22" containerName="pruner" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.360823 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1805f6b-135f-47d3-829d-e6e5020c9e22" containerName="pruner" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.360917 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" containerName="controller-manager" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.360928 4939 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="302ccb97-c656-4fa9-926c-8ec5ee3ec25d" containerName="route-controller-manager" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.360941 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1805f6b-135f-47d3-829d-e6e5020c9e22" containerName="pruner" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.361343 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.367144 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76886d6c6c-gnrr6"] Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412717 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-proxy-ca-bundles\") pod \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412775 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv4ps\" (UniqueName: \"kubernetes.io/projected/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-kube-api-access-qv4ps\") pod \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412802 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-config\") pod \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412837 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-serving-cert\") pod \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412898 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-client-ca\") pod \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412947 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvml9\" (UniqueName: \"kubernetes.io/projected/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-kube-api-access-xvml9\") pod \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412966 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-serving-cert\") pod \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.412981 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-client-ca\") pod \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\" (UID: \"302ccb97-c656-4fa9-926c-8ec5ee3ec25d\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 
15:41:27.413010 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-config\") pod \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\" (UID: \"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b\") " Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.413133 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-client-ca\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.413171 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-proxy-ca-bundles\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.413214 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47260e49-ff74-42b7-a362-4952f3c9c995-serving-cert\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.413266 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-config\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.413293 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8nqw\" (UniqueName: \"kubernetes.io/projected/47260e49-ff74-42b7-a362-4952f3c9c995-kube-api-access-b8nqw\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.414167 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-client-ca" (OuterVolumeSpecName: "client-ca") pod "302ccb97-c656-4fa9-926c-8ec5ee3ec25d" (UID: "302ccb97-c656-4fa9-926c-8ec5ee3ec25d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.414204 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-config" (OuterVolumeSpecName: "config") pod "302ccb97-c656-4fa9-926c-8ec5ee3ec25d" (UID: "302ccb97-c656-4fa9-926c-8ec5ee3ec25d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.414198 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" (UID: "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.414735 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-config" (OuterVolumeSpecName: "config") pod "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" (UID: "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.415560 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" (UID: "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.419005 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-kube-api-access-xvml9" (OuterVolumeSpecName: "kube-api-access-xvml9") pod "302ccb97-c656-4fa9-926c-8ec5ee3ec25d" (UID: "302ccb97-c656-4fa9-926c-8ec5ee3ec25d"). InnerVolumeSpecName "kube-api-access-xvml9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.419741 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" (UID: "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.419748 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "302ccb97-c656-4fa9-926c-8ec5ee3ec25d" (UID: "302ccb97-c656-4fa9-926c-8ec5ee3ec25d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.426758 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-kube-api-access-qv4ps" (OuterVolumeSpecName: "kube-api-access-qv4ps") pod "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" (UID: "7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b"). InnerVolumeSpecName "kube-api-access-qv4ps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514315 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-config\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514661 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8nqw\" (UniqueName: \"kubernetes.io/projected/47260e49-ff74-42b7-a362-4952f3c9c995-kube-api-access-b8nqw\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514699 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-client-ca\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514739 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-proxy-ca-bundles\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514772 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47260e49-ff74-42b7-a362-4952f3c9c995-serving-cert\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514828 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvml9\" (UniqueName: \"kubernetes.io/projected/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-kube-api-access-xvml9\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514838 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514847 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514856 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514864 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514873 4939 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv4ps\" (UniqueName: \"kubernetes.io/projected/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-kube-api-access-qv4ps\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514883 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514892 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/302ccb97-c656-4fa9-926c-8ec5ee3ec25d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.514902 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.515894 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-client-ca\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.516080 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-config\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.517255 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-proxy-ca-bundles\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.519667 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47260e49-ff74-42b7-a362-4952f3c9c995-serving-cert\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.530711 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8nqw\" (UniqueName: \"kubernetes.io/projected/47260e49-ff74-42b7-a362-4952f3c9c995-kube-api-access-b8nqw\") pod \"controller-manager-76886d6c6c-gnrr6\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.684073 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.903233 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" event={"ID":"302ccb97-c656-4fa9-926c-8ec5ee3ec25d","Type":"ContainerDied","Data":"9d9141099bf74c12e817343ac3f33a89993dcf16900b2f73af1cb80bab32800a"} Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.903561 4939 scope.go:117] "RemoveContainer" containerID="5b94279c1b072b1ddfa8439fb117c7ce8462cc6a090c605522fd40f28673012d" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.903494 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.913397 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" event={"ID":"7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b","Type":"ContainerDied","Data":"8ee3bb25dea55406478c646fcad21d4cc0da985003faa716473d21706982ab17"} Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.913434 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5bffdf46-psbxd" Mar 18 15:41:27 crc kubenswrapper[4939]: E0318 15:41:27.914526 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" podUID="330c585e-3a67-4502-b800-7401df959334" Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.947535 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"] Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.951837 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bd474b69-mjlgb"] Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.954985 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"] Mar 18 15:41:27 crc kubenswrapper[4939]: I0318 15:41:27.957838 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c5bffdf46-psbxd"] Mar 18 15:41:28 crc kubenswrapper[4939]: I0318 15:41:28.138696 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302ccb97-c656-4fa9-926c-8ec5ee3ec25d" path="/var/lib/kubelet/pods/302ccb97-c656-4fa9-926c-8ec5ee3ec25d/volumes" Mar 18 15:41:28 crc kubenswrapper[4939]: I0318 15:41:28.139624 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b" path="/var/lib/kubelet/pods/7ac2f4b3-fcd6-4ee1-8a70-1d893dfe380b/volumes" Mar 18 15:41:28 crc kubenswrapper[4939]: E0318 15:41:28.961298 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 15:41:28 crc kubenswrapper[4939]: E0318 15:41:28.961902 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psf4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-thb6s_openshift-marketplace(c9a596c4-2674-4c46-ab00-c8167b950bc9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:41:28 crc kubenswrapper[4939]: E0318 15:41:28.963235 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-thb6s" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.463702 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2"] Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.464613 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.466908 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.466947 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.466966 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.467020 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.467236 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.467427 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.482010 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2"] Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.543692 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-config\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.543790 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-serving-cert\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.543843 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5pc\" (UniqueName: \"kubernetes.io/projected/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-kube-api-access-hc5pc\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.543918 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-client-ca\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.645254 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-config\") pod 
\"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.645376 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-serving-cert\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.645443 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5pc\" (UniqueName: \"kubernetes.io/projected/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-kube-api-access-hc5pc\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.645707 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-client-ca\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.646800 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-config\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.647375 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-client-ca\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.651206 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-serving-cert\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.663589 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5pc\" (UniqueName: \"kubernetes.io/projected/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-kube-api-access-hc5pc\") pod \"route-controller-manager-6cdcb97f6f-rb5d2\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:29 crc kubenswrapper[4939]: I0318 15:41:29.798877 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:30 crc kubenswrapper[4939]: E0318 15:41:30.619597 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-thb6s" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" Mar 18 15:41:30 crc kubenswrapper[4939]: I0318 15:41:30.704841 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2chlb" Mar 18 15:41:30 crc kubenswrapper[4939]: E0318 15:41:30.736398 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 15:41:30 crc kubenswrapper[4939]: E0318 15:41:30.736595 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vnps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4cq6w_openshift-marketplace(5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:41:30 crc kubenswrapper[4939]: E0318 15:41:30.738090 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4cq6w" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.490820 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:41:32 crc 
kubenswrapper[4939]: I0318 15:41:32.491788 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.495876 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.496137 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.496812 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.581287 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.581339 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.682751 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.682827 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.682917 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.700818 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:32 crc kubenswrapper[4939]: I0318 15:41:32.819476 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:35 crc kubenswrapper[4939]: E0318 15:41:35.294710 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4cq6w" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" Mar 18 15:41:35 crc kubenswrapper[4939]: I0318 15:41:35.790796 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76886d6c6c-gnrr6"] Mar 18 15:41:35 crc kubenswrapper[4939]: I0318 15:41:35.891259 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2"] Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.087322 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.088098 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.098281 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.140530 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4414f1c6-6e94-4dc2-80df-af1f546ae085-kube-api-access\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.140574 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-var-lock\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.140653 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: E0318 15:41:37.140798 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 15:41:37 crc kubenswrapper[4939]: E0318 15:41:37.140897 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6hv7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2qrxf_openshift-marketplace(18db329a-84bc-4bb2-94a4-00053cc542e7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:41:37 crc kubenswrapper[4939]: E0318 15:41:37.142770 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2qrxf" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.208833 4939 scope.go:117] "RemoveContainer" containerID="06efce6fa9c64582d1685608722d669f0db52f0784cdc54e5feaad15018d242c" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.245706 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4414f1c6-6e94-4dc2-80df-af1f546ae085-kube-api-access\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.245744 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-var-lock\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.245825 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.245905 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.245991 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-var-lock\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.269191 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4414f1c6-6e94-4dc2-80df-af1f546ae085-kube-api-access\") pod \"installer-9-crc\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.413397 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.565361 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.599717 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zxrzw"] Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.669446 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.707418 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2"] Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.709488 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76886d6c6c-gnrr6"] Mar 18 15:41:37 crc kubenswrapper[4939]: E0318 15:41:37.902994 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a58c5aa_9e72_41c1_832c_6fba2efc3766.slice/crio-conmon-7937a5b83042b5018dab854d1c8484148696c56a6d516ddd9453a7bd3d6958b3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a58c5aa_9e72_41c1_832c_6fba2efc3766.slice/crio-7937a5b83042b5018dab854d1c8484148696c56a6d516ddd9453a7bd3d6958b3.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.976689 4939 generic.go:334] "Generic (PLEG): container finished" podID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerID="7937a5b83042b5018dab854d1c8484148696c56a6d516ddd9453a7bd3d6958b3" exitCode=0 Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.976743 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkpb" event={"ID":"9a58c5aa-9e72-41c1-832c-6fba2efc3766","Type":"ContainerDied","Data":"7937a5b83042b5018dab854d1c8484148696c56a6d516ddd9453a7bd3d6958b3"} Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.983263 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" event={"ID":"4df63d3d-7b3a-46ad-a343-a25e1986fb5e","Type":"ContainerStarted","Data":"3ebe634b995ef822700ea0f0014ebef4de5cdeee25892e4c37cf5fff1025c2fa"} Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.984701 4939 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aefa97a-14cb-4bfa-87b2-fdb4e367c272","Type":"ContainerStarted","Data":"03ef56f6405e3e18f07bf735265123eed6ee294b436fae44d96325c3d896f247"} Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.985726 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" event={"ID":"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c","Type":"ContainerStarted","Data":"bfdee32ee70844c15b8e742ca301fc3721e2bdc17d7df8666c27668aaa1ff613"} Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.988320 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" event={"ID":"47260e49-ff74-42b7-a362-4952f3c9c995","Type":"ContainerStarted","Data":"044d0818f94640895600751b0a84cc7e6ce8c231addea5ba5a0f8096698d4eb3"} Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.991153 4939 generic.go:334] "Generic (PLEG): container finished" podID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerID="b834afdf7bab66e161f134c90f94249c2aa141bfc0841a1ee2ff26ae73ab5027" exitCode=0 Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.991258 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxb8" event={"ID":"2302988d-0f6e-4e8e-ab46-90eac3a8dffa","Type":"ContainerDied","Data":"b834afdf7bab66e161f134c90f94249c2aa141bfc0841a1ee2ff26ae73ab5027"} Mar 18 15:41:37 crc kubenswrapper[4939]: I0318 15:41:37.998942 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4414f1c6-6e94-4dc2-80df-af1f546ae085","Type":"ContainerStarted","Data":"cb5ada438014ebae8bd290c6edcf171ae7a6a7aaed5b71b531e8cfe530e7ac34"} Mar 18 15:41:38 crc kubenswrapper[4939]: I0318 15:41:38.000919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sxvx" event={"ID":"c68ecdb5-45f3-4f17-8258-16c252e1cd7f","Type":"ContainerStarted","Data":"9878991fdcab666fcf7f68eb87090b92a271fd8539d67282d1342ebbbf5413f1"} Mar 18 15:41:38 crc kubenswrapper[4939]: I0318 15:41:38.004246 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldms8" event={"ID":"e48cb944-8d0d-4169-aa48-947c2654df5a","Type":"ContainerStarted","Data":"b4acbe31a06ea3f4279b05cf8d7be6cf36033b487140c804ab4bbf2b73e435d0"} Mar 18 15:41:38 crc kubenswrapper[4939]: E0318 15:41:38.007989 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2qrxf" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.011139 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4414f1c6-6e94-4dc2-80df-af1f546ae085","Type":"ContainerStarted","Data":"4126abbef5d1087b8a884958abbfe2c215ce88d6be49fc3d3ffbdcd40553e98f"} Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.013192 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" event={"ID":"47260e49-ff74-42b7-a362-4952f3c9c995","Type":"ContainerStarted","Data":"ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5"} Mar 18 15:41:39 crc 
kubenswrapper[4939]: I0318 15:41:39.013313 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" podUID="47260e49-ff74-42b7-a362-4952f3c9c995" containerName="controller-manager" containerID="cri-o://ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5" gracePeriod=30 Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.013371 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.018266 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.020038 4939 generic.go:334] "Generic (PLEG): container finished" podID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerID="9878991fdcab666fcf7f68eb87090b92a271fd8539d67282d1342ebbbf5413f1" exitCode=0 Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.020119 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sxvx" event={"ID":"c68ecdb5-45f3-4f17-8258-16c252e1cd7f","Type":"ContainerDied","Data":"9878991fdcab666fcf7f68eb87090b92a271fd8539d67282d1342ebbbf5413f1"} Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.022931 4939 generic.go:334] "Generic (PLEG): container finished" podID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerID="b4acbe31a06ea3f4279b05cf8d7be6cf36033b487140c804ab4bbf2b73e435d0" exitCode=0 Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.023186 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldms8" event={"ID":"e48cb944-8d0d-4169-aa48-947c2654df5a","Type":"ContainerDied","Data":"b4acbe31a06ea3f4279b05cf8d7be6cf36033b487140c804ab4bbf2b73e435d0"} Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.025371 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.02536121 podStartE2EDuration="2.02536121s" podCreationTimestamp="2026-03-18 15:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:39.024623179 +0000 UTC m=+263.623810800" watchObservedRunningTime="2026-03-18 15:41:39.02536121 +0000 UTC m=+263.624548831" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.039041 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" event={"ID":"4df63d3d-7b3a-46ad-a343-a25e1986fb5e","Type":"ContainerStarted","Data":"68deb55e102aaa8b8ade3e499d0124b0775b56e0bf6da57b2fb29ed98e14b0c8"} Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.039110 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zxrzw" event={"ID":"4df63d3d-7b3a-46ad-a343-a25e1986fb5e","Type":"ContainerStarted","Data":"8bf5b40776e15bbe5c46633aa84bea4ab5c650f3fb0a5ba68c81026e5799cf9c"} Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.044409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aefa97a-14cb-4bfa-87b2-fdb4e367c272","Type":"ContainerStarted","Data":"b9dcff51168c849c927ebeb99b325be2770067125221c6391b3bd983927588ee"} Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.051129 4939 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" podStartSLOduration=24.051109839 podStartE2EDuration="24.051109839s" podCreationTimestamp="2026-03-18 15:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:39.048225475 +0000 UTC m=+263.647413096" watchObservedRunningTime="2026-03-18 15:41:39.051109839 +0000 UTC m=+263.650297460" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.052602 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" event={"ID":"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c","Type":"ContainerStarted","Data":"5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2"} Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.052783 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" podUID="5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" containerName="route-controller-manager" containerID="cri-o://5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2" gracePeriod=30 Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.052994 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.073348 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.104457 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zxrzw" podStartSLOduration=203.104436349 podStartE2EDuration="3m23.104436349s" podCreationTimestamp="2026-03-18 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:39.103309336 +0000 UTC m=+263.702496957" watchObservedRunningTime="2026-03-18 15:41:39.104436349 +0000 UTC m=+263.703623970" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.128267 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" podStartSLOduration=24.12823142 podStartE2EDuration="24.12823142s" podCreationTimestamp="2026-03-18 15:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:39.125552382 +0000 UTC m=+263.724740003" watchObservedRunningTime="2026-03-18 15:41:39.12823142 +0000 UTC m=+263.727419071" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.144398 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=7.1443782 podStartE2EDuration="7.1443782s" podCreationTimestamp="2026-03-18 15:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:39.143943407 +0000 UTC m=+263.743131028" watchObservedRunningTime="2026-03-18 15:41:39.1443782 +0000 UTC m=+263.743565821" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.524717 4939 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.555453 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf"] Mar 18 15:41:39 crc kubenswrapper[4939]: E0318 15:41:39.555778 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" containerName="route-controller-manager" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.555794 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" containerName="route-controller-manager" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.555930 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" containerName="route-controller-manager" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.556360 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.561301 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf"] Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.590384 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc5pc\" (UniqueName: \"kubernetes.io/projected/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-kube-api-access-hc5pc\") pod \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.590471 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-config\") pod \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.590566 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-client-ca\") pod \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.590650 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-serving-cert\") pod \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\" (UID: \"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.590889 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-config\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.590944 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk25t\" (UniqueName: \"kubernetes.io/projected/731b89b9-729b-4744-9d89-ddfa6801c23d-kube-api-access-hk25t\") pod 
\"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.591020 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-client-ca\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.591059 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731b89b9-729b-4744-9d89-ddfa6801c23d-serving-cert\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.591363 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-client-ca" (OuterVolumeSpecName: "client-ca") pod "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" (UID: "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.591425 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-config" (OuterVolumeSpecName: "config") pod "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" (UID: "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.599734 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" (UID: "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.599798 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-kube-api-access-hc5pc" (OuterVolumeSpecName: "kube-api-access-hc5pc") pod "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" (UID: "5c4175b0-863a-4244-a34d-5f1a9ea8ed4c"). InnerVolumeSpecName "kube-api-access-hc5pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692206 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-client-ca\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692258 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731b89b9-729b-4744-9d89-ddfa6801c23d-serving-cert\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692324 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-config\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692350 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk25t\" (UniqueName: \"kubernetes.io/projected/731b89b9-729b-4744-9d89-ddfa6801c23d-kube-api-access-hk25t\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692405 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc5pc\" (UniqueName: \"kubernetes.io/projected/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-kube-api-access-hc5pc\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692415 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692424 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.692432 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.693128 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-client-ca\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.694145 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-config\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: 
\"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.697008 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731b89b9-729b-4744-9d89-ddfa6801c23d-serving-cert\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.707308 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk25t\" (UniqueName: \"kubernetes.io/projected/731b89b9-729b-4744-9d89-ddfa6801c23d-kube-api-access-hk25t\") pod \"route-controller-manager-865c5d8b5-7nqvf\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.879329 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.947983 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.996028 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-proxy-ca-bundles\") pod \"47260e49-ff74-42b7-a362-4952f3c9c995\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.996111 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-config\") pod \"47260e49-ff74-42b7-a362-4952f3c9c995\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.996238 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-client-ca\") pod \"47260e49-ff74-42b7-a362-4952f3c9c995\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.996284 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47260e49-ff74-42b7-a362-4952f3c9c995-serving-cert\") pod \"47260e49-ff74-42b7-a362-4952f3c9c995\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.996311 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8nqw\" (UniqueName: \"kubernetes.io/projected/47260e49-ff74-42b7-a362-4952f3c9c995-kube-api-access-b8nqw\") pod \"47260e49-ff74-42b7-a362-4952f3c9c995\" (UID: \"47260e49-ff74-42b7-a362-4952f3c9c995\") " Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.996958 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-client-ca" (OuterVolumeSpecName: "client-ca") pod "47260e49-ff74-42b7-a362-4952f3c9c995" (UID: "47260e49-ff74-42b7-a362-4952f3c9c995"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.997261 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.997446 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-config" (OuterVolumeSpecName: "config") pod "47260e49-ff74-42b7-a362-4952f3c9c995" (UID: "47260e49-ff74-42b7-a362-4952f3c9c995"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.997034 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "47260e49-ff74-42b7-a362-4952f3c9c995" (UID: "47260e49-ff74-42b7-a362-4952f3c9c995"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.999377 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47260e49-ff74-42b7-a362-4952f3c9c995-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "47260e49-ff74-42b7-a362-4952f3c9c995" (UID: "47260e49-ff74-42b7-a362-4952f3c9c995"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:39 crc kubenswrapper[4939]: I0318 15:41:39.999421 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47260e49-ff74-42b7-a362-4952f3c9c995-kube-api-access-b8nqw" (OuterVolumeSpecName: "kube-api-access-b8nqw") pod "47260e49-ff74-42b7-a362-4952f3c9c995" (UID: "47260e49-ff74-42b7-a362-4952f3c9c995"). InnerVolumeSpecName "kube-api-access-b8nqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.060064 4939 generic.go:334] "Generic (PLEG): container finished" podID="0aefa97a-14cb-4bfa-87b2-fdb4e367c272" containerID="b9dcff51168c849c927ebeb99b325be2770067125221c6391b3bd983927588ee" exitCode=0 Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.060140 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aefa97a-14cb-4bfa-87b2-fdb4e367c272","Type":"ContainerDied","Data":"b9dcff51168c849c927ebeb99b325be2770067125221c6391b3bd983927588ee"} Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.062264 4939 generic.go:334] "Generic (PLEG): container finished" podID="5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" containerID="5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2" exitCode=0 Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.062340 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" event={"ID":"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c","Type":"ContainerDied","Data":"5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2"} Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.062377 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" event={"ID":"5c4175b0-863a-4244-a34d-5f1a9ea8ed4c","Type":"ContainerDied","Data":"bfdee32ee70844c15b8e742ca301fc3721e2bdc17d7df8666c27668aaa1ff613"} Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.062397 4939 scope.go:117] "RemoveContainer" containerID="5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.062299 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.067887 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.068050 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" event={"ID":"47260e49-ff74-42b7-a362-4952f3c9c995","Type":"ContainerDied","Data":"ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5"} Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.068182 4939 generic.go:334] "Generic (PLEG): container finished" podID="47260e49-ff74-42b7-a362-4952f3c9c995" containerID="ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5" exitCode=0 Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.068469 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76886d6c6c-gnrr6" event={"ID":"47260e49-ff74-42b7-a362-4952f3c9c995","Type":"ContainerDied","Data":"044d0818f94640895600751b0a84cc7e6ce8c231addea5ba5a0f8096698d4eb3"} Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.097694 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2"] Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.098672 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8nqw\" (UniqueName: \"kubernetes.io/projected/47260e49-ff74-42b7-a362-4952f3c9c995-kube-api-access-b8nqw\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.098691 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47260e49-ff74-42b7-a362-4952f3c9c995-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.098700 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.098709 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47260e49-ff74-42b7-a362-4952f3c9c995-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.104001 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb97f6f-rb5d2"] Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.111302 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76886d6c6c-gnrr6"] Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.114044 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76886d6c6c-gnrr6"] Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.139321 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47260e49-ff74-42b7-a362-4952f3c9c995" path="/var/lib/kubelet/pods/47260e49-ff74-42b7-a362-4952f3c9c995/volumes" Mar 18 15:41:40 crc kubenswrapper[4939]: I0318 15:41:40.139834 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4175b0-863a-4244-a34d-5f1a9ea8ed4c" path="/var/lib/kubelet/pods/5c4175b0-863a-4244-a34d-5f1a9ea8ed4c/volumes" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.792590 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.842659 4939 scope.go:117] "RemoveContainer" containerID="5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.843075 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kube-api-access\") pod \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.843224 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kubelet-dir\") pod \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\" (UID: \"0aefa97a-14cb-4bfa-87b2-fdb4e367c272\") " Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.843484 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0aefa97a-14cb-4bfa-87b2-fdb4e367c272" (UID: "0aefa97a-14cb-4bfa-87b2-fdb4e367c272"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:41:41 crc kubenswrapper[4939]: E0318 15:41:41.843976 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2\": container with ID starting with 5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2 not found: ID does not exist" containerID="5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.844013 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2"} err="failed to get container status \"5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2\": rpc error: code = NotFound desc = could not find container \"5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2\": container with ID starting with 5e64392d84110e2834e204c12fd6f998a212575e52c059f3a4e9926f81930aa2 not found: ID does not exist" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.844082 4939 scope.go:117] "RemoveContainer" containerID="ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.866162 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0aefa97a-14cb-4bfa-87b2-fdb4e367c272" (UID: "0aefa97a-14cb-4bfa-87b2-fdb4e367c272"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.877216 4939 scope.go:117] "RemoveContainer" containerID="ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5" Mar 18 15:41:41 crc kubenswrapper[4939]: E0318 15:41:41.877582 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5\": container with ID starting with ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5 not found: ID does not exist" containerID="ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.877627 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5"} err="failed to get container status \"ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5\": rpc error: code = NotFound desc = could not find container \"ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5\": container with ID starting with ff04328fd682d6d72978a020d9660add785cf3946ba00451f833521a3c87eda5 not found: ID does not exist" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.945029 4939 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:41 crc kubenswrapper[4939]: I0318 15:41:41.945060 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aefa97a-14cb-4bfa-87b2-fdb4e367c272-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.080571 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.080606 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aefa97a-14cb-4bfa-87b2-fdb4e367c272","Type":"ContainerDied","Data":"03ef56f6405e3e18f07bf735265123eed6ee294b436fae44d96325c3d896f247"} Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.080937 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03ef56f6405e3e18f07bf735265123eed6ee294b436fae44d96325c3d896f247" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.267633 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf"] Mar 18 15:41:42 crc kubenswrapper[4939]: W0318 15:41:42.274045 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731b89b9_729b_4744_9d89_ddfa6801c23d.slice/crio-cb56585c1e2016a27e82504ce3c271d04d4c9b550b961549e131fa8b0ec981bb WatchSource:0}: Error finding container cb56585c1e2016a27e82504ce3c271d04d4c9b550b961549e131fa8b0ec981bb: Status 404 returned error can't find the container with id cb56585c1e2016a27e82504ce3c271d04d4c9b550b961549e131fa8b0ec981bb Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.472331 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7dbf7486d-lh2n2"] Mar 18 15:41:42 crc kubenswrapper[4939]: E0318 15:41:42.472544 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefa97a-14cb-4bfa-87b2-fdb4e367c272" containerName="pruner" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.472555 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefa97a-14cb-4bfa-87b2-fdb4e367c272" containerName="pruner" Mar 18 15:41:42 crc kubenswrapper[4939]: E0318 15:41:42.472591 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47260e49-ff74-42b7-a362-4952f3c9c995" containerName="controller-manager" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.472598 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="47260e49-ff74-42b7-a362-4952f3c9c995" containerName="controller-manager" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.472708 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="47260e49-ff74-42b7-a362-4952f3c9c995" containerName="controller-manager" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.472718 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefa97a-14cb-4bfa-87b2-fdb4e367c272" containerName="pruner" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.473165 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.474402 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.474900 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.475479 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.475677 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.480522 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.480897 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.485307 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.492890 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dbf7486d-lh2n2"] Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.552449 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-client-ca\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.552549 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-proxy-ca-bundles\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.552596 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hwvb\" (UniqueName: \"kubernetes.io/projected/bd7152a2-3cb3-4a4d-8376-fb44659e6487-kube-api-access-9hwvb\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.552612 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7152a2-3cb3-4a4d-8376-fb44659e6487-serving-cert\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.552651 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-config\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.653816 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-config\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.653892 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-client-ca\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.654008 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-proxy-ca-bundles\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.654106 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7152a2-3cb3-4a4d-8376-fb44659e6487-serving-cert\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.654138 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hwvb\" (UniqueName: \"kubernetes.io/projected/bd7152a2-3cb3-4a4d-8376-fb44659e6487-kube-api-access-9hwvb\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.655081 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-config\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.655188 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-client-ca\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.655802 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-proxy-ca-bundles\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc 
kubenswrapper[4939]: I0318 15:41:42.658953 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7152a2-3cb3-4a4d-8376-fb44659e6487-serving-cert\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.671069 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hwvb\" (UniqueName: \"kubernetes.io/projected/bd7152a2-3cb3-4a4d-8376-fb44659e6487-kube-api-access-9hwvb\") pod \"controller-manager-7dbf7486d-lh2n2\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:42 crc kubenswrapper[4939]: I0318 15:41:42.802810 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.024406 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dbf7486d-lh2n2"] Mar 18 15:41:43 crc kubenswrapper[4939]: W0318 15:41:43.030833 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd7152a2_3cb3_4a4d_8376_fb44659e6487.slice/crio-4578692d6702b3c0cf165701f825fdecb7022ccca8c56d8bb6c3654987fbe9b5 WatchSource:0}: Error finding container 4578692d6702b3c0cf165701f825fdecb7022ccca8c56d8bb6c3654987fbe9b5: Status 404 returned error can't find the container with id 4578692d6702b3c0cf165701f825fdecb7022ccca8c56d8bb6c3654987fbe9b5 Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.093685 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkpb" event={"ID":"9a58c5aa-9e72-41c1-832c-6fba2efc3766","Type":"ContainerStarted","Data":"b3a788f2f028f8286335230f98c4d56655b2d654481ee1a12adf98edb1212414"} Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.098060 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" event={"ID":"731b89b9-729b-4744-9d89-ddfa6801c23d","Type":"ContainerStarted","Data":"cfe33a6ab017a4b6a52f371db5fd4c129c2bf0e2b3c38cdba17240d85c9befc1"} Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.098148 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.098164 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" event={"ID":"731b89b9-729b-4744-9d89-ddfa6801c23d","Type":"ContainerStarted","Data":"cb56585c1e2016a27e82504ce3c271d04d4c9b550b961549e131fa8b0ec981bb"} Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.100912 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" event={"ID":"bd7152a2-3cb3-4a4d-8376-fb44659e6487","Type":"ContainerStarted","Data":"4578692d6702b3c0cf165701f825fdecb7022ccca8c56d8bb6c3654987fbe9b5"} Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.103649 4939 generic.go:334] "Generic (PLEG): container finished" podID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" 
containerID="911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586" exitCode=0 Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.104112 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmh9v" event={"ID":"4c05afb5-9509-4e28-b0f8-ffe522fd31d3","Type":"ContainerDied","Data":"911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586"} Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.107086 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxb8" event={"ID":"2302988d-0f6e-4e8e-ab46-90eac3a8dffa","Type":"ContainerStarted","Data":"86f33d110887e185cbf188eca116313b21b27127eba5c87adf7bbd98c206f967"} Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.115554 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4rkpb" podStartSLOduration=3.593392259 podStartE2EDuration="45.11553738s" podCreationTimestamp="2026-03-18 15:40:58 +0000 UTC" firstStartedPulling="2026-03-18 15:41:00.218441744 +0000 UTC m=+224.817629365" lastFinishedPulling="2026-03-18 15:41:41.740586865 +0000 UTC m=+266.339774486" observedRunningTime="2026-03-18 15:41:43.112055459 +0000 UTC m=+267.711243080" watchObservedRunningTime="2026-03-18 15:41:43.11553738 +0000 UTC m=+267.714724991" Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.154560 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" podStartSLOduration=8.154522923 podStartE2EDuration="8.154522923s" podCreationTimestamp="2026-03-18 15:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:43.148764106 +0000 UTC m=+267.747951757" watchObservedRunningTime="2026-03-18 15:41:43.154522923 +0000 UTC m=+267.753710564" Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.175398 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8pxb8" podStartSLOduration=6.075010312 podStartE2EDuration="44.175380149s" podCreationTimestamp="2026-03-18 15:40:59 +0000 UTC" firstStartedPulling="2026-03-18 15:41:01.318052428 +0000 UTC m=+225.917240049" lastFinishedPulling="2026-03-18 15:41:39.418422275 +0000 UTC m=+264.017609886" observedRunningTime="2026-03-18 15:41:43.162554246 +0000 UTC m=+267.761741897" watchObservedRunningTime="2026-03-18 15:41:43.175380149 +0000 UTC m=+267.774567810" Mar 18 15:41:43 crc kubenswrapper[4939]: I0318 15:41:43.225604 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:44 crc kubenswrapper[4939]: I0318 15:41:44.114808 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" event={"ID":"bd7152a2-3cb3-4a4d-8376-fb44659e6487","Type":"ContainerStarted","Data":"afc02aee9fa5d8054c65bd7534bb825e964b9171982fcd3a544e7b99a9099779"} Mar 18 15:41:44 crc kubenswrapper[4939]: I0318 15:41:44.128848 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" podStartSLOduration=9.128820573 podStartE2EDuration="9.128820573s" podCreationTimestamp="2026-03-18 15:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:44.128456513 +0000 UTC m=+268.727644134" watchObservedRunningTime="2026-03-18 15:41:44.128820573 +0000 UTC m=+268.728008194" Mar 18 15:41:45 crc kubenswrapper[4939]: I0318 15:41:45.121450 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:45 crc kubenswrapper[4939]: I0318 15:41:45.126122 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:47 crc kubenswrapper[4939]: I0318 15:41:47.133399 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sxvx" event={"ID":"c68ecdb5-45f3-4f17-8258-16c252e1cd7f","Type":"ContainerStarted","Data":"8a01a26becb6b6af79221d574cb0ecd40793b49474385bace7a345aa86baaf4e"} Mar 18 15:41:47 crc kubenswrapper[4939]: I0318 15:41:47.163736 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7sxvx" podStartSLOduration=12.0137471 podStartE2EDuration="45.163709258s" podCreationTimestamp="2026-03-18 15:41:02 +0000 UTC" firstStartedPulling="2026-03-18 15:41:12.571818328 +0000 UTC m=+237.171005989" lastFinishedPulling="2026-03-18 15:41:45.721780526 +0000 UTC m=+270.320968147" observedRunningTime="2026-03-18 15:41:47.159759334 +0000 UTC m=+271.758946975" watchObservedRunningTime="2026-03-18 15:41:47.163709258 +0000 UTC m=+271.762896899" Mar 18 15:41:48 crc kubenswrapper[4939]: I0318 15:41:48.643874 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9pp46"] Mar 18 15:41:49 crc kubenswrapper[4939]: I0318 15:41:49.106220 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:41:49 crc kubenswrapper[4939]: I0318 15:41:49.106731 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:41:49 crc kubenswrapper[4939]: I0318 15:41:49.514194 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:41:49 crc kubenswrapper[4939]: I0318 15:41:49.514252 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:41:50 crc kubenswrapper[4939]: I0318 15:41:50.112556 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:41:50 crc kubenswrapper[4939]: I0318 15:41:50.113437 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:41:50 crc kubenswrapper[4939]: I0318 15:41:50.155668 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldms8" event={"ID":"e48cb944-8d0d-4169-aa48-947c2654df5a","Type":"ContainerStarted","Data":"52976223ba826460bf2666c07bb9d6d8a2922a5bd33bde71b40ac2215e76310a"} Mar 18 15:41:50 crc kubenswrapper[4939]: I0318 15:41:50.167594 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:41:50 crc kubenswrapper[4939]: I0318 15:41:50.223807 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-ldms8" podStartSLOduration=3.927360609 podStartE2EDuration="49.223777595s" podCreationTimestamp="2026-03-18 15:41:01 +0000 UTC" firstStartedPulling="2026-03-18 15:41:03.418811561 +0000 UTC m=+228.017999182" lastFinishedPulling="2026-03-18 15:41:48.715228547 +0000 UTC m=+273.314416168" observedRunningTime="2026-03-18 15:41:50.217197304 +0000 UTC m=+274.816384945" watchObservedRunningTime="2026-03-18 15:41:50.223777595 +0000 UTC m=+274.822965216" Mar 18 15:41:50 crc kubenswrapper[4939]: I0318 15:41:50.227427 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.146117 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.146549 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.167833 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmh9v" event={"ID":"4c05afb5-9509-4e28-b0f8-ffe522fd31d3","Type":"ContainerStarted","Data":"c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b"} Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.188243 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kmh9v" podStartSLOduration=3.942841447 podStartE2EDuration="52.188218027s" podCreationTimestamp="2026-03-18 15:41:00 +0000 UTC" firstStartedPulling="2026-03-18 15:41:02.387779115 +0000 UTC m=+226.986966736" lastFinishedPulling="2026-03-18 15:41:50.633155695 +0000 UTC m=+275.232343316" observedRunningTime="2026-03-18 15:41:52.18694646 +0000 UTC m=+276.786134081" watchObservedRunningTime="2026-03-18 15:41:52.188218027 +0000 UTC m=+276.787405658" Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.204207 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8pxb8"] Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.204425 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8pxb8" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="registry-server" containerID="cri-o://86f33d110887e185cbf188eca116313b21b27127eba5c87adf7bbd98c206f967" gracePeriod=2 Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.408810 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rkpb"] Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.409188 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4rkpb" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="registry-server" containerID="cri-o://b3a788f2f028f8286335230f98c4d56655b2d654481ee1a12adf98edb1212414" gracePeriod=2 Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.830622 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:52 crc kubenswrapper[4939]: I0318 15:41:52.830677 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.175685 4939 generic.go:334] 
"Generic (PLEG): container finished" podID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerID="86f33d110887e185cbf188eca116313b21b27127eba5c87adf7bbd98c206f967" exitCode=0 Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.175755 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxb8" event={"ID":"2302988d-0f6e-4e8e-ab46-90eac3a8dffa","Type":"ContainerDied","Data":"86f33d110887e185cbf188eca116313b21b27127eba5c87adf7bbd98c206f967"} Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.178743 4939 generic.go:334] "Generic (PLEG): container finished" podID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerID="b3a788f2f028f8286335230f98c4d56655b2d654481ee1a12adf98edb1212414" exitCode=0 Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.178775 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkpb" event={"ID":"9a58c5aa-9e72-41c1-832c-6fba2efc3766","Type":"ContainerDied","Data":"b3a788f2f028f8286335230f98c4d56655b2d654481ee1a12adf98edb1212414"} Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.200829 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ldms8" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="registry-server" probeResult="failure" output=< Mar 18 15:41:53 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 15:41:53 crc kubenswrapper[4939]: > Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.298795 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.303323 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.404434 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-utilities\") pod \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.404565 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtsh2\" (UniqueName: \"kubernetes.io/projected/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-kube-api-access-xtsh2\") pod \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.404613 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgbcf\" (UniqueName: \"kubernetes.io/projected/9a58c5aa-9e72-41c1-832c-6fba2efc3766-kube-api-access-tgbcf\") pod \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.404639 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-catalog-content\") pod \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\" (UID: \"2302988d-0f6e-4e8e-ab46-90eac3a8dffa\") " Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.404682 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-utilities\") pod \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.404713 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-catalog-content\") pod \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\" (UID: \"9a58c5aa-9e72-41c1-832c-6fba2efc3766\") " Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.406809 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-utilities" (OuterVolumeSpecName: "utilities") pod "9a58c5aa-9e72-41c1-832c-6fba2efc3766" (UID: "9a58c5aa-9e72-41c1-832c-6fba2efc3766"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.410428 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-utilities" (OuterVolumeSpecName: "utilities") pod "2302988d-0f6e-4e8e-ab46-90eac3a8dffa" (UID: "2302988d-0f6e-4e8e-ab46-90eac3a8dffa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.412988 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a58c5aa-9e72-41c1-832c-6fba2efc3766-kube-api-access-tgbcf" (OuterVolumeSpecName: "kube-api-access-tgbcf") pod "9a58c5aa-9e72-41c1-832c-6fba2efc3766" (UID: "9a58c5aa-9e72-41c1-832c-6fba2efc3766"). InnerVolumeSpecName "kube-api-access-tgbcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.415648 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-kube-api-access-xtsh2" (OuterVolumeSpecName: "kube-api-access-xtsh2") pod "2302988d-0f6e-4e8e-ab46-90eac3a8dffa" (UID: "2302988d-0f6e-4e8e-ab46-90eac3a8dffa"). InnerVolumeSpecName "kube-api-access-xtsh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.460380 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a58c5aa-9e72-41c1-832c-6fba2efc3766" (UID: "9a58c5aa-9e72-41c1-832c-6fba2efc3766"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.469776 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2302988d-0f6e-4e8e-ab46-90eac3a8dffa" (UID: "2302988d-0f6e-4e8e-ab46-90eac3a8dffa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.506618 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.506658 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtsh2\" (UniqueName: \"kubernetes.io/projected/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-kube-api-access-xtsh2\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.506677 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgbcf\" (UniqueName: \"kubernetes.io/projected/9a58c5aa-9e72-41c1-832c-6fba2efc3766-kube-api-access-tgbcf\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.506691 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2302988d-0f6e-4e8e-ab46-90eac3a8dffa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.506704 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.506715 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a58c5aa-9e72-41c1-832c-6fba2efc3766-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.687443 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.687516 4939 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.687555 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.688081 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.688129 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999" gracePeriod=600 Mar 18 15:41:53 crc kubenswrapper[4939]: I0318 15:41:53.890114 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sxvx" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="registry-server" probeResult="failure" output=< Mar 18 15:41:53 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 15:41:53 crc kubenswrapper[4939]: > Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.092678 4939 csr.go:261] certificate signing request csr-dmc9k is approved, waiting to be issued Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.104800 4939 csr.go:257] certificate signing request csr-dmc9k is issued Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.192404 4939 generic.go:334] "Generic (PLEG): container finished" podID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerID="bb9223b64b2b3a13d35dc8b8ee61373d612afbfc8ae39a754dee7646378e840f" exitCode=0 Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.192495 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qrxf" event={"ID":"18db329a-84bc-4bb2-94a4-00053cc542e7","Type":"ContainerDied","Data":"bb9223b64b2b3a13d35dc8b8ee61373d612afbfc8ae39a754dee7646378e840f"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.195125 4939 generic.go:334] "Generic (PLEG): container finished" podID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerID="817c1dc723c488a3a50f64c6f4c2aae7334ad8dfd652641a2f7f503f0db26bab" exitCode=0 Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.195193 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thb6s" event={"ID":"c9a596c4-2674-4c46-ab00-c8167b950bc9","Type":"ContainerDied","Data":"817c1dc723c488a3a50f64c6f4c2aae7334ad8dfd652641a2f7f503f0db26bab"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.199277 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkpb" event={"ID":"9a58c5aa-9e72-41c1-832c-6fba2efc3766","Type":"ContainerDied","Data":"0164245b72cca98186e2ab94ba39c189a4673d48378c4b8bb7e63f5f28c94476"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.199334 4939 
scope.go:117] "RemoveContainer" containerID="b3a788f2f028f8286335230f98c4d56655b2d654481ee1a12adf98edb1212414" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.199292 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rkpb" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.203192 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999" exitCode=0 Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.203246 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.203273 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"25c45d0482bcfb57b4acc9de3abe36c8d204cdc1c2752c823face6bf93abc88b"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.205429 4939 generic.go:334] "Generic (PLEG): container finished" podID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerID="01a5091740014a14cf9305e910c86f71f2e8880ca532c040a09ea606595e9399" exitCode=0 Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.205493 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cq6w" event={"ID":"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464","Type":"ContainerDied","Data":"01a5091740014a14cf9305e910c86f71f2e8880ca532c040a09ea606595e9399"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.210239 4939 generic.go:334] "Generic (PLEG): container finished" podID="330c585e-3a67-4502-b800-7401df959334" containerID="0cc2ec072a65103eaf5e64c00a21cb6506b62b3c10a0d3c877ca1d99851042fa" exitCode=0 Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.210357 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" event={"ID":"330c585e-3a67-4502-b800-7401df959334","Type":"ContainerDied","Data":"0cc2ec072a65103eaf5e64c00a21cb6506b62b3c10a0d3c877ca1d99851042fa"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.220054 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxb8" event={"ID":"2302988d-0f6e-4e8e-ab46-90eac3a8dffa","Type":"ContainerDied","Data":"1cd501be9810ba8c0dca3554518483655815d0fc21b1a20fb8b730631937e06c"} Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.220135 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8pxb8" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.242398 4939 scope.go:117] "RemoveContainer" containerID="7937a5b83042b5018dab854d1c8484148696c56a6d516ddd9453a7bd3d6958b3" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.268481 4939 scope.go:117] "RemoveContainer" containerID="1669e2c9dc115753a190004a95c96f00ac872620162a8a91903bd1c8d0217696" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.304761 4939 scope.go:117] "RemoveContainer" containerID="86f33d110887e185cbf188eca116313b21b27127eba5c87adf7bbd98c206f967" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.377114 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rkpb"] Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.381172 4939 scope.go:117] "RemoveContainer" containerID="b834afdf7bab66e161f134c90f94249c2aa141bfc0841a1ee2ff26ae73ab5027" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.383782 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4rkpb"] Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.399318 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8pxb8"] Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.402419 4939 scope.go:117] "RemoveContainer" containerID="0a8b368c37f3c983ed89744cf6a2d54905184ccea2fac283c404c8e2de77e104" Mar 18 15:41:54 crc kubenswrapper[4939]: I0318 15:41:54.404062 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8pxb8"] Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.106651 4939 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-24 13:41:47.095913174 +0000 UTC Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.106979 4939 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6021h59m51.988942067s for next certificate rotation Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.580618 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.634041 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tbr6\" (UniqueName: \"kubernetes.io/projected/330c585e-3a67-4502-b800-7401df959334-kube-api-access-9tbr6\") pod \"330c585e-3a67-4502-b800-7401df959334\" (UID: \"330c585e-3a67-4502-b800-7401df959334\") " Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.641803 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330c585e-3a67-4502-b800-7401df959334-kube-api-access-9tbr6" (OuterVolumeSpecName: "kube-api-access-9tbr6") pod "330c585e-3a67-4502-b800-7401df959334" (UID: "330c585e-3a67-4502-b800-7401df959334"). InnerVolumeSpecName "kube-api-access-9tbr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.735770 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tbr6\" (UniqueName: \"kubernetes.io/projected/330c585e-3a67-4502-b800-7401df959334-kube-api-access-9tbr6\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.824765 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dbf7486d-lh2n2"] Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.824966 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" podUID="bd7152a2-3cb3-4a4d-8376-fb44659e6487" containerName="controller-manager" containerID="cri-o://afc02aee9fa5d8054c65bd7534bb825e964b9171982fcd3a544e7b99a9099779" gracePeriod=30 Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.841090 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf"] Mar 18 15:41:55 crc kubenswrapper[4939]: I0318 15:41:55.841346 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" podUID="731b89b9-729b-4744-9d89-ddfa6801c23d" containerName="route-controller-manager" containerID="cri-o://cfe33a6ab017a4b6a52f371db5fd4c129c2bf0e2b3c38cdba17240d85c9befc1" gracePeriod=30 Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.107741 4939 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-29 10:23:04.637723908 +0000 UTC Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.108048 4939 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6858h41m8.529678193s for next certificate rotation Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.141606 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" path="/var/lib/kubelet/pods/2302988d-0f6e-4e8e-ab46-90eac3a8dffa/volumes" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.142533 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" path="/var/lib/kubelet/pods/9a58c5aa-9e72-41c1-832c-6fba2efc3766/volumes" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.233376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" event={"ID":"330c585e-3a67-4502-b800-7401df959334","Type":"ContainerDied","Data":"519e39fed9b4a66764ac9c2229df0502d2742b3cbdb487c5a10a76cdfffaa86d"} Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.233420 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="519e39fed9b4a66764ac9c2229df0502d2742b3cbdb487c5a10a76cdfffaa86d" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.233488 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-vqqr2" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.245919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cq6w" event={"ID":"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464","Type":"ContainerStarted","Data":"872c45e71dd57ffa689269ba07b582501ffd82f73b9c1b6656179ab3f808ba23"} Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.261566 4939 generic.go:334] "Generic (PLEG): container finished" podID="731b89b9-729b-4744-9d89-ddfa6801c23d" containerID="cfe33a6ab017a4b6a52f371db5fd4c129c2bf0e2b3c38cdba17240d85c9befc1" exitCode=0 Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.261647 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" event={"ID":"731b89b9-729b-4744-9d89-ddfa6801c23d","Type":"ContainerDied","Data":"cfe33a6ab017a4b6a52f371db5fd4c129c2bf0e2b3c38cdba17240d85c9befc1"} Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.263124 4939 generic.go:334] "Generic (PLEG): container finished" podID="bd7152a2-3cb3-4a4d-8376-fb44659e6487" containerID="afc02aee9fa5d8054c65bd7534bb825e964b9171982fcd3a544e7b99a9099779" exitCode=0 Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.263159 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" event={"ID":"bd7152a2-3cb3-4a4d-8376-fb44659e6487","Type":"ContainerDied","Data":"afc02aee9fa5d8054c65bd7534bb825e964b9171982fcd3a544e7b99a9099779"} Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.274618 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4cq6w" podStartSLOduration=3.515450583 podStartE2EDuration="58.274599736s" podCreationTimestamp="2026-03-18 15:40:58 +0000 UTC" firstStartedPulling="2026-03-18 15:41:00.228747032 +0000 UTC m=+224.827934653" lastFinishedPulling="2026-03-18 15:41:54.987896185 +0000 UTC m=+279.587083806" observedRunningTime="2026-03-18 15:41:56.272684781 +0000 UTC m=+280.871872402" watchObservedRunningTime="2026-03-18 15:41:56.274599736 +0000 UTC m=+280.873787357" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.359240 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.446628 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-client-ca\") pod \"731b89b9-729b-4744-9d89-ddfa6801c23d\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.446962 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731b89b9-729b-4744-9d89-ddfa6801c23d-serving-cert\") pod \"731b89b9-729b-4744-9d89-ddfa6801c23d\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.447011 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk25t\" (UniqueName: \"kubernetes.io/projected/731b89b9-729b-4744-9d89-ddfa6801c23d-kube-api-access-hk25t\") pod \"731b89b9-729b-4744-9d89-ddfa6801c23d\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.447071 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-config\") pod \"731b89b9-729b-4744-9d89-ddfa6801c23d\" (UID: \"731b89b9-729b-4744-9d89-ddfa6801c23d\") " Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.447458 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-client-ca" (OuterVolumeSpecName: "client-ca") pod "731b89b9-729b-4744-9d89-ddfa6801c23d" (UID: "731b89b9-729b-4744-9d89-ddfa6801c23d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.447747 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-config" (OuterVolumeSpecName: "config") pod "731b89b9-729b-4744-9d89-ddfa6801c23d" (UID: "731b89b9-729b-4744-9d89-ddfa6801c23d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.452942 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/731b89b9-729b-4744-9d89-ddfa6801c23d-kube-api-access-hk25t" (OuterVolumeSpecName: "kube-api-access-hk25t") pod "731b89b9-729b-4744-9d89-ddfa6801c23d" (UID: "731b89b9-729b-4744-9d89-ddfa6801c23d"). InnerVolumeSpecName "kube-api-access-hk25t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.453638 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/731b89b9-729b-4744-9d89-ddfa6801c23d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "731b89b9-729b-4744-9d89-ddfa6801c23d" (UID: "731b89b9-729b-4744-9d89-ddfa6801c23d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.548377 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.548416 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/731b89b9-729b-4744-9d89-ddfa6801c23d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.548430 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731b89b9-729b-4744-9d89-ddfa6801c23d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.548438 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk25t\" (UniqueName: \"kubernetes.io/projected/731b89b9-729b-4744-9d89-ddfa6801c23d-kube-api-access-hk25t\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:56 crc kubenswrapper[4939]: I0318 15:41:56.940857 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.056865 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-client-ca\") pod \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057163 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-config\") pod \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057249 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-proxy-ca-bundles\") pod \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057341 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7152a2-3cb3-4a4d-8376-fb44659e6487-serving-cert\") pod \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057452 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hwvb\" (UniqueName: \"kubernetes.io/projected/bd7152a2-3cb3-4a4d-8376-fb44659e6487-kube-api-access-9hwvb\") pod \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\" (UID: \"bd7152a2-3cb3-4a4d-8376-fb44659e6487\") " Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057601 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd7152a2-3cb3-4a4d-8376-fb44659e6487" (UID: "bd7152a2-3cb3-4a4d-8376-fb44659e6487"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057795 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bd7152a2-3cb3-4a4d-8376-fb44659e6487" (UID: "bd7152a2-3cb3-4a4d-8376-fb44659e6487"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057877 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.057921 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-config" (OuterVolumeSpecName: "config") pod "bd7152a2-3cb3-4a4d-8376-fb44659e6487" (UID: "bd7152a2-3cb3-4a4d-8376-fb44659e6487"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.061830 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7152a2-3cb3-4a4d-8376-fb44659e6487-kube-api-access-9hwvb" (OuterVolumeSpecName: "kube-api-access-9hwvb") pod "bd7152a2-3cb3-4a4d-8376-fb44659e6487" (UID: "bd7152a2-3cb3-4a4d-8376-fb44659e6487"). InnerVolumeSpecName "kube-api-access-9hwvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.062689 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7152a2-3cb3-4a4d-8376-fb44659e6487-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd7152a2-3cb3-4a4d-8376-fb44659e6487" (UID: "bd7152a2-3cb3-4a4d-8376-fb44659e6487"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.159149 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hwvb\" (UniqueName: \"kubernetes.io/projected/bd7152a2-3cb3-4a4d-8376-fb44659e6487-kube-api-access-9hwvb\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.159202 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.159215 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd7152a2-3cb3-4a4d-8376-fb44659e6487-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.159227 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7152a2-3cb3-4a4d-8376-fb44659e6487-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.269631 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.269629 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf" event={"ID":"731b89b9-729b-4744-9d89-ddfa6801c23d","Type":"ContainerDied","Data":"cb56585c1e2016a27e82504ce3c271d04d4c9b550b961549e131fa8b0ec981bb"} Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.269756 4939 scope.go:117] "RemoveContainer" containerID="cfe33a6ab017a4b6a52f371db5fd4c129c2bf0e2b3c38cdba17240d85c9befc1" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.270785 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.270815 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dbf7486d-lh2n2" event={"ID":"bd7152a2-3cb3-4a4d-8376-fb44659e6487","Type":"ContainerDied","Data":"4578692d6702b3c0cf165701f825fdecb7022ccca8c56d8bb6c3654987fbe9b5"} Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.277352 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qrxf" event={"ID":"18db329a-84bc-4bb2-94a4-00053cc542e7","Type":"ContainerStarted","Data":"fced34e9b98f2f294b56f8f82e7b4a47572a846eb794f12cab4a66405ce6d3a8"} Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.283949 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thb6s" event={"ID":"c9a596c4-2674-4c46-ab00-c8167b950bc9","Type":"ContainerStarted","Data":"0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357"} Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.295481 4939 scope.go:117] "RemoveContainer" containerID="afc02aee9fa5d8054c65bd7534bb825e964b9171982fcd3a544e7b99a9099779" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.315214 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2qrxf" podStartSLOduration=3.532423704 podStartE2EDuration="57.315188512s" podCreationTimestamp="2026-03-18 15:41:00 +0000 UTC" firstStartedPulling="2026-03-18 15:41:02.409067272 +0000 UTC m=+227.008254893" lastFinishedPulling="2026-03-18 15:41:56.19183208 +0000 UTC m=+280.791019701" observedRunningTime="2026-03-18 15:41:57.313886595 +0000 UTC m=+281.913074236" watchObservedRunningTime="2026-03-18 15:41:57.315188512 +0000 UTC m=+281.914376173" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.334252 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dbf7486d-lh2n2"] Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.339690 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7dbf7486d-lh2n2"] Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.351079 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-thb6s" podStartSLOduration=4.845418672 podStartE2EDuration="59.351060145s" podCreationTimestamp="2026-03-18 15:40:58 +0000 UTC" firstStartedPulling="2026-03-18 15:41:01.331443496 +0000 UTC m=+225.930631117" lastFinishedPulling="2026-03-18 15:41:55.837084969 +0000 UTC m=+280.436272590" observedRunningTime="2026-03-18 15:41:57.351048165 +0000 UTC 
m=+281.950235786" watchObservedRunningTime="2026-03-18 15:41:57.351060145 +0000 UTC m=+281.950247766" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.365113 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf"] Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.370377 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865c5d8b5-7nqvf"] Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.480785 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj"] Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481055 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7152a2-3cb3-4a4d-8376-fb44659e6487" containerName="controller-manager" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481068 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7152a2-3cb3-4a4d-8376-fb44659e6487" containerName="controller-manager" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481081 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="registry-server" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481087 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="registry-server" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481101 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="extract-utilities" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481107 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="extract-utilities" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481115 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="731b89b9-729b-4744-9d89-ddfa6801c23d" containerName="route-controller-manager" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481121 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="731b89b9-729b-4744-9d89-ddfa6801c23d" containerName="route-controller-manager" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481129 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="extract-content" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481134 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="extract-content" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481150 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330c585e-3a67-4502-b800-7401df959334" containerName="oc" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481156 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="330c585e-3a67-4502-b800-7401df959334" containerName="oc" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481163 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="registry-server" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481168 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="registry-server" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481178 4939 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="extract-content" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481183 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="extract-content" Mar 18 15:41:57 crc kubenswrapper[4939]: E0318 15:41:57.481191 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="extract-utilities" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481196 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="extract-utilities" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481307 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7152a2-3cb3-4a4d-8376-fb44659e6487" containerName="controller-manager" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481317 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="330c585e-3a67-4502-b800-7401df959334" containerName="oc" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481331 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="731b89b9-729b-4744-9d89-ddfa6801c23d" containerName="route-controller-manager" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481340 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a58c5aa-9e72-41c1-832c-6fba2efc3766" containerName="registry-server" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481349 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2302988d-0f6e-4e8e-ab46-90eac3a8dffa" containerName="registry-server" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.481772 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.483588 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69bfc54c76-l58c7"] Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.484256 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.485097 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.485319 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.485495 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.485590 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.486434 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.486979 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.488616 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.494154 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.494300 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.494454 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.494706 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.494724 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.498688 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69bfc54c76-l58c7"] Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.499867 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.503616 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj"] Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565410 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-config\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565470 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-config\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565518 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-client-ca\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565547 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-proxy-ca-bundles\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565621 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-client-ca\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565685 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d98d7316-9a8c-47ad-870f-ff1dde95989c-serving-cert\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565746 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/517209f4-d504-46e5-98df-6b67bc2d6656-serving-cert\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565771 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dnfk\" (UniqueName: \"kubernetes.io/projected/d98d7316-9a8c-47ad-870f-ff1dde95989c-kube-api-access-9dnfk\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.565803 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54vjx\" (UniqueName: \"kubernetes.io/projected/517209f4-d504-46e5-98df-6b67bc2d6656-kube-api-access-54vjx\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667339 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d98d7316-9a8c-47ad-870f-ff1dde95989c-serving-cert\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667418 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/517209f4-d504-46e5-98df-6b67bc2d6656-serving-cert\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667447 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dnfk\" (UniqueName: \"kubernetes.io/projected/d98d7316-9a8c-47ad-870f-ff1dde95989c-kube-api-access-9dnfk\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667479 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54vjx\" (UniqueName: \"kubernetes.io/projected/517209f4-d504-46e5-98df-6b67bc2d6656-kube-api-access-54vjx\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667517 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-config\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667536 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-config\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667552 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-client-ca\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667570 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-proxy-ca-bundles\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.667596 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-client-ca\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: 
\"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.668528 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-client-ca\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.669115 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-config\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.669293 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-config\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.670213 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-client-ca\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.670652 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-proxy-ca-bundles\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.673340 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/517209f4-d504-46e5-98df-6b67bc2d6656-serving-cert\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.673929 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d98d7316-9a8c-47ad-870f-ff1dde95989c-serving-cert\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.685393 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dnfk\" (UniqueName: \"kubernetes.io/projected/d98d7316-9a8c-47ad-870f-ff1dde95989c-kube-api-access-9dnfk\") pod \"route-controller-manager-59b5c4d47f-4qcwj\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.685703 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54vjx\" (UniqueName: \"kubernetes.io/projected/517209f4-d504-46e5-98df-6b67bc2d6656-kube-api-access-54vjx\") pod \"controller-manager-69bfc54c76-l58c7\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.801422 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:57 crc kubenswrapper[4939]: I0318 15:41:57.806866 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.141361 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="731b89b9-729b-4744-9d89-ddfa6801c23d" path="/var/lib/kubelet/pods/731b89b9-729b-4744-9d89-ddfa6801c23d/volumes" Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.142450 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7152a2-3cb3-4a4d-8376-fb44659e6487" path="/var/lib/kubelet/pods/bd7152a2-3cb3-4a4d-8376-fb44659e6487/volumes" Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.237766 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj"] Mar 18 15:41:58 crc kubenswrapper[4939]: W0318 15:41:58.243430 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd98d7316_9a8c_47ad_870f_ff1dde95989c.slice/crio-1be28b75feff42da2bfb168fe30400fc61d4ea5e3f2f22bdd0b991262b2b6d9d WatchSource:0}: Error finding container 1be28b75feff42da2bfb168fe30400fc61d4ea5e3f2f22bdd0b991262b2b6d9d: Status 404 returned error can't find the container with id 1be28b75feff42da2bfb168fe30400fc61d4ea5e3f2f22bdd0b991262b2b6d9d Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.309618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" event={"ID":"d98d7316-9a8c-47ad-870f-ff1dde95989c","Type":"ContainerStarted","Data":"1be28b75feff42da2bfb168fe30400fc61d4ea5e3f2f22bdd0b991262b2b6d9d"} Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.328299 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69bfc54c76-l58c7"] Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.743738 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.743804 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:41:58 crc kubenswrapper[4939]: I0318 15:41:58.808710 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.315784 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" event={"ID":"d98d7316-9a8c-47ad-870f-ff1dde95989c","Type":"ContainerStarted","Data":"a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb"} Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 
15:41:59.316116 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.317187 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" event={"ID":"517209f4-d504-46e5-98df-6b67bc2d6656","Type":"ContainerStarted","Data":"8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49"} Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.317338 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.317429 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" event={"ID":"517209f4-d504-46e5-98df-6b67bc2d6656","Type":"ContainerStarted","Data":"6c91baef83ee4ea507662546223aea078fb5bcd6a0470f4c716a45438f3878af"} Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.321321 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.321620 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.337312 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" podStartSLOduration=4.33729083 podStartE2EDuration="4.33729083s" podCreationTimestamp="2026-03-18 15:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:59.336338132 +0000 UTC m=+283.935525753" watchObservedRunningTime="2026-03-18 15:41:59.33729083 +0000 UTC m=+283.936478451" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.381102 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.381152 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.426646 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:41:59 crc kubenswrapper[4939]: I0318 15:41:59.444373 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" podStartSLOduration=4.444357772 podStartE2EDuration="4.444357772s" podCreationTimestamp="2026-03-18 15:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:59.392973858 +0000 UTC m=+283.992161479" watchObservedRunningTime="2026-03-18 15:41:59.444357772 +0000 UTC m=+284.043545393" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.128750 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564142-9rlv8"] Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.129486 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.131261 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.131573 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.131746 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.140092 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-9rlv8"] Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.221153 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skhrn\" (UniqueName: \"kubernetes.io/projected/6573991b-28f7-4030-8a1b-734a3a8e37a7-kube-api-access-skhrn\") pod \"auto-csr-approver-29564142-9rlv8\" (UID: \"6573991b-28f7-4030-8a1b-734a3a8e37a7\") " pod="openshift-infra/auto-csr-approver-29564142-9rlv8" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.322286 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skhrn\" (UniqueName: \"kubernetes.io/projected/6573991b-28f7-4030-8a1b-734a3a8e37a7-kube-api-access-skhrn\") pod \"auto-csr-approver-29564142-9rlv8\" (UID: \"6573991b-28f7-4030-8a1b-734a3a8e37a7\") " pod="openshift-infra/auto-csr-approver-29564142-9rlv8" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.341690 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skhrn\" (UniqueName: \"kubernetes.io/projected/6573991b-28f7-4030-8a1b-734a3a8e37a7-kube-api-access-skhrn\") pod \"auto-csr-approver-29564142-9rlv8\" (UID: \"6573991b-28f7-4030-8a1b-734a3a8e37a7\") " pod="openshift-infra/auto-csr-approver-29564142-9rlv8" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.446750 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.884741 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-9rlv8"] Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.950006 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:42:00 crc kubenswrapper[4939]: I0318 15:42:00.950089 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:42:01 crc kubenswrapper[4939]: I0318 15:42:01.012064 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:42:01 crc kubenswrapper[4939]: I0318 15:42:01.290036 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:42:01 crc kubenswrapper[4939]: I0318 15:42:01.290134 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:42:01 crc kubenswrapper[4939]: I0318 15:42:01.337396 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" event={"ID":"6573991b-28f7-4030-8a1b-734a3a8e37a7","Type":"ContainerStarted","Data":"19e4865cb296824083282a72f5b6ac56ec452b7d1b02417de0646e398fe2c159"} Mar 18 15:42:01 crc kubenswrapper[4939]: I0318 15:42:01.357239 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:42:01 crc kubenswrapper[4939]: I0318 15:42:01.388004 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:42:01 crc kubenswrapper[4939]: I0318 15:42:01.408639 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:42:02 crc kubenswrapper[4939]: I0318 15:42:02.189476 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:42:02 crc kubenswrapper[4939]: I0318 15:42:02.237201 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:42:02 crc kubenswrapper[4939]: I0318 15:42:02.342856 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" event={"ID":"6573991b-28f7-4030-8a1b-734a3a8e37a7","Type":"ContainerStarted","Data":"9a2ab29fc786c69f3ca74afa9bf789ad1ca8a360eaed26608a83aab7091a3faf"} Mar 18 15:42:02 crc kubenswrapper[4939]: I0318 15:42:02.359403 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" podStartSLOduration=1.306718625 podStartE2EDuration="2.359362352s" podCreationTimestamp="2026-03-18 15:42:00 +0000 UTC" firstStartedPulling="2026-03-18 15:42:00.904017949 +0000 UTC m=+285.503205590" lastFinishedPulling="2026-03-18 15:42:01.956661696 +0000 UTC m=+286.555849317" observedRunningTime="2026-03-18 15:42:02.355333355 +0000 UTC m=+286.954520986" watchObservedRunningTime="2026-03-18 15:42:02.359362352 +0000 UTC m=+286.958549993" Mar 18 15:42:02 crc kubenswrapper[4939]: I0318 15:42:02.872358 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:42:02 crc kubenswrapper[4939]: I0318 15:42:02.912632 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:42:03 crc kubenswrapper[4939]: I0318 15:42:03.355747 4939 generic.go:334] "Generic (PLEG): container finished" podID="6573991b-28f7-4030-8a1b-734a3a8e37a7" containerID="9a2ab29fc786c69f3ca74afa9bf789ad1ca8a360eaed26608a83aab7091a3faf" exitCode=0 Mar 18 15:42:03 crc kubenswrapper[4939]: I0318 15:42:03.355876 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" event={"ID":"6573991b-28f7-4030-8a1b-734a3a8e37a7","Type":"ContainerDied","Data":"9a2ab29fc786c69f3ca74afa9bf789ad1ca8a360eaed26608a83aab7091a3faf"} Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.604747 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmh9v"] Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.605352 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kmh9v" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="registry-server" containerID="cri-o://c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b" gracePeriod=2 Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.712654 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.788842 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skhrn\" (UniqueName: \"kubernetes.io/projected/6573991b-28f7-4030-8a1b-734a3a8e37a7-kube-api-access-skhrn\") pod \"6573991b-28f7-4030-8a1b-734a3a8e37a7\" (UID: \"6573991b-28f7-4030-8a1b-734a3a8e37a7\") " Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.804632 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6573991b-28f7-4030-8a1b-734a3a8e37a7-kube-api-access-skhrn" (OuterVolumeSpecName: "kube-api-access-skhrn") pod "6573991b-28f7-4030-8a1b-734a3a8e37a7" (UID: "6573991b-28f7-4030-8a1b-734a3a8e37a7"). InnerVolumeSpecName "kube-api-access-skhrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.810558 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sxvx"] Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.810892 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7sxvx" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="registry-server" containerID="cri-o://8a01a26becb6b6af79221d574cb0ecd40793b49474385bace7a345aa86baaf4e" gracePeriod=2 Mar 18 15:42:04 crc kubenswrapper[4939]: I0318 15:42:04.890973 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skhrn\" (UniqueName: \"kubernetes.io/projected/6573991b-28f7-4030-8a1b-734a3a8e37a7-kube-api-access-skhrn\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.240010 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.372221 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" event={"ID":"6573991b-28f7-4030-8a1b-734a3a8e37a7","Type":"ContainerDied","Data":"19e4865cb296824083282a72f5b6ac56ec452b7d1b02417de0646e398fe2c159"} Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.372284 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19e4865cb296824083282a72f5b6ac56ec452b7d1b02417de0646e398fe2c159" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.372289 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-9rlv8" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.375158 4939 generic.go:334] "Generic (PLEG): container finished" podID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerID="c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b" exitCode=0 Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.375212 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmh9v" event={"ID":"4c05afb5-9509-4e28-b0f8-ffe522fd31d3","Type":"ContainerDied","Data":"c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b"} Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.375260 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kmh9v" event={"ID":"4c05afb5-9509-4e28-b0f8-ffe522fd31d3","Type":"ContainerDied","Data":"83cc5ae52dae717233192a180c6865837fe45458a00e642bd782ace60d7c70c2"} Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.375282 4939 scope.go:117] "RemoveContainer" containerID="c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.375307 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kmh9v" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.397441 4939 scope.go:117] "RemoveContainer" containerID="911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.400887 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n275g\" (UniqueName: \"kubernetes.io/projected/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-kube-api-access-n275g\") pod \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.400996 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-utilities\") pod \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.401066 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-catalog-content\") pod \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\" (UID: \"4c05afb5-9509-4e28-b0f8-ffe522fd31d3\") " Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.402206 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-utilities" (OuterVolumeSpecName: "utilities") pod "4c05afb5-9509-4e28-b0f8-ffe522fd31d3" (UID: "4c05afb5-9509-4e28-b0f8-ffe522fd31d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.406058 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-kube-api-access-n275g" (OuterVolumeSpecName: "kube-api-access-n275g") pod "4c05afb5-9509-4e28-b0f8-ffe522fd31d3" (UID: "4c05afb5-9509-4e28-b0f8-ffe522fd31d3"). InnerVolumeSpecName "kube-api-access-n275g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.416029 4939 scope.go:117] "RemoveContainer" containerID="7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.426368 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c05afb5-9509-4e28-b0f8-ffe522fd31d3" (UID: "4c05afb5-9509-4e28-b0f8-ffe522fd31d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.431805 4939 scope.go:117] "RemoveContainer" containerID="c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b" Mar 18 15:42:05 crc kubenswrapper[4939]: E0318 15:42:05.432420 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b\": container with ID starting with c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b not found: ID does not exist" containerID="c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.432488 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b"} err="failed to get container status \"c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b\": rpc error: code = NotFound desc = could not find container \"c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b\": container with ID starting with c82790869cb9b00dce37932e865535edf289bfe1d2b4c08ab9c85dfbbd53cd8b not found: ID does not exist" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.432631 4939 scope.go:117] "RemoveContainer" containerID="911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586" Mar 18 15:42:05 crc kubenswrapper[4939]: E0318 15:42:05.433082 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586\": container with ID starting with 911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586 not found: ID does not exist" containerID="911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.433122 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586"} err="failed to get container status \"911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586\": rpc error: code = NotFound desc = could not find container \"911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586\": container with ID starting with 911b8697d2db05e94f654c6cfae5247424ab7fd0ccac22045d7250ec07a46586 not found: ID does not exist" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.433151 4939 scope.go:117] "RemoveContainer" containerID="7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da" Mar 18 15:42:05 crc kubenswrapper[4939]: E0318 15:42:05.433472 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da\": container with ID starting with 7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da not found: ID does not exist" containerID="7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.433520 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da"} err="failed to get container status \"7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da\": rpc error: code = NotFound desc = could not 
find container \"7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da\": container with ID starting with 7ebab0a79f14dcf04c1cb56c5e0ed916c607b4d154b01c3b17af2fdb80d508da not found: ID does not exist" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.503582 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n275g\" (UniqueName: \"kubernetes.io/projected/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-kube-api-access-n275g\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.503657 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.503683 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c05afb5-9509-4e28-b0f8-ffe522fd31d3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.728836 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmh9v"] Mar 18 15:42:05 crc kubenswrapper[4939]: I0318 15:42:05.733127 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kmh9v"] Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.145398 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" path="/var/lib/kubelet/pods/4c05afb5-9509-4e28-b0f8-ffe522fd31d3/volumes" Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.383410 4939 generic.go:334] "Generic (PLEG): container finished" podID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerID="8a01a26becb6b6af79221d574cb0ecd40793b49474385bace7a345aa86baaf4e" exitCode=0 Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.383515 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sxvx" event={"ID":"c68ecdb5-45f3-4f17-8258-16c252e1cd7f","Type":"ContainerDied","Data":"8a01a26becb6b6af79221d574cb0ecd40793b49474385bace7a345aa86baaf4e"} Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.518705 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.621392 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-catalog-content\") pod \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.621583 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqt9l\" (UniqueName: \"kubernetes.io/projected/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-kube-api-access-nqt9l\") pod \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.621736 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-utilities\") pod \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\" (UID: \"c68ecdb5-45f3-4f17-8258-16c252e1cd7f\") " Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.622878 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-utilities" (OuterVolumeSpecName: "utilities") pod "c68ecdb5-45f3-4f17-8258-16c252e1cd7f" (UID: "c68ecdb5-45f3-4f17-8258-16c252e1cd7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.623281 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.629840 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-kube-api-access-nqt9l" (OuterVolumeSpecName: "kube-api-access-nqt9l") pod "c68ecdb5-45f3-4f17-8258-16c252e1cd7f" (UID: "c68ecdb5-45f3-4f17-8258-16c252e1cd7f"). InnerVolumeSpecName "kube-api-access-nqt9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.725007 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqt9l\" (UniqueName: \"kubernetes.io/projected/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-kube-api-access-nqt9l\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.752761 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c68ecdb5-45f3-4f17-8258-16c252e1cd7f" (UID: "c68ecdb5-45f3-4f17-8258-16c252e1cd7f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:42:06 crc kubenswrapper[4939]: I0318 15:42:06.826682 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68ecdb5-45f3-4f17-8258-16c252e1cd7f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:07 crc kubenswrapper[4939]: I0318 15:42:07.397355 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sxvx" event={"ID":"c68ecdb5-45f3-4f17-8258-16c252e1cd7f","Type":"ContainerDied","Data":"6daf1abbb9b04eea4701672ada3f243757a8526529510f1edca030b01217adf0"} Mar 18 15:42:07 crc kubenswrapper[4939]: I0318 15:42:07.397439 4939 scope.go:117] "RemoveContainer" containerID="8a01a26becb6b6af79221d574cb0ecd40793b49474385bace7a345aa86baaf4e" Mar 18 15:42:07 crc kubenswrapper[4939]: I0318 15:42:07.397533 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7sxvx" Mar 18 15:42:07 crc kubenswrapper[4939]: I0318 15:42:07.433456 4939 scope.go:117] "RemoveContainer" containerID="9878991fdcab666fcf7f68eb87090b92a271fd8539d67282d1342ebbbf5413f1" Mar 18 15:42:07 crc kubenswrapper[4939]: I0318 15:42:07.455917 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sxvx"] Mar 18 15:42:07 crc kubenswrapper[4939]: I0318 15:42:07.468793 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7sxvx"] Mar 18 15:42:07 crc kubenswrapper[4939]: I0318 15:42:07.473788 4939 scope.go:117] "RemoveContainer" containerID="2ebfff8168ac5efccf638f36a00ef0d03e2001455ccd58c56cdf095d7b153bb3" Mar 18 15:42:08 crc kubenswrapper[4939]: I0318 15:42:08.143725 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" path="/var/lib/kubelet/pods/c68ecdb5-45f3-4f17-8258-16c252e1cd7f/volumes" Mar 18 15:42:08 crc kubenswrapper[4939]: I0318 15:42:08.792051 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:42:09 crc kubenswrapper[4939]: I0318 15:42:09.412467 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:42:13 crc kubenswrapper[4939]: I0318 15:42:13.689624 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" podUID="604dd149-ccc8-492a-a624-fd3088ed3bab" containerName="oauth-openshift" containerID="cri-o://b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8" gracePeriod=15 Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.170613 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.352731 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-trusted-ca-bundle\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.352800 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-router-certs\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.352904 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-cliconfig\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.352929 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-policies\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.352960 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-idp-0-file-data\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.352979 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-dir\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353000 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpsc\" (UniqueName: \"kubernetes.io/projected/604dd149-ccc8-492a-a624-fd3088ed3bab-kube-api-access-2hpsc\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353032 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-service-ca\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353054 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-error\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353061 4939 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353092 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-ocp-branding-template\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353170 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-session\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353208 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-login\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353244 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-serving-cert\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353274 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-provider-selection\") pod \"604dd149-ccc8-492a-a624-fd3088ed3bab\" (UID: \"604dd149-ccc8-492a-a624-fd3088ed3bab\") " Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353573 4939 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353667 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.353745 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.354164 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.354769 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.358659 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.359025 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.359135 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.359595 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.361316 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.361330 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604dd149-ccc8-492a-a624-fd3088ed3bab-kube-api-access-2hpsc" (OuterVolumeSpecName: "kube-api-access-2hpsc") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "kube-api-access-2hpsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.361988 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.362851 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.371058 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "604dd149-ccc8-492a-a624-fd3088ed3bab" (UID: "604dd149-ccc8-492a-a624-fd3088ed3bab"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.432367 4939 generic.go:334] "Generic (PLEG): container finished" podID="604dd149-ccc8-492a-a624-fd3088ed3bab" containerID="b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8" exitCode=0 Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.432397 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.432413 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" event={"ID":"604dd149-ccc8-492a-a624-fd3088ed3bab","Type":"ContainerDied","Data":"b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8"} Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.432887 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9pp46" event={"ID":"604dd149-ccc8-492a-a624-fd3088ed3bab","Type":"ContainerDied","Data":"51a71d8d9fb1736539716d873b82aa5675de17794f330fd01e41bd19b93c373f"} Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.432923 4939 scope.go:117] "RemoveContainer" containerID="b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.459986 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460035 4939 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460058 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460080 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpsc\" (UniqueName: \"kubernetes.io/projected/604dd149-ccc8-492a-a624-fd3088ed3bab-kube-api-access-2hpsc\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460101 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460119 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460142 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460161 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460180 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-login\") on node 
\"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460197 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460216 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460236 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.460255 4939 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/604dd149-ccc8-492a-a624-fd3088ed3bab-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.467615 4939 scope.go:117] "RemoveContainer" containerID="b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.467814 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9pp46"] Mar 18 15:42:14 crc kubenswrapper[4939]: E0318 15:42:14.476155 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8\": container with ID starting with b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8 not found: ID does not exist" containerID="b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.476231 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8"} err="failed to get container status \"b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8\": rpc error: code = NotFound desc = could not find container \"b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8\": container with ID starting with b58532f97a98f5c63aa0c68477b035dfac8969bc8529998b481aed00ffa2aaf8 not found: ID does not exist" Mar 18 15:42:14 crc kubenswrapper[4939]: I0318 15:42:14.488351 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9pp46"] Mar 18 15:42:15 crc kubenswrapper[4939]: I0318 15:42:15.840258 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69bfc54c76-l58c7"] Mar 18 15:42:15 crc kubenswrapper[4939]: I0318 15:42:15.841726 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" containerName="controller-manager" containerID="cri-o://8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49" gracePeriod=30 Mar 18 15:42:15 crc kubenswrapper[4939]: I0318 15:42:15.930732 4939 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj"] Mar 18 15:42:15 crc kubenswrapper[4939]: I0318 15:42:15.930938 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" containerName="route-controller-manager" containerID="cri-o://a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb" gracePeriod=30 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.008765 4939 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009053 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="registry-server" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009072 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="registry-server" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009086 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="extract-content" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009094 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="extract-content" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009111 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="extract-content" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009118 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="extract-content" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009128 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604dd149-ccc8-492a-a624-fd3088ed3bab" containerName="oauth-openshift" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009135 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="604dd149-ccc8-492a-a624-fd3088ed3bab" containerName="oauth-openshift" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009148 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="extract-utilities" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009156 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="extract-utilities" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009167 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="registry-server" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009175 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="registry-server" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009185 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6573991b-28f7-4030-8a1b-734a3a8e37a7" containerName="oc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009193 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6573991b-28f7-4030-8a1b-734a3a8e37a7" containerName="oc" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.009207 4939 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="extract-utilities" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009215 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="extract-utilities" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009372 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6573991b-28f7-4030-8a1b-734a3a8e37a7" containerName="oc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009385 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="604dd149-ccc8-492a-a624-fd3088ed3bab" containerName="oauth-openshift" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009397 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68ecdb5-45f3-4f17-8258-16c252e1cd7f" containerName="registry-server" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009405 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c05afb5-9509-4e28-b0f8-ffe522fd31d3" containerName="registry-server" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009784 4939 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.009893 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010243 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1" gracePeriod=15 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010299 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6" gracePeriod=15 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010383 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04" gracePeriod=15 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010409 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0" gracePeriod=15 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010451 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae" gracePeriod=15 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010752 4939 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 
15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.010889 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010904 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.010912 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010918 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.010927 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010933 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.010940 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010946 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.010957 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010962 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.010971 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.010976 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.010986 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011013 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.011021 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011027 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011116 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011128 
4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011137 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011145 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011151 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011159 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011168 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011175 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.011289 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011300 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.011310 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011315 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.011402 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.078949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.079025 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.079054 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.079103 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.079142 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.079162 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.079196 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.079215 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.136678 4939 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.145965 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604dd149-ccc8-492a-a624-fd3088ed3bab" path="/var/lib/kubelet/pods/604dd149-ccc8-492a-a624-fd3088ed3bab/volumes" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.180405 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.180527 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.180945 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.180988 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181022 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181043 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181074 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181113 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181130 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181487 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181551 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc 
kubenswrapper[4939]: I0318 15:42:16.181671 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181710 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181671 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181759 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.181779 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.382662 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.383282 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.385908 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.386321 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.386981 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.446901 4939 generic.go:334] "Generic (PLEG): container finished" podID="d98d7316-9a8c-47ad-870f-ff1dde95989c" containerID="a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb" exitCode=0 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.446952 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" event={"ID":"d98d7316-9a8c-47ad-870f-ff1dde95989c","Type":"ContainerDied","Data":"a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb"} Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.446976 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" event={"ID":"d98d7316-9a8c-47ad-870f-ff1dde95989c","Type":"ContainerDied","Data":"1be28b75feff42da2bfb168fe30400fc61d4ea5e3f2f22bdd0b991262b2b6d9d"} Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.446995 4939 scope.go:117] "RemoveContainer" containerID="a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.447074 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.447787 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.448360 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.451761 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.453128 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.453895 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6" exitCode=0 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.453915 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04" exitCode=0 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.453923 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae" exitCode=0 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.453930 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0" exitCode=2 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.457290 4939 generic.go:334] "Generic (PLEG): container finished" podID="4414f1c6-6e94-4dc2-80df-af1f546ae085" containerID="4126abbef5d1087b8a884958abbfe2c215ce88d6be49fc3d3ffbdcd40553e98f" exitCode=0 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.457348 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4414f1c6-6e94-4dc2-80df-af1f546ae085","Type":"ContainerDied","Data":"4126abbef5d1087b8a884958abbfe2c215ce88d6be49fc3d3ffbdcd40553e98f"} Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.457979 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: 
I0318 15:42:16.458256 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.458658 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.458846 4939 generic.go:334] "Generic (PLEG): container finished" podID="517209f4-d504-46e5-98df-6b67bc2d6656" containerID="8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49" exitCode=0 Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.458869 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" event={"ID":"517209f4-d504-46e5-98df-6b67bc2d6656","Type":"ContainerDied","Data":"8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49"} Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.458905 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" event={"ID":"517209f4-d504-46e5-98df-6b67bc2d6656","Type":"ContainerDied","Data":"6c91baef83ee4ea507662546223aea078fb5bcd6a0470f4c716a45438f3878af"} Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.458908 4939 util.go:48] "No ready sandbox for pod can be found. 
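
Each "Generic (PLEG): container finished" / "SyncLoop (PLEG): event for pod" pair above is the Pod Lifecycle Event Generator relisting containers through CRI and pushing ContainerDied events into the kubelet sync loop: one for the exited container, then one for its pod sandbox. The following is a toy Go declaration of the event shape as it is serialized in these lines (event={"ID":...,"Type":...,"Data":...}); kubelet's actual internal types may differ in detail.

    package pleg

    // Toy model of the serialized PLEG event seen in this log;
    // not a copy of kubelet's internal definitions.
    type PodLifecycleEventType string

    const (
            ContainerStarted PodLifecycleEventType = "ContainerStarted"
            ContainerDied    PodLifecycleEventType = "ContainerDied"
    )

    type PodLifecycleEvent struct {
            ID   string                // pod UID, e.g. "517209f4-d504-46e5-98df-6b67bc2d6656"
            Type PodLifecycleEventType // transition the relist observed
            Data string                // container or sandbox ID the event refers to
    }
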
Need to start a new one" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.459368 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.459736 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.460064 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.469750 4939 scope.go:117] "RemoveContainer" containerID="a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.470109 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb\": container with ID starting with a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb not found: ID does not exist" containerID="a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.470182 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb"} err="failed to get container status \"a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb\": rpc error: code = NotFound desc = could not find container \"a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb\": container with ID starting with a4811d967501f14651364126e7b17e9e4012c173fe3e4cbb2dd48a1b1053e4cb not found: ID does not exist" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.470230 4939 scope.go:117] "RemoveContainer" containerID="2b2984322f2a996d5be2db7edd84294efd2f5bc229f90afd455450170dc2c550" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484140 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-config\") pod \"d98d7316-9a8c-47ad-870f-ff1dde95989c\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484197 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-client-ca\") pod \"517209f4-d504-46e5-98df-6b67bc2d6656\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484218 4939 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-config\") pod \"517209f4-d504-46e5-98df-6b67bc2d6656\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484257 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/517209f4-d504-46e5-98df-6b67bc2d6656-serving-cert\") pod \"517209f4-d504-46e5-98df-6b67bc2d6656\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484279 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54vjx\" (UniqueName: \"kubernetes.io/projected/517209f4-d504-46e5-98df-6b67bc2d6656-kube-api-access-54vjx\") pod \"517209f4-d504-46e5-98df-6b67bc2d6656\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484306 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d98d7316-9a8c-47ad-870f-ff1dde95989c-serving-cert\") pod \"d98d7316-9a8c-47ad-870f-ff1dde95989c\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484328 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-client-ca\") pod \"d98d7316-9a8c-47ad-870f-ff1dde95989c\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.484344 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dnfk\" (UniqueName: \"kubernetes.io/projected/d98d7316-9a8c-47ad-870f-ff1dde95989c-kube-api-access-9dnfk\") pod \"d98d7316-9a8c-47ad-870f-ff1dde95989c\" (UID: \"d98d7316-9a8c-47ad-870f-ff1dde95989c\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.485015 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-config" (OuterVolumeSpecName: "config") pod "d98d7316-9a8c-47ad-870f-ff1dde95989c" (UID: "d98d7316-9a8c-47ad-870f-ff1dde95989c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.485974 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-client-ca" (OuterVolumeSpecName: "client-ca") pod "517209f4-d504-46e5-98df-6b67bc2d6656" (UID: "517209f4-d504-46e5-98df-6b67bc2d6656"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.486002 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d98d7316-9a8c-47ad-870f-ff1dde95989c" (UID: "d98d7316-9a8c-47ad-870f-ff1dde95989c"). InnerVolumeSpecName "client-ca". 
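
The reconciler_common / operation_generator lines here follow one pattern per volume: "UnmountVolume started", then "UnmountVolume.TearDown succeeded", then a later "Volume detached ... DevicePath \"\"" confirmation. When triaging which volumes of a deleted pod were actually torn down, a small filter over a saved copy of this journal helps; a sketch, assuming one journal entry per line and a hypothetical journal.log path.

    package main

    import (
            "bufio"
            "fmt"
            "os"
            "regexp"
    )

    // Matches the escaped quoting as it appears in the raw journal text,
    // e.g.: UnmountVolume started for volume \"config\" ... pod \"d98d...\"
    var unmountRe = regexp.MustCompile(
            `UnmountVolume started for volume \\"([^"\\]+)\\".*? pod \\"([0-9a-f-]+)\\"`)

    func main() {
            f, err := os.Open("journal.log") // hypothetical saved journal
            if err != nil {
                    panic(err)
            }
            defer f.Close()
            sc := bufio.NewScanner(f)
            sc.Buffer(make([]byte, 0, 64*1024), 1024*1024) // journal lines are long
            for sc.Scan() {
                    if m := unmountRe.FindStringSubmatch(sc.Text()); m != nil {
                            fmt.Printf("volume=%s pod=%s\n", m[1], m[2])
                    }
            }
    }
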
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.486578 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-config" (OuterVolumeSpecName: "config") pod "517209f4-d504-46e5-98df-6b67bc2d6656" (UID: "517209f4-d504-46e5-98df-6b67bc2d6656"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.489271 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98d7316-9a8c-47ad-870f-ff1dde95989c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d98d7316-9a8c-47ad-870f-ff1dde95989c" (UID: "d98d7316-9a8c-47ad-870f-ff1dde95989c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.489395 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517209f4-d504-46e5-98df-6b67bc2d6656-kube-api-access-54vjx" (OuterVolumeSpecName: "kube-api-access-54vjx") pod "517209f4-d504-46e5-98df-6b67bc2d6656" (UID: "517209f4-d504-46e5-98df-6b67bc2d6656"). InnerVolumeSpecName "kube-api-access-54vjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.490772 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517209f4-d504-46e5-98df-6b67bc2d6656-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "517209f4-d504-46e5-98df-6b67bc2d6656" (UID: "517209f4-d504-46e5-98df-6b67bc2d6656"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.490800 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98d7316-9a8c-47ad-870f-ff1dde95989c-kube-api-access-9dnfk" (OuterVolumeSpecName: "kube-api-access-9dnfk") pod "d98d7316-9a8c-47ad-870f-ff1dde95989c" (UID: "d98d7316-9a8c-47ad-870f-ff1dde95989c"). InnerVolumeSpecName "kube-api-access-9dnfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.501205 4939 scope.go:117] "RemoveContainer" containerID="8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.520586 4939 scope.go:117] "RemoveContainer" containerID="8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49" Mar 18 15:42:16 crc kubenswrapper[4939]: E0318 15:42:16.521469 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49\": container with ID starting with 8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49 not found: ID does not exist" containerID="8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.521529 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49"} err="failed to get container status \"8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49\": rpc error: code = NotFound desc = could not find container \"8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49\": container with ID starting with 8205c3115523fb1c3f3793517238a4b6e715ecdd25cf4bb5e076634ed7278e49 not found: ID does not exist" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.585705 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-proxy-ca-bundles\") pod \"517209f4-d504-46e5-98df-6b67bc2d6656\" (UID: \"517209f4-d504-46e5-98df-6b67bc2d6656\") " Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586364 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "517209f4-d504-46e5-98df-6b67bc2d6656" (UID: "517209f4-d504-46e5-98df-6b67bc2d6656"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586471 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/517209f4-d504-46e5-98df-6b67bc2d6656-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586537 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54vjx\" (UniqueName: \"kubernetes.io/projected/517209f4-d504-46e5-98df-6b67bc2d6656-kube-api-access-54vjx\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586557 4939 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d98d7316-9a8c-47ad-870f-ff1dde95989c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586577 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586594 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dnfk\" (UniqueName: \"kubernetes.io/projected/d98d7316-9a8c-47ad-870f-ff1dde95989c-kube-api-access-9dnfk\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586613 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98d7316-9a8c-47ad-870f-ff1dde95989c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586633 4939 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.586651 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.687497 4939 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/517209f4-d504-46e5-98df-6b67bc2d6656-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.767746 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.768234 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.768721 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.789404 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.789800 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:16 crc kubenswrapper[4939]: I0318 15:42:16.790387 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.469227 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.709196 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.709887 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.710327 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.711004 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: E0318 15:42:17.815055 4939 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: E0318 15:42:17.816023 4939 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: E0318 15:42:17.816802 4939 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: E0318 15:42:17.817365 4939 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: E0318 15:42:17.817828 4939 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.817872 4939 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 15:42:17 crc kubenswrapper[4939]: E0318 15:42:17.818215 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="200ms" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.903016 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-var-lock\") pod \"4414f1c6-6e94-4dc2-80df-af1f546ae085\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.903099 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-kubelet-dir\") pod \"4414f1c6-6e94-4dc2-80df-af1f546ae085\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.903146 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4414f1c6-6e94-4dc2-80df-af1f546ae085-kube-api-access\") pod \"4414f1c6-6e94-4dc2-80df-af1f546ae085\" (UID: \"4414f1c6-6e94-4dc2-80df-af1f546ae085\") " Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.903164 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-var-lock" (OuterVolumeSpecName: "var-lock") pod "4414f1c6-6e94-4dc2-80df-af1f546ae085" (UID: "4414f1c6-6e94-4dc2-80df-af1f546ae085"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.903255 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4414f1c6-6e94-4dc2-80df-af1f546ae085" (UID: "4414f1c6-6e94-4dc2-80df-af1f546ae085"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.903601 4939 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.903623 4939 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4414f1c6-6e94-4dc2-80df-af1f546ae085-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:17 crc kubenswrapper[4939]: I0318 15:42:17.908204 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4414f1c6-6e94-4dc2-80df-af1f546ae085-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4414f1c6-6e94-4dc2-80df-af1f546ae085" (UID: "4414f1c6-6e94-4dc2-80df-af1f546ae085"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.004739 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4414f1c6-6e94-4dc2-80df-af1f546ae085-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.019724 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="400ms" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.355272 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.356663 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.357676 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.357942 4939 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.358205 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.358561 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.420944 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="800ms" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.478670 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.478684 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4414f1c6-6e94-4dc2-80df-af1f546ae085","Type":"ContainerDied","Data":"cb5ada438014ebae8bd290c6edcf171ae7a6a7aaed5b71b531e8cfe530e7ac34"} Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.478731 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb5ada438014ebae8bd290c6edcf171ae7a6a7aaed5b71b531e8cfe530e7ac34" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.483881 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.485038 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1" exitCode=0 Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.485074 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.485107 4939 scope.go:117] "RemoveContainer" containerID="79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.485290 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.485438 4939 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.486103 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.486500 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509096 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509180 4939 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509224 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509382 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509401 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509447 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509706 4939 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509733 4939 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.509750 4939 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.510708 4939 scope.go:117] "RemoveContainer" containerID="11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.528470 4939 scope.go:117] "RemoveContainer" containerID="299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.549781 4939 scope.go:117] "RemoveContainer" containerID="85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.569206 4939 scope.go:117] "RemoveContainer" containerID="da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.589143 4939 scope.go:117] "RemoveContainer" containerID="53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.616403 4939 scope.go:117] "RemoveContainer" containerID="79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.617026 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\": container with ID starting with 79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6 not found: ID does not exist" containerID="79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.617085 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6"} err="failed to get container status \"79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\": rpc error: code = NotFound desc = could not find container \"79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6\": container with ID starting with 79a4ba91ee4a18115e29c793bd85ff3c63617ce8045ae8474a2005061adbe6f6 not found: ID does not exist" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.617123 4939 scope.go:117] "RemoveContainer" containerID="11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.617909 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\": container with ID starting with 11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04 not found: ID does not exist" 
containerID="11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.617956 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04"} err="failed to get container status \"11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\": rpc error: code = NotFound desc = could not find container \"11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04\": container with ID starting with 11fb7b9f433d9b1ec3e3901414493ccf72a301aafdae7087a856ec623d6a0f04 not found: ID does not exist" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.617990 4939 scope.go:117] "RemoveContainer" containerID="299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.618391 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\": container with ID starting with 299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae not found: ID does not exist" containerID="299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.618555 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae"} err="failed to get container status \"299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\": rpc error: code = NotFound desc = could not find container \"299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae\": container with ID starting with 299ef494767ea384fe8154b4a66377200da7a6822d21f0fb2db8424eb7113cae not found: ID does not exist" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.618589 4939 scope.go:117] "RemoveContainer" containerID="85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.619174 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\": container with ID starting with 85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0 not found: ID does not exist" containerID="85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.619245 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0"} err="failed to get container status \"85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\": rpc error: code = NotFound desc = could not find container \"85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0\": container with ID starting with 85f67b922ba526f1f32ff4e1088b5f4bbb0a1a6778e37c5753ecc413961825f0 not found: ID does not exist" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.619298 4939 scope.go:117] "RemoveContainer" containerID="da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.619683 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\": container with ID starting with da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1 not found: ID does not exist" containerID="da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.619711 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1"} err="failed to get container status \"da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\": rpc error: code = NotFound desc = could not find container \"da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1\": container with ID starting with da61aadd9d5765311d7925e22819ef64f45d52a141d3c3d6797ea650df2bccc1 not found: ID does not exist" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.619727 4939 scope.go:117] "RemoveContainer" containerID="53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.620039 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\": container with ID starting with 53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44 not found: ID does not exist" containerID="53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.620066 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44"} err="failed to get container status \"53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\": rpc error: code = NotFound desc = could not find container \"53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44\": container with ID starting with 53a3f1704c0da9c46823cefb7b3db98d50b2a8e096f6648ed680d0c9ae1d3d44 not found: ID does not exist" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.814496 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.815753 4939 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.816560 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: I0318 15:42:18.816939 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" 
pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.888067 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:42:18Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:42:18Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:42:18Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:42:18Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.888828 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.889310 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.889930 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.890647 4939 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:18 crc kubenswrapper[4939]: E0318 15:42:18.890680 4939 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:42:19 crc kubenswrapper[4939]: E0318 15:42:19.221630 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="1.6s" Mar 18 15:42:20 crc kubenswrapper[4939]: I0318 15:42:20.139194 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 15:42:20 crc kubenswrapper[4939]: E0318 15:42:20.823170 4939 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="3.2s" Mar 18 15:42:21 crc kubenswrapper[4939]: E0318 15:42:21.070912 4939 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:21 crc kubenswrapper[4939]: I0318 15:42:21.071341 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:21 crc kubenswrapper[4939]: W0318 15:42:21.092016 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4ca0d337be950f54e0ec3b54c07ece9b51908dd30d1e204be82c4be3933ca658 WatchSource:0}: Error finding container 4ca0d337be950f54e0ec3b54c07ece9b51908dd30d1e204be82c4be3933ca658: Status 404 returned error can't find the container with id 4ca0d337be950f54e0ec3b54c07ece9b51908dd30d1e204be82c4be3933ca658 Mar 18 15:42:21 crc kubenswrapper[4939]: E0318 15:42:21.095635 4939 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.227:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189df9d9914e9696 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:42:21.09490959 +0000 UTC m=+305.694097211,LastTimestamp:2026-03-18 15:42:21.09490959 +0000 UTC m=+305.694097211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:42:21 crc kubenswrapper[4939]: I0318 15:42:21.505653 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff"} Mar 18 15:42:21 crc kubenswrapper[4939]: I0318 15:42:21.505715 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4ca0d337be950f54e0ec3b54c07ece9b51908dd30d1e204be82c4be3933ca658"} Mar 18 15:42:21 crc kubenswrapper[4939]: I0318 15:42:21.506434 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:21 crc kubenswrapper[4939]: E0318 15:42:21.506446 4939 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:42:21 crc kubenswrapper[4939]: I0318 15:42:21.506967 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:21 crc kubenswrapper[4939]: I0318 15:42:21.507889 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:23 crc kubenswrapper[4939]: E0318 15:42:23.174345 4939 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.227:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" volumeName="registry-storage" Mar 18 15:42:24 crc kubenswrapper[4939]: E0318 15:42:24.024626 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="6.4s" Mar 18 15:42:26 crc kubenswrapper[4939]: I0318 15:42:26.137405 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:26 crc kubenswrapper[4939]: I0318 15:42:26.138339 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:26 crc kubenswrapper[4939]: I0318 15:42:26.138758 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.558008 
4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.559056 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.559116 4939 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856" exitCode=1 Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.559157 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856"} Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.559728 4939 scope.go:117] "RemoveContainer" containerID="1f2d1bdb26badf8b9f8c6487a48238fde933411416bc9459d6ad9d1b9fa11856" Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.559948 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.560402 4939 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.561042 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:28 crc kubenswrapper[4939]: I0318 15:42:28.561324 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:29 crc kubenswrapper[4939]: E0318 15:42:29.004537 4939 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.227:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189df9d9914e9696 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:42:21.09490959 +0000 UTC m=+305.694097211,LastTimestamp:2026-03-18 15:42:21.09490959 +0000 UTC m=+305.694097211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.510817 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.574715 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.575213 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.575257 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"648d3dd230b7a9bed15c8f87a10152e9b83f79fa5875f0747246027c0d5c5dc3"} Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.576261 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.576670 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.576987 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:29 crc kubenswrapper[4939]: I0318 15:42:29.577358 4939 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 
15:42:30.132926 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.134376 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.135248 4939 status_manager.go:851] "Failed to get status for pod" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.136050 4939 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.136634 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.155667 4939 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.155723 4939 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:30 crc kubenswrapper[4939]: E0318 15:42:30.156247 4939 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.156915 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:30 crc kubenswrapper[4939]: W0318 15:42:30.189091 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-52deefde273dcf6fe5c67cd6f8f1c0dba7c0b3419c80ac00f465298dc23d05c0 WatchSource:0}: Error finding container 52deefde273dcf6fe5c67cd6f8f1c0dba7c0b3419c80ac00f465298dc23d05c0: Status 404 returned error can't find the container with id 52deefde273dcf6fe5c67cd6f8f1c0dba7c0b3419c80ac00f465298dc23d05c0 Mar 18 15:42:30 crc kubenswrapper[4939]: E0318 15:42:30.425894 4939 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="7s" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.584360 4939 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8eb4b7c779935ffb99a4d761abd3849af4ce66a215e4555fe95ef08d006a1918" exitCode=0 Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.584425 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8eb4b7c779935ffb99a4d761abd3849af4ce66a215e4555fe95ef08d006a1918"} Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.584550 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"52deefde273dcf6fe5c67cd6f8f1c0dba7c0b3419c80ac00f465298dc23d05c0"} Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.585166 4939 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.585200 4939 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.585878 4939 status_manager.go:851] "Failed to get status for pod" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: E0318 15:42:30.585886 4939 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.586287 4939 status_manager.go:851] "Failed to get status for pod" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" pod="openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-59b5c4d47f-4qcwj\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.586704 4939 status_manager.go:851] "Failed to get status for pod" 
podUID="517209f4-d504-46e5-98df-6b67bc2d6656" pod="openshift-controller-manager/controller-manager-69bfc54c76-l58c7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-69bfc54c76-l58c7\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:30 crc kubenswrapper[4939]: I0318 15:42:30.587174 4939 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Mar 18 15:42:31 crc kubenswrapper[4939]: I0318 15:42:31.597308 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"712fa6b7dd38711951b5f64a5ef78281ef99e091f721f17923265dcff96dcdad"} Mar 18 15:42:31 crc kubenswrapper[4939]: I0318 15:42:31.597644 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"99a9ce1202ba876928df08f9035a9d010e4676e7694d3f69ea1c619a3afa593e"} Mar 18 15:42:31 crc kubenswrapper[4939]: I0318 15:42:31.597660 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5f5bae0968a6dcb9e077eedf1f547dcb73cf61c7299898966470f3779d165c05"} Mar 18 15:42:31 crc kubenswrapper[4939]: I0318 15:42:31.597673 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"82cbc67f12a46782fad1f4b2bb17eb709a7637617223cd35317bff36b3e2ff61"} Mar 18 15:42:32 crc kubenswrapper[4939]: I0318 15:42:32.606313 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e34a36a450679aa5597b66ce171ac796b6d5117281ea1043c59e2992cd34680a"} Mar 18 15:42:32 crc kubenswrapper[4939]: I0318 15:42:32.606811 4939 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:32 crc kubenswrapper[4939]: I0318 15:42:32.606849 4939 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:35 crc kubenswrapper[4939]: I0318 15:42:35.157452 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:35 crc kubenswrapper[4939]: I0318 15:42:35.158689 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:35 crc kubenswrapper[4939]: I0318 15:42:35.168686 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.617763 4939 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.647293 4939 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.647390 4939 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.647407 4939 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.651606 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.756356 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9dce5dbc-3273-440e-b42d-bc301d2e94e6" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.827270 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:42:37 crc kubenswrapper[4939]: I0318 15:42:37.831829 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:42:38 crc kubenswrapper[4939]: I0318 15:42:38.651710 4939 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:38 crc kubenswrapper[4939]: I0318 15:42:38.651743 4939 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:38 crc kubenswrapper[4939]: I0318 15:42:38.651797 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:42:38 crc kubenswrapper[4939]: I0318 15:42:38.655732 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9dce5dbc-3273-440e-b42d-bc301d2e94e6" Mar 18 15:42:38 crc kubenswrapper[4939]: I0318 15:42:38.655852 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:42:39 crc kubenswrapper[4939]: I0318 15:42:39.665999 4939 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:39 crc kubenswrapper[4939]: I0318 15:42:39.666049 4939 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:39 crc kubenswrapper[4939]: I0318 15:42:39.669711 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9dce5dbc-3273-440e-b42d-bc301d2e94e6" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.280550 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.280982 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.281185 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.281316 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.283429 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.283824 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.284424 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.293346 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.293985 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.300182 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.307592 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:42:42 crc 
kubenswrapper[4939]: I0318 15:42:42.308430 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.463141 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.470675 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:42:42 crc kubenswrapper[4939]: I0318 15:42:42.482423 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:42:42 crc kubenswrapper[4939]: W0318 15:42:42.937066 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-2220ca7f7d85b17aa8f5cde24c94e96a0a90feddb7352f67289b2f9b5f5aa26e WatchSource:0}: Error finding container 2220ca7f7d85b17aa8f5cde24c94e96a0a90feddb7352f67289b2f9b5f5aa26e: Status 404 returned error can't find the container with id 2220ca7f7d85b17aa8f5cde24c94e96a0a90feddb7352f67289b2f9b5f5aa26e Mar 18 15:42:42 crc kubenswrapper[4939]: W0318 15:42:42.969716 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-184d603cef7a21a47021263f78bad912e21afc1b308b70f5d54ae0619312f42d WatchSource:0}: Error finding container 184d603cef7a21a47021263f78bad912e21afc1b308b70f5d54ae0619312f42d: Status 404 returned error can't find the container with id 184d603cef7a21a47021263f78bad912e21afc1b308b70f5d54ae0619312f42d Mar 18 15:42:43 crc kubenswrapper[4939]: I0318 15:42:43.692493 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6016dd145abfbe69ade0d51985969b876c10dec5ab48bc5d589a1b87ae805f4d"} Mar 18 15:42:43 crc kubenswrapper[4939]: I0318 15:42:43.693119 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"184d603cef7a21a47021263f78bad912e21afc1b308b70f5d54ae0619312f42d"} Mar 18 15:42:43 crc kubenswrapper[4939]: I0318 15:42:43.694704 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1249eb0985893004ff4f726c5d83333c28871aa2b0e0c60d14573f0268f0ff51"} Mar 18 15:42:43 crc kubenswrapper[4939]: I0318 15:42:43.694783 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7b6c5e608cc62d738982e204dbd9573ab9ffce81d6376ed3e0e755eb365dbc9e"} Mar 18 15:42:43 crc kubenswrapper[4939]: I0318 15:42:43.694980 4939 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:42:43 crc kubenswrapper[4939]: I0318 15:42:43.696996 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"60f2740d996cb1cedd0aa13380c7b9ec8b5778423098b406d6d01dc43a286540"} Mar 18 15:42:43 crc kubenswrapper[4939]: I0318 15:42:43.697046 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2220ca7f7d85b17aa8f5cde24c94e96a0a90feddb7352f67289b2f9b5f5aa26e"} Mar 18 15:42:44 crc kubenswrapper[4939]: I0318 15:42:44.703188 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 15:42:44 crc kubenswrapper[4939]: I0318 15:42:44.703660 4939 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="6016dd145abfbe69ade0d51985969b876c10dec5ab48bc5d589a1b87ae805f4d" exitCode=255 Mar 18 15:42:44 crc kubenswrapper[4939]: I0318 15:42:44.703748 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"6016dd145abfbe69ade0d51985969b876c10dec5ab48bc5d589a1b87ae805f4d"} Mar 18 15:42:44 crc kubenswrapper[4939]: I0318 15:42:44.704173 4939 scope.go:117] "RemoveContainer" containerID="6016dd145abfbe69ade0d51985969b876c10dec5ab48bc5d589a1b87ae805f4d" Mar 18 15:42:45 crc kubenswrapper[4939]: I0318 15:42:45.712108 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 15:42:45 crc kubenswrapper[4939]: I0318 15:42:45.712185 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"53073a5ab1caf7c22cc34c18257ea1a1aed644cd10206ddef9d1ee677863b14a"} Mar 18 15:42:46 crc kubenswrapper[4939]: I0318 15:42:46.718977 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 15:42:46 crc kubenswrapper[4939]: I0318 15:42:46.719351 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 15:42:46 crc kubenswrapper[4939]: I0318 15:42:46.719384 4939 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="53073a5ab1caf7c22cc34c18257ea1a1aed644cd10206ddef9d1ee677863b14a" exitCode=255 Mar 18 15:42:46 crc kubenswrapper[4939]: I0318 15:42:46.719409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"53073a5ab1caf7c22cc34c18257ea1a1aed644cd10206ddef9d1ee677863b14a"} Mar 18 15:42:46 crc 
kubenswrapper[4939]: I0318 15:42:46.719439 4939 scope.go:117] "RemoveContainer" containerID="6016dd145abfbe69ade0d51985969b876c10dec5ab48bc5d589a1b87ae805f4d" Mar 18 15:42:46 crc kubenswrapper[4939]: I0318 15:42:46.719944 4939 scope.go:117] "RemoveContainer" containerID="53073a5ab1caf7c22cc34c18257ea1a1aed644cd10206ddef9d1ee677863b14a" Mar 18 15:42:46 crc kubenswrapper[4939]: E0318 15:42:46.720185 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:42:47 crc kubenswrapper[4939]: I0318 15:42:47.627118 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 15:42:47 crc kubenswrapper[4939]: I0318 15:42:47.684703 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 15:42:47 crc kubenswrapper[4939]: I0318 15:42:47.727789 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 15:42:48 crc kubenswrapper[4939]: I0318 15:42:48.080590 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 15:42:48 crc kubenswrapper[4939]: I0318 15:42:48.087784 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 15:42:48 crc kubenswrapper[4939]: I0318 15:42:48.189599 4939 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 15:42:48 crc kubenswrapper[4939]: I0318 15:42:48.238676 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 15:42:48 crc kubenswrapper[4939]: I0318 15:42:48.377900 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 15:42:48 crc kubenswrapper[4939]: I0318 15:42:48.515326 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 15:42:48 crc kubenswrapper[4939]: I0318 15:42:48.553998 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 15:42:49 crc kubenswrapper[4939]: I0318 15:42:49.105760 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 15:42:49 crc kubenswrapper[4939]: I0318 15:42:49.450002 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 15:42:49 crc kubenswrapper[4939]: I0318 15:42:49.563492 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 15:42:49 crc kubenswrapper[4939]: I0318 15:42:49.641566 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 15:42:49 crc kubenswrapper[4939]: I0318 15:42:49.744433 4939 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:42:49 crc kubenswrapper[4939]: I0318 15:42:49.745492 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 15:42:49 crc kubenswrapper[4939]: I0318 15:42:49.797263 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.036532 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.043937 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.161474 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.189596 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.328761 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.455885 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.472697 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.488205 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.563676 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.610105 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.637470 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 15:42:50 crc kubenswrapper[4939]: I0318 15:42:50.946347 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.196134 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.227136 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.398129 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.408430 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.466654 4939 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.503496 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.543166 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.779701 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.883023 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.888551 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.894207 4939 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899301 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69bfc54c76-l58c7","openshift-kube-apiserver/kube-apiserver-crc","openshift-route-controller-manager/route-controller-manager-59b5c4d47f-4qcwj"] Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899375 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-ccc74cc7-48fs8"] Mar 18 15:42:51 crc kubenswrapper[4939]: E0318 15:42:51.899596 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" containerName="installer" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899619 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" containerName="installer" Mar 18 15:42:51 crc kubenswrapper[4939]: E0318 15:42:51.899638 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" containerName="route-controller-manager" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899650 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" containerName="route-controller-manager" Mar 18 15:42:51 crc kubenswrapper[4939]: E0318 15:42:51.899670 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" containerName="controller-manager" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899678 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" containerName="controller-manager" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899789 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" containerName="route-controller-manager" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899806 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4414f1c6-6e94-4dc2-80df-af1f546ae085" containerName="installer" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899818 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" containerName="controller-manager" Mar 18 15:42:51 crc 
kubenswrapper[4939]: I0318 15:42:51.899844 4939 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.899867 4939 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ab41abb6-5f2e-4c42-a3a2-9e24834d5298" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.900329 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.902637 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfpg\" (UniqueName: \"kubernetes.io/projected/78b62a89-1cb9-4ce8-ab83-7bb362a98783-kube-api-access-ggfpg\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.902739 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78b62a89-1cb9-4ce8-ab83-7bb362a98783-audit-dir\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.902779 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-router-certs\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.902845 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-session\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.902935 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903002 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-serving-cert\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903071 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-error\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903140 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903190 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903242 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903298 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-cliconfig\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903360 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-service-ca\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903424 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-audit-policies\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.903485 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-login\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.909280 4939 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.909635 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.909799 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.910831 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.911766 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.911848 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.912036 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.911780 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.912303 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.911813 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.912447 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.912569 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.914495 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.925921 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.931588 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.935476 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 15:42:51 crc kubenswrapper[4939]: I0318 15:42:51.948717 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.948694033 podStartE2EDuration="14.948694033s" podCreationTimestamp="2026-03-18 15:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:42:51.946991294 +0000 UTC m=+336.546178955" watchObservedRunningTime="2026-03-18 15:42:51.948694033 +0000 UTC m=+336.547881654" Mar 18 15:42:52 crc kubenswrapper[4939]: 
I0318 15:42:52.004253 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78b62a89-1cb9-4ce8-ab83-7bb362a98783-audit-dir\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.004697 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-router-certs\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.004728 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-session\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.004448 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78b62a89-1cb9-4ce8-ab83-7bb362a98783-audit-dir\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.004766 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.004946 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-serving-cert\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005035 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-error\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005119 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005162 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005229 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005349 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-cliconfig\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005387 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-service-ca\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005433 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-audit-policies\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005475 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-login\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.005562 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfpg\" (UniqueName: \"kubernetes.io/projected/78b62a89-1cb9-4ce8-ab83-7bb362a98783-kube-api-access-ggfpg\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.006299 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-cliconfig\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.006461 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.012353 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.012582 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-service-ca\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.012762 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.012849 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.012969 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-error\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.013171 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78b62a89-1cb9-4ce8-ab83-7bb362a98783-audit-policies\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.013223 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-session\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.016120 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-serving-cert\") 
pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.016813 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-system-router-certs\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.021063 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78b62a89-1cb9-4ce8-ab83-7bb362a98783-v4-0-config-user-template-login\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.042304 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfpg\" (UniqueName: \"kubernetes.io/projected/78b62a89-1cb9-4ce8-ab83-7bb362a98783-kube-api-access-ggfpg\") pod \"oauth-openshift-ccc74cc7-48fs8\" (UID: \"78b62a89-1cb9-4ce8-ab83-7bb362a98783\") " pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.066637 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.145874 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517209f4-d504-46e5-98df-6b67bc2d6656" path="/var/lib/kubelet/pods/517209f4-d504-46e5-98df-6b67bc2d6656/volumes" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.147820 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98d7316-9a8c-47ad-870f-ff1dde95989c" path="/var/lib/kubelet/pods/d98d7316-9a8c-47ad-870f-ff1dde95989c/volumes" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.236460 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.252692 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.300836 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.365877 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.378712 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.391181 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.403964 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.414076 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.423400 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.481640 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.482043 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.525104 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.583790 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.596703 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.631266 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.649905 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-ccc74cc7-48fs8"] Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.680358 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.765526 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.800764 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 15:42:52 crc 
Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.813430 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.962460 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 18 15:42:52 crc kubenswrapper[4939]: I0318 15:42:52.988909 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.032996 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.056815 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.075698 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.156096 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.190929 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.309562 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.492833 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.596095 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.720210 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 15:42:53 crc kubenswrapper[4939]: I0318 15:42:53.728427 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.002051 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.014425 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.020463 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.103909 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.130775 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.155880 4939 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.199347 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.252421 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.399037 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.411427 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.536573 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.671787 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.672349 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.674407 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.706916 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.849341 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.904134 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.933417 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 15:42:54 crc kubenswrapper[4939]: I0318 15:42:54.970375 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.041425 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.084710 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.123803 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.212593 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.278437 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 
15:42:55.286941 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.329238 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.363665 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.493532 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 15:42:55 crc kubenswrapper[4939]: E0318 15:42:55.495562 4939 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 15:42:55 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d" Netns:"/var/run/netns/043a0a81-d6f5-4cf9-ba14-f1460506dc72" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod "oauth-openshift-ccc74cc7-48fs8" not found Mar 18 15:42:55 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:42:55 crc kubenswrapper[4939]: > Mar 18 15:42:55 crc kubenswrapper[4939]: E0318 15:42:55.495641 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 15:42:55 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d" Netns:"/var/run/netns/043a0a81-d6f5-4cf9-ba14-f1460506dc72" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod "oauth-openshift-ccc74cc7-48fs8" not found Mar 18 15:42:55 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:42:55 crc kubenswrapper[4939]: > pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:55 crc kubenswrapper[4939]: E0318 15:42:55.495663 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 15:42:55 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d" Netns:"/var/run/netns/043a0a81-d6f5-4cf9-ba14-f1460506dc72" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod "oauth-openshift-ccc74cc7-48fs8" not found Mar 18 15:42:55 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:42:55 crc kubenswrapper[4939]: > pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:55 crc kubenswrapper[4939]: E0318 15:42:55.495715 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-ccc74cc7-48fs8_openshift-authentication(78b62a89-1cb9-4ce8-ab83-7bb362a98783)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"oauth-openshift-ccc74cc7-48fs8_openshift-authentication(78b62a89-1cb9-4ce8-ab83-7bb362a98783)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d\\\" Netns:\\\"/var/run/netns/043a0a81-d6f5-4cf9-ba14-f1460506dc72\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=26c8e8d9407012fac0f9308fee44e00044981e0756b56552986a6c62136b635d;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod \\\"oauth-openshift-ccc74cc7-48fs8\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" podUID="78b62a89-1cb9-4ce8-ab83-7bb362a98783" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.575377 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.708763 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.770700 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.771408 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.796640 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.896609 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.918116 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.924381 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 15:42:55 crc kubenswrapper[4939]: I0318 15:42:55.925615 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.157375 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.230842 4939 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.310433 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.316472 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.332297 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.375582 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.465165 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.541553 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.638853 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.728166 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.734010 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.786873 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.791965 4939 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-77d77644d-fthkq"] Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.792912 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.796161 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.796652 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.796772 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.796892 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.797083 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.799064 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b"] Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.799811 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.804913 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.805367 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.805717 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.805926 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.806424 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.806668 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.806883 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.806883 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.811293 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.812061 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77d77644d-fthkq"] Mar 18 15:42:56 crc kubenswrapper[4939]: 
I0318 15:42:56.821704 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b"] Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.832144 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.833245 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad671df-e651-45de-99a6-49c0f62a6891-serving-cert\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.833339 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-config\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.833424 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-proxy-ca-bundles\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.833477 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-client-ca\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.833536 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvqlx\" (UniqueName: \"kubernetes.io/projected/2ad671df-e651-45de-99a6-49c0f62a6891-kube-api-access-mvqlx\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.867134 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.925067 4939 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935032 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad671df-e651-45de-99a6-49c0f62a6891-serving-cert\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935108 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-client-ca\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935144 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw7j7\" (UniqueName: \"kubernetes.io/projected/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-kube-api-access-vw7j7\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935310 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-config\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935378 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-proxy-ca-bundles\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935442 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-serving-cert\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935479 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-client-ca\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935557 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvqlx\" (UniqueName: \"kubernetes.io/projected/2ad671df-e651-45de-99a6-49c0f62a6891-kube-api-access-mvqlx\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.935601 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-config\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.936615 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-proxy-ca-bundles\") 
pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.936684 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-client-ca\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.937123 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad671df-e651-45de-99a6-49c0f62a6891-config\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.952467 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad671df-e651-45de-99a6-49c0f62a6891-serving-cert\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:56 crc kubenswrapper[4939]: I0318 15:42:56.955008 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvqlx\" (UniqueName: \"kubernetes.io/projected/2ad671df-e651-45de-99a6-49c0f62a6891-kube-api-access-mvqlx\") pod \"controller-manager-77d77644d-fthkq\" (UID: \"2ad671df-e651-45de-99a6-49c0f62a6891\") " pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.036538 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-client-ca\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.036599 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw7j7\" (UniqueName: \"kubernetes.io/projected/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-kube-api-access-vw7j7\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.036650 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-serving-cert\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.036685 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-config\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc 
kubenswrapper[4939]: I0318 15:42:57.038068 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-config\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.038241 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-client-ca\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.041242 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-serving-cert\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.056702 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw7j7\" (UniqueName: \"kubernetes.io/projected/b474ab64-fb2c-4a3c-be42-0a2c888e60e5-kube-api-access-vw7j7\") pod \"route-controller-manager-57f7fc8d4c-jsn6b\" (UID: \"b474ab64-fb2c-4a3c-be42-0a2c888e60e5\") " pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.077737 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.077749 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.088423 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.117811 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.137930 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.179219 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.301611 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.320659 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.412306 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.472805 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.596042 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.626379 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.707578 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.779007 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.831535 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.846692 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.848722 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.914672 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.950439 4939 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.990176 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 15:42:57 crc kubenswrapper[4939]: I0318 15:42:57.994207 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.133098 4939 scope.go:117] "RemoveContainer" containerID="53073a5ab1caf7c22cc34c18257ea1a1aed644cd10206ddef9d1ee677863b14a" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.195381 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.208923 4939 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.240167 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.252981 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.326831 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.365467 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.442076 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.548910 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.552400 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.613607 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.633169 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.753905 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.795918 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.795980 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f04afe17007e85a1d9fd37b45101a915ce210050712b5a6aa656d799260f3e87"} Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.817248 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.829084 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.883382 4939 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.956913 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 15:42:58 crc kubenswrapper[4939]: E0318 15:42:58.977796 4939 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 15:42:58 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392" Netns:"/var/run/netns/fa63d774-1797-428d-a7ad-677fa6c0f1d3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod "oauth-openshift-ccc74cc7-48fs8" not found Mar 18 15:42:58 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:42:58 crc kubenswrapper[4939]: > Mar 18 15:42:58 crc kubenswrapper[4939]: E0318 15:42:58.977908 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 15:42:58 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392" Netns:"/var/run/netns/fa63d774-1797-428d-a7ad-677fa6c0f1d3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod "oauth-openshift-ccc74cc7-48fs8" not found Mar 18 15:42:58 crc kubenswrapper[4939]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:42:58 crc kubenswrapper[4939]: > pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:58 crc kubenswrapper[4939]: E0318 15:42:58.977945 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 15:42:58 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392" Netns:"/var/run/netns/fa63d774-1797-428d-a7ad-677fa6c0f1d3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod "oauth-openshift-ccc74cc7-48fs8" not found Mar 18 15:42:58 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:42:58 crc kubenswrapper[4939]: > pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:42:58 crc kubenswrapper[4939]: E0318 15:42:58.978036 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-ccc74cc7-48fs8_openshift-authentication(78b62a89-1cb9-4ce8-ab83-7bb362a98783)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-ccc74cc7-48fs8_openshift-authentication(78b62a89-1cb9-4ce8-ab83-7bb362a98783)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-ccc74cc7-48fs8_openshift-authentication_78b62a89-1cb9-4ce8-ab83-7bb362a98783_0(bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392): error adding pod openshift-authentication_oauth-openshift-ccc74cc7-48fs8 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392\\\" Netns:\\\"/var/run/netns/fa63d774-1797-428d-a7ad-677fa6c0f1d3\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-ccc74cc7-48fs8;K8S_POD_INFRA_CONTAINER_ID=bde1f5c9634e70f8f90a5a703364d937f792888cfb74b2902f57e5cdbde7a392;K8S_POD_UID=78b62a89-1cb9-4ce8-ab83-7bb362a98783\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-ccc74cc7-48fs8] networking: Multus: [openshift-authentication/oauth-openshift-ccc74cc7-48fs8/78b62a89-1cb9-4ce8-ab83-7bb362a98783]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-ccc74cc7-48fs8 in out of cluster comm: pod \\\"oauth-openshift-ccc74cc7-48fs8\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" podUID="78b62a89-1cb9-4ce8-ab83-7bb362a98783" Mar 18 15:42:58 crc kubenswrapper[4939]: I0318 15:42:58.994462 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.000032 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.244636 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.301753 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.308622 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.342327 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.364250 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.452579 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.479768 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.535918 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.705936 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.708816 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.760839 4939 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.804223 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.805021 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.805071 4939 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="f04afe17007e85a1d9fd37b45101a915ce210050712b5a6aa656d799260f3e87" exitCode=255 Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.805105 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"f04afe17007e85a1d9fd37b45101a915ce210050712b5a6aa656d799260f3e87"} Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.805146 4939 scope.go:117] "RemoveContainer" containerID="53073a5ab1caf7c22cc34c18257ea1a1aed644cd10206ddef9d1ee677863b14a" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.805697 4939 scope.go:117] "RemoveContainer" containerID="f04afe17007e85a1d9fd37b45101a915ce210050712b5a6aa656d799260f3e87" Mar 18 15:42:59 crc kubenswrapper[4939]: E0318 15:42:59.806073 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.876606 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.887037 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.941058 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.941309 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.956180 4939 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:42:59 crc kubenswrapper[4939]: I0318 15:42:59.956467 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff" gracePeriod=5 Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.017252 4939 log.go:32] "RunPodSandbox from runtime service 
failed" err=< Mar 18 15:43:00 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-77d77644d-fthkq_openshift-controller-manager_2ad671df-e651-45de-99a6-49c0f62a6891_0(0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa): error adding pod openshift-controller-manager_controller-manager-77d77644d-fthkq to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa" Netns:"/var/run/netns/79ecb181-da14-4b06-8eee-cd1d2284cc90" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-77d77644d-fthkq;K8S_POD_INFRA_CONTAINER_ID=0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa;K8S_POD_UID=2ad671df-e651-45de-99a6-49c0f62a6891" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-77d77644d-fthkq] networking: Multus: [openshift-controller-manager/controller-manager-77d77644d-fthkq/2ad671df-e651-45de-99a6-49c0f62a6891]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-77d77644d-fthkq in out of cluster comm: pod "controller-manager-77d77644d-fthkq" not found Mar 18 15:43:00 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:43:00 crc kubenswrapper[4939]: > Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.017316 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 15:43:00 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-77d77644d-fthkq_openshift-controller-manager_2ad671df-e651-45de-99a6-49c0f62a6891_0(0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa): error adding pod openshift-controller-manager_controller-manager-77d77644d-fthkq to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa" Netns:"/var/run/netns/79ecb181-da14-4b06-8eee-cd1d2284cc90" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-77d77644d-fthkq;K8S_POD_INFRA_CONTAINER_ID=0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa;K8S_POD_UID=2ad671df-e651-45de-99a6-49c0f62a6891" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-77d77644d-fthkq] networking: Multus: [openshift-controller-manager/controller-manager-77d77644d-fthkq/2ad671df-e651-45de-99a6-49c0f62a6891]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-77d77644d-fthkq in out of cluster comm: pod "controller-manager-77d77644d-fthkq" not found Mar 18 15:43:00 crc kubenswrapper[4939]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:43:00 crc kubenswrapper[4939]: > pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.017336 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 15:43:00 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-77d77644d-fthkq_openshift-controller-manager_2ad671df-e651-45de-99a6-49c0f62a6891_0(0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa): error adding pod openshift-controller-manager_controller-manager-77d77644d-fthkq to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa" Netns:"/var/run/netns/79ecb181-da14-4b06-8eee-cd1d2284cc90" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-77d77644d-fthkq;K8S_POD_INFRA_CONTAINER_ID=0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa;K8S_POD_UID=2ad671df-e651-45de-99a6-49c0f62a6891" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-77d77644d-fthkq] networking: Multus: [openshift-controller-manager/controller-manager-77d77644d-fthkq/2ad671df-e651-45de-99a6-49c0f62a6891]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-77d77644d-fthkq in out of cluster comm: pod "controller-manager-77d77644d-fthkq" not found Mar 18 15:43:00 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:43:00 crc kubenswrapper[4939]: > pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.017387 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-77d77644d-fthkq_openshift-controller-manager(2ad671df-e651-45de-99a6-49c0f62a6891)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-77d77644d-fthkq_openshift-controller-manager(2ad671df-e651-45de-99a6-49c0f62a6891)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-77d77644d-fthkq_openshift-controller-manager_2ad671df-e651-45de-99a6-49c0f62a6891_0(0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa): error adding pod openshift-controller-manager_controller-manager-77d77644d-fthkq to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:\\\"0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa\\\" Netns:\\\"/var/run/netns/79ecb181-da14-4b06-8eee-cd1d2284cc90\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-77d77644d-fthkq;K8S_POD_INFRA_CONTAINER_ID=0bcf5e1ff5e9deea43b5b596d1ca21a80d3e627374bce06ef72f28d5f49f13aa;K8S_POD_UID=2ad671df-e651-45de-99a6-49c0f62a6891\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-77d77644d-fthkq] networking: Multus: [openshift-controller-manager/controller-manager-77d77644d-fthkq/2ad671df-e651-45de-99a6-49c0f62a6891]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-77d77644d-fthkq in out of cluster comm: pod \\\"controller-manager-77d77644d-fthkq\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" podUID="2ad671df-e651-45de-99a6-49c0f62a6891" Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.085193 4939 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 15:43:00 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-57f7fc8d4c-jsn6b_openshift-route-controller-manager_b474ab64-fb2c-4a3c-be42-0a2c888e60e5_0(77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c): error adding pod openshift-route-controller-manager_route-controller-manager-57f7fc8d4c-jsn6b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c" Netns:"/var/run/netns/ede85f4c-15a9-4a45-9e1d-2b294ce119a0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-57f7fc8d4c-jsn6b;K8S_POD_INFRA_CONTAINER_ID=77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c;K8S_POD_UID=b474ab64-fb2c-4a3c-be42-0a2c888e60e5" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b/b474ab64-fb2c-4a3c-be42-0a2c888e60e5]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-57f7fc8d4c-jsn6b in out of cluster comm: pod "route-controller-manager-57f7fc8d4c-jsn6b" not found Mar 18 15:43:00 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:43:00 crc 
kubenswrapper[4939]: > Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.085264 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 15:43:00 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-57f7fc8d4c-jsn6b_openshift-route-controller-manager_b474ab64-fb2c-4a3c-be42-0a2c888e60e5_0(77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c): error adding pod openshift-route-controller-manager_route-controller-manager-57f7fc8d4c-jsn6b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c" Netns:"/var/run/netns/ede85f4c-15a9-4a45-9e1d-2b294ce119a0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-57f7fc8d4c-jsn6b;K8S_POD_INFRA_CONTAINER_ID=77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c;K8S_POD_UID=b474ab64-fb2c-4a3c-be42-0a2c888e60e5" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b/b474ab64-fb2c-4a3c-be42-0a2c888e60e5]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-57f7fc8d4c-jsn6b in out of cluster comm: pod "route-controller-manager-57f7fc8d4c-jsn6b" not found Mar 18 15:43:00 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:43:00 crc kubenswrapper[4939]: > pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.085286 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 15:43:00 crc kubenswrapper[4939]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-57f7fc8d4c-jsn6b_openshift-route-controller-manager_b474ab64-fb2c-4a3c-be42-0a2c888e60e5_0(77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c): error adding pod openshift-route-controller-manager_route-controller-manager-57f7fc8d4c-jsn6b to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c" Netns:"/var/run/netns/ede85f4c-15a9-4a45-9e1d-2b294ce119a0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-57f7fc8d4c-jsn6b;K8S_POD_INFRA_CONTAINER_ID=77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c;K8S_POD_UID=b474ab64-fb2c-4a3c-be42-0a2c888e60e5" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b/b474ab64-fb2c-4a3c-be42-0a2c888e60e5]: error setting the 
networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-57f7fc8d4c-jsn6b in out of cluster comm: pod "route-controller-manager-57f7fc8d4c-jsn6b" not found Mar 18 15:43:00 crc kubenswrapper[4939]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 15:43:00 crc kubenswrapper[4939]: > pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:43:00 crc kubenswrapper[4939]: E0318 15:43:00.085343 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-57f7fc8d4c-jsn6b_openshift-route-controller-manager(b474ab64-fb2c-4a3c-be42-0a2c888e60e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-57f7fc8d4c-jsn6b_openshift-route-controller-manager(b474ab64-fb2c-4a3c-be42-0a2c888e60e5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-57f7fc8d4c-jsn6b_openshift-route-controller-manager_b474ab64-fb2c-4a3c-be42-0a2c888e60e5_0(77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c): error adding pod openshift-route-controller-manager_route-controller-manager-57f7fc8d4c-jsn6b to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c\\\" Netns:\\\"/var/run/netns/ede85f4c-15a9-4a45-9e1d-2b294ce119a0\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-57f7fc8d4c-jsn6b;K8S_POD_INFRA_CONTAINER_ID=77d8f90db8c15734133d6ab562f4a893ff8f3a77a11dfe3d57050c35ea78b41c;K8S_POD_UID=b474ab64-fb2c-4a3c-be42-0a2c888e60e5\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b] networking: Multus: [openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b/b474ab64-fb2c-4a3c-be42-0a2c888e60e5]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-57f7fc8d4c-jsn6b in out of cluster comm: pod \\\"route-controller-manager-57f7fc8d4c-jsn6b\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" podUID="b474ab64-fb2c-4a3c-be42-0a2c888e60e5" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.163618 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:43:00 crc 
kubenswrapper[4939]: I0318 15:43:00.263759 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.268533 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.435333 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.567423 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.761361 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.775181 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.811976 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.812057 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.812255 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.812413 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.812603 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.916020 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.975814 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 15:43:00 crc kubenswrapper[4939]: I0318 15:43:00.989990 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.027743 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.037575 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.082283 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.147744 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.164691 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.300476 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.320441 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.350414 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.375238 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.394702 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.478019 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.512351 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.555800 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.620182 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.652537 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 15:43:01 crc kubenswrapper[4939]: 
I0318 15:43:01.656305 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.664427 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 15:43:01 crc kubenswrapper[4939]: I0318 15:43:01.801955 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.368650 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.381152 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.592722 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.640585 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.641360 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.702361 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.710028 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.763609 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.767904 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.795941 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 15:43:02 crc kubenswrapper[4939]: I0318 15:43:02.905642 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.038918 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.061960 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.187774 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.488810 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77d77644d-fthkq"] Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.628863 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b"] Mar 18 15:43:03 crc kubenswrapper[4939]: W0318 15:43:03.649069 4939 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb474ab64_fb2c_4a3c_be42_0a2c888e60e5.slice/crio-a24ff1740e625a23677e9db3d72e4426087414368393ba9b7deaa0a89cc93c91 WatchSource:0}: Error finding container a24ff1740e625a23677e9db3d72e4426087414368393ba9b7deaa0a89cc93c91: Status 404 returned error can't find the container with id a24ff1740e625a23677e9db3d72e4426087414368393ba9b7deaa0a89cc93c91 Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.781551 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.830264 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" event={"ID":"b474ab64-fb2c-4a3c-be42-0a2c888e60e5","Type":"ContainerStarted","Data":"b369d6a00e9d2338a63dd91a73f7be90583e6769075a3b252ffed04b376423e3"} Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.830317 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" event={"ID":"b474ab64-fb2c-4a3c-be42-0a2c888e60e5","Type":"ContainerStarted","Data":"a24ff1740e625a23677e9db3d72e4426087414368393ba9b7deaa0a89cc93c91"} Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.830569 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.832493 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" event={"ID":"2ad671df-e651-45de-99a6-49c0f62a6891","Type":"ContainerStarted","Data":"ea207168c37a89d7c12817a78a853bede11399c9d8cc7b952062a0dcea85510b"} Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.832570 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" event={"ID":"2ad671df-e651-45de-99a6-49c0f62a6891","Type":"ContainerStarted","Data":"fc25271e32a18fd335c4687ccb140a5ef0f0d0ccc67b3851f398bf502de01058"} Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.832727 4939 patch_prober.go:28] interesting pod/route-controller-manager-57f7fc8d4c-jsn6b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.832772 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" podUID="b474ab64-fb2c-4a3c-be42-0a2c888e60e5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.833284 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.840666 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.853259 4939 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" podStartSLOduration=48.853235225 podStartE2EDuration="48.853235225s" podCreationTimestamp="2026-03-18 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:43:03.849170497 +0000 UTC m=+348.448358138" watchObservedRunningTime="2026-03-18 15:43:03.853235225 +0000 UTC m=+348.452422876" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.868653 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77d77644d-fthkq" podStartSLOduration=48.868633144 podStartE2EDuration="48.868633144s" podCreationTimestamp="2026-03-18 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:43:03.866705408 +0000 UTC m=+348.465893069" watchObservedRunningTime="2026-03-18 15:43:03.868633144 +0000 UTC m=+348.467820775" Mar 18 15:43:03 crc kubenswrapper[4939]: I0318 15:43:03.947365 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 15:43:04 crc kubenswrapper[4939]: I0318 15:43:04.310894 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 15:43:04 crc kubenswrapper[4939]: I0318 15:43:04.841453 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.558801 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.558905 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.663424 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.663554 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.663613 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.663637 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.663711 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.664068 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.664127 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.665039 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.665155 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.678864 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.766054 4939 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.766104 4939 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.766115 4939 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.766126 4939 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.766134 4939 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.776582 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.869803 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.869866 4939 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff" exitCode=137 Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.869988 4939 scope.go:117] "RemoveContainer" containerID="5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.869993 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.892663 4939 scope.go:117] "RemoveContainer" containerID="5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff" Mar 18 15:43:05 crc kubenswrapper[4939]: E0318 15:43:05.893220 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff\": container with ID starting with 5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff not found: ID does not exist" containerID="5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff" Mar 18 15:43:05 crc kubenswrapper[4939]: I0318 15:43:05.893257 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff"} err="failed to get container status \"5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff\": rpc error: code = NotFound desc = could not find container \"5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff\": container with ID starting with 5ebd1f676b47491d3940ae8d73fa3fea8b005bff134ba12032da6ddc232f32ff not found: ID does not exist" Mar 18 15:43:06 crc kubenswrapper[4939]: I0318 15:43:06.140658 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 15:43:12 crc kubenswrapper[4939]: I0318 15:43:12.133715 4939 scope.go:117] "RemoveContainer" containerID="f04afe17007e85a1d9fd37b45101a915ce210050712b5a6aa656d799260f3e87" Mar 18 15:43:12 crc kubenswrapper[4939]: E0318 15:43:12.134911 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:43:14 crc kubenswrapper[4939]: I0318 15:43:14.132796 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:43:14 crc kubenswrapper[4939]: I0318 15:43:14.133299 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:43:14 crc kubenswrapper[4939]: I0318 15:43:14.556326 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-ccc74cc7-48fs8"] Mar 18 15:43:14 crc kubenswrapper[4939]: I0318 15:43:14.919034 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" event={"ID":"78b62a89-1cb9-4ce8-ab83-7bb362a98783","Type":"ContainerStarted","Data":"9f8ec039658cbfb8958761fe0abdfa71c5dc639c97a4afc08110c20ad0dfd1ca"} Mar 18 15:43:14 crc kubenswrapper[4939]: I0318 15:43:14.919084 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" event={"ID":"78b62a89-1cb9-4ce8-ab83-7bb362a98783","Type":"ContainerStarted","Data":"08eee7c2340078865c1c0c60c8e96d205a4edf2f10e3c762f9bef524cd5930a8"} Mar 18 15:43:14 crc kubenswrapper[4939]: I0318 15:43:14.922049 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:43:14 crc kubenswrapper[4939]: I0318 15:43:14.944173 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" podStartSLOduration=86.944150485 podStartE2EDuration="1m26.944150485s" podCreationTimestamp="2026-03-18 15:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:43:14.938894702 +0000 UTC m=+359.538082343" watchObservedRunningTime="2026-03-18 15:43:14.944150485 +0000 UTC m=+359.543338116" Mar 18 15:43:15 crc kubenswrapper[4939]: I0318 15:43:15.259458 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-ccc74cc7-48fs8" Mar 18 15:43:22 crc kubenswrapper[4939]: I0318 15:43:22.490951 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:43:24 crc kubenswrapper[4939]: I0318 15:43:24.133687 4939 scope.go:117] "RemoveContainer" containerID="f04afe17007e85a1d9fd37b45101a915ce210050712b5a6aa656d799260f3e87" Mar 18 15:43:24 crc kubenswrapper[4939]: I0318 15:43:24.974451 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 15:43:24 crc kubenswrapper[4939]: I0318 15:43:24.975003 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dc8fd4179ab9a2683c0f50307a84d34dc35ad9d340ef376790c10ea71aceeda6"} Mar 18 15:43:26 crc kubenswrapper[4939]: I0318 15:43:26.218726 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 15:43:27 crc kubenswrapper[4939]: I0318 15:43:27.992997 4939 generic.go:334] "Generic (PLEG): container finished" podID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerID="d37c5e68a2dcf25b1365721fc0cb97b084ebf2ef0a20dacf24da23584cc09ba1" exitCode=0 Mar 18 15:43:27 crc kubenswrapper[4939]: I0318 15:43:27.993064 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" 
event={"ID":"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb","Type":"ContainerDied","Data":"d37c5e68a2dcf25b1365721fc0cb97b084ebf2ef0a20dacf24da23584cc09ba1"} Mar 18 15:43:27 crc kubenswrapper[4939]: I0318 15:43:27.993822 4939 scope.go:117] "RemoveContainer" containerID="d37c5e68a2dcf25b1365721fc0cb97b084ebf2ef0a20dacf24da23584cc09ba1" Mar 18 15:43:29 crc kubenswrapper[4939]: I0318 15:43:29.004612 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" event={"ID":"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb","Type":"ContainerStarted","Data":"e9ff04c83cd653b53989319fd004339ad535d8431a625501b94947f4613d3229"} Mar 18 15:43:29 crc kubenswrapper[4939]: I0318 15:43:29.005769 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:43:29 crc kubenswrapper[4939]: I0318 15:43:29.010183 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.183051 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564144-c79jl"] Mar 18 15:44:00 crc kubenswrapper[4939]: E0318 15:44:00.183908 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.183921 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.184025 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.184420 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-c79jl" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.186305 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.186558 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.187069 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.196249 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-c79jl"] Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.294419 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmpw\" (UniqueName: \"kubernetes.io/projected/f333da50-173a-479e-a341-32342a2c1673-kube-api-access-wvmpw\") pod \"auto-csr-approver-29564144-c79jl\" (UID: \"f333da50-173a-479e-a341-32342a2c1673\") " pod="openshift-infra/auto-csr-approver-29564144-c79jl" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.396020 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmpw\" (UniqueName: \"kubernetes.io/projected/f333da50-173a-479e-a341-32342a2c1673-kube-api-access-wvmpw\") pod \"auto-csr-approver-29564144-c79jl\" (UID: \"f333da50-173a-479e-a341-32342a2c1673\") " pod="openshift-infra/auto-csr-approver-29564144-c79jl" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.413058 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmpw\" (UniqueName: \"kubernetes.io/projected/f333da50-173a-479e-a341-32342a2c1673-kube-api-access-wvmpw\") pod \"auto-csr-approver-29564144-c79jl\" (UID: \"f333da50-173a-479e-a341-32342a2c1673\") " pod="openshift-infra/auto-csr-approver-29564144-c79jl" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.514718 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-c79jl" Mar 18 15:44:00 crc kubenswrapper[4939]: I0318 15:44:00.922149 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-c79jl"] Mar 18 15:44:01 crc kubenswrapper[4939]: I0318 15:44:01.202946 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-c79jl" event={"ID":"f333da50-173a-479e-a341-32342a2c1673","Type":"ContainerStarted","Data":"7b505141f0d1fced2c636360293977834cc80fa6d1153d9281967860c532eed6"} Mar 18 15:44:03 crc kubenswrapper[4939]: I0318 15:44:03.219849 4939 generic.go:334] "Generic (PLEG): container finished" podID="f333da50-173a-479e-a341-32342a2c1673" containerID="9a2b8a76b324f748f6ac3af22b41359c9ce113f99008a17fd5d22177d43f596d" exitCode=0 Mar 18 15:44:03 crc kubenswrapper[4939]: I0318 15:44:03.219946 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-c79jl" event={"ID":"f333da50-173a-479e-a341-32342a2c1673","Type":"ContainerDied","Data":"9a2b8a76b324f748f6ac3af22b41359c9ce113f99008a17fd5d22177d43f596d"} Mar 18 15:44:04 crc kubenswrapper[4939]: I0318 15:44:04.540240 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-c79jl" Mar 18 15:44:04 crc kubenswrapper[4939]: I0318 15:44:04.666079 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvmpw\" (UniqueName: \"kubernetes.io/projected/f333da50-173a-479e-a341-32342a2c1673-kube-api-access-wvmpw\") pod \"f333da50-173a-479e-a341-32342a2c1673\" (UID: \"f333da50-173a-479e-a341-32342a2c1673\") " Mar 18 15:44:04 crc kubenswrapper[4939]: I0318 15:44:04.672547 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f333da50-173a-479e-a341-32342a2c1673-kube-api-access-wvmpw" (OuterVolumeSpecName: "kube-api-access-wvmpw") pod "f333da50-173a-479e-a341-32342a2c1673" (UID: "f333da50-173a-479e-a341-32342a2c1673"). InnerVolumeSpecName "kube-api-access-wvmpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:04 crc kubenswrapper[4939]: I0318 15:44:04.767453 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvmpw\" (UniqueName: \"kubernetes.io/projected/f333da50-173a-479e-a341-32342a2c1673-kube-api-access-wvmpw\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:05 crc kubenswrapper[4939]: I0318 15:44:05.235115 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-c79jl" event={"ID":"f333da50-173a-479e-a341-32342a2c1673","Type":"ContainerDied","Data":"7b505141f0d1fced2c636360293977834cc80fa6d1153d9281967860c532eed6"} Mar 18 15:44:05 crc kubenswrapper[4939]: I0318 15:44:05.235159 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b505141f0d1fced2c636360293977834cc80fa6d1153d9281967860c532eed6" Mar 18 15:44:05 crc kubenswrapper[4939]: I0318 15:44:05.235201 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-c79jl" Mar 18 15:44:23 crc kubenswrapper[4939]: I0318 15:44:23.687647 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:44:23 crc kubenswrapper[4939]: I0318 15:44:23.688086 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.123176 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cq6w"] Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.124140 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4cq6w" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="registry-server" containerID="cri-o://872c45e71dd57ffa689269ba07b582501ffd82f73b9c1b6656179ab3f808ba23" gracePeriod=30 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.133096 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-thb6s"] Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.133301 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-thb6s" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="registry-server" containerID="cri-o://0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357" gracePeriod=30 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.150389 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qv8l5"] Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.150679 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" containerID="cri-o://e9ff04c83cd653b53989319fd004339ad535d8431a625501b94947f4613d3229" gracePeriod=30 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.159745 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qrxf"] Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.159974 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2qrxf" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="registry-server" containerID="cri-o://fced34e9b98f2f294b56f8f82e7b4a47572a846eb794f12cab4a66405ce6d3a8" gracePeriod=30 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.171833 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6p9v6"] Mar 18 15:44:39 crc kubenswrapper[4939]: E0318 15:44:39.172263 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f333da50-173a-479e-a341-32342a2c1673" containerName="oc" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.172359 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f333da50-173a-479e-a341-32342a2c1673" containerName="oc" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.172621 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f333da50-173a-479e-a341-32342a2c1673" containerName="oc" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.173253 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.174902 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldms8"] Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.175114 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ldms8" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="registry-server" containerID="cri-o://52976223ba826460bf2666c07bb9d6d8a2922a5bd33bde71b40ac2215e76310a" gracePeriod=30 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.179277 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6p9v6"] Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.252122 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00562936-861a-4e78-b01d-35b9ae9a8b2a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.252201 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kn4j\" (UniqueName: \"kubernetes.io/projected/00562936-861a-4e78-b01d-35b9ae9a8b2a-kube-api-access-8kn4j\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.252285 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00562936-861a-4e78-b01d-35b9ae9a8b2a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.354221 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00562936-861a-4e78-b01d-35b9ae9a8b2a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.354277 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kn4j\" (UniqueName: \"kubernetes.io/projected/00562936-861a-4e78-b01d-35b9ae9a8b2a-kube-api-access-8kn4j\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.354339 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00562936-861a-4e78-b01d-35b9ae9a8b2a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.358425 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00562936-861a-4e78-b01d-35b9ae9a8b2a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.361590 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00562936-861a-4e78-b01d-35b9ae9a8b2a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: E0318 15:44:39.374576 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357 is running failed: container process not found" containerID="0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:44:39 crc kubenswrapper[4939]: E0318 15:44:39.375730 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357 is running failed: container process not found" containerID="0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:44:39 crc kubenswrapper[4939]: E0318 15:44:39.376036 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357 is running failed: container process not found" containerID="0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:44:39 crc kubenswrapper[4939]: E0318 15:44:39.376071 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-thb6s" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="registry-server" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.383268 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kn4j\" (UniqueName: \"kubernetes.io/projected/00562936-861a-4e78-b01d-35b9ae9a8b2a-kube-api-access-8kn4j\") pod \"marketplace-operator-79b997595-6p9v6\" (UID: \"00562936-861a-4e78-b01d-35b9ae9a8b2a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.427110 4939 generic.go:334] "Generic (PLEG): container finished" podID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" 
containerID="872c45e71dd57ffa689269ba07b582501ffd82f73b9c1b6656179ab3f808ba23" exitCode=0 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.427192 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cq6w" event={"ID":"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464","Type":"ContainerDied","Data":"872c45e71dd57ffa689269ba07b582501ffd82f73b9c1b6656179ab3f808ba23"} Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.429367 4939 generic.go:334] "Generic (PLEG): container finished" podID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerID="fced34e9b98f2f294b56f8f82e7b4a47572a846eb794f12cab4a66405ce6d3a8" exitCode=0 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.429418 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qrxf" event={"ID":"18db329a-84bc-4bb2-94a4-00053cc542e7","Type":"ContainerDied","Data":"fced34e9b98f2f294b56f8f82e7b4a47572a846eb794f12cab4a66405ce6d3a8"} Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.433391 4939 generic.go:334] "Generic (PLEG): container finished" podID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerID="0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357" exitCode=0 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.433446 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thb6s" event={"ID":"c9a596c4-2674-4c46-ab00-c8167b950bc9","Type":"ContainerDied","Data":"0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357"} Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.444206 4939 generic.go:334] "Generic (PLEG): container finished" podID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerID="52976223ba826460bf2666c07bb9d6d8a2922a5bd33bde71b40ac2215e76310a" exitCode=0 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.444274 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldms8" event={"ID":"e48cb944-8d0d-4169-aa48-947c2654df5a","Type":"ContainerDied","Data":"52976223ba826460bf2666c07bb9d6d8a2922a5bd33bde71b40ac2215e76310a"} Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.447178 4939 generic.go:334] "Generic (PLEG): container finished" podID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerID="e9ff04c83cd653b53989319fd004339ad535d8431a625501b94947f4613d3229" exitCode=0 Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.447217 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" event={"ID":"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb","Type":"ContainerDied","Data":"e9ff04c83cd653b53989319fd004339ad535d8431a625501b94947f4613d3229"} Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.447254 4939 scope.go:117] "RemoveContainer" containerID="d37c5e68a2dcf25b1365721fc0cb97b084ebf2ef0a20dacf24da23584cc09ba1" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.622601 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.627757 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.634415 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.658108 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-catalog-content\") pod \"c9a596c4-2674-4c46-ab00-c8167b950bc9\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.658175 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-trusted-ca\") pod \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.658207 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-utilities\") pod \"c9a596c4-2674-4c46-ab00-c8167b950bc9\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.658233 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psf4w\" (UniqueName: \"kubernetes.io/projected/c9a596c4-2674-4c46-ab00-c8167b950bc9-kube-api-access-psf4w\") pod \"c9a596c4-2674-4c46-ab00-c8167b950bc9\" (UID: \"c9a596c4-2674-4c46-ab00-c8167b950bc9\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.658262 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tlfv\" (UniqueName: \"kubernetes.io/projected/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-kube-api-access-7tlfv\") pod \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.658303 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-operator-metrics\") pod \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\" (UID: \"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.663969 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-utilities" (OuterVolumeSpecName: "utilities") pod "c9a596c4-2674-4c46-ab00-c8167b950bc9" (UID: "c9a596c4-2674-4c46-ab00-c8167b950bc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.664594 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" (UID: "73a63cbc-ba9e-44d7-97c7-c15c9c809cdb"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.666487 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" (UID: "73a63cbc-ba9e-44d7-97c7-c15c9c809cdb"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.668345 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-kube-api-access-7tlfv" (OuterVolumeSpecName: "kube-api-access-7tlfv") pod "73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" (UID: "73a63cbc-ba9e-44d7-97c7-c15c9c809cdb"). InnerVolumeSpecName "kube-api-access-7tlfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.668619 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a596c4-2674-4c46-ab00-c8167b950bc9-kube-api-access-psf4w" (OuterVolumeSpecName: "kube-api-access-psf4w") pod "c9a596c4-2674-4c46-ab00-c8167b950bc9" (UID: "c9a596c4-2674-4c46-ab00-c8167b950bc9"). InnerVolumeSpecName "kube-api-access-psf4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.679376 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.703723 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.708296 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.726164 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9a596c4-2674-4c46-ab00-c8167b950bc9" (UID: "c9a596c4-2674-4c46-ab00-c8167b950bc9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759348 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-utilities\") pod \"18db329a-84bc-4bb2-94a4-00053cc542e7\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759415 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vnps\" (UniqueName: \"kubernetes.io/projected/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-kube-api-access-6vnps\") pod \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759460 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hv7v\" (UniqueName: \"kubernetes.io/projected/18db329a-84bc-4bb2-94a4-00053cc542e7-kube-api-access-6hv7v\") pod \"18db329a-84bc-4bb2-94a4-00053cc542e7\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759541 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-catalog-content\") pod \"18db329a-84bc-4bb2-94a4-00053cc542e7\" (UID: \"18db329a-84bc-4bb2-94a4-00053cc542e7\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759589 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-catalog-content\") pod \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759632 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-utilities\") pod \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\" (UID: \"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759675 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-catalog-content\") pod \"e48cb944-8d0d-4169-aa48-947c2654df5a\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759703 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-utilities\") pod \"e48cb944-8d0d-4169-aa48-947c2654df5a\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.759738 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cktwx\" (UniqueName: \"kubernetes.io/projected/e48cb944-8d0d-4169-aa48-947c2654df5a-kube-api-access-cktwx\") pod \"e48cb944-8d0d-4169-aa48-947c2654df5a\" (UID: \"e48cb944-8d0d-4169-aa48-947c2654df5a\") " Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.760032 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc 
kubenswrapper[4939]: I0318 15:44:39.760090 4939 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.760111 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a596c4-2674-4c46-ab00-c8167b950bc9-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.760123 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psf4w\" (UniqueName: \"kubernetes.io/projected/c9a596c4-2674-4c46-ab00-c8167b950bc9-kube-api-access-psf4w\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.760135 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tlfv\" (UniqueName: \"kubernetes.io/projected/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-kube-api-access-7tlfv\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.760148 4939 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.761972 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-utilities" (OuterVolumeSpecName: "utilities") pod "18db329a-84bc-4bb2-94a4-00053cc542e7" (UID: "18db329a-84bc-4bb2-94a4-00053cc542e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.766267 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48cb944-8d0d-4169-aa48-947c2654df5a-kube-api-access-cktwx" (OuterVolumeSpecName: "kube-api-access-cktwx") pod "e48cb944-8d0d-4169-aa48-947c2654df5a" (UID: "e48cb944-8d0d-4169-aa48-947c2654df5a"). InnerVolumeSpecName "kube-api-access-cktwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.766343 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18db329a-84bc-4bb2-94a4-00053cc542e7-kube-api-access-6hv7v" (OuterVolumeSpecName: "kube-api-access-6hv7v") pod "18db329a-84bc-4bb2-94a4-00053cc542e7" (UID: "18db329a-84bc-4bb2-94a4-00053cc542e7"). InnerVolumeSpecName "kube-api-access-6hv7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.767176 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-utilities" (OuterVolumeSpecName: "utilities") pod "e48cb944-8d0d-4169-aa48-947c2654df5a" (UID: "e48cb944-8d0d-4169-aa48-947c2654df5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.767273 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-kube-api-access-6vnps" (OuterVolumeSpecName: "kube-api-access-6vnps") pod "5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" (UID: "5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464"). 
InnerVolumeSpecName "kube-api-access-6vnps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.769716 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-utilities" (OuterVolumeSpecName: "utilities") pod "5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" (UID: "5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.837124 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" (UID: "5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.841648 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18db329a-84bc-4bb2-94a4-00053cc542e7" (UID: "18db329a-84bc-4bb2-94a4-00053cc542e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861160 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861188 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861198 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861206 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861216 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cktwx\" (UniqueName: \"kubernetes.io/projected/e48cb944-8d0d-4169-aa48-947c2654df5a-kube-api-access-cktwx\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861227 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18db329a-84bc-4bb2-94a4-00053cc542e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861235 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vnps\" (UniqueName: \"kubernetes.io/projected/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464-kube-api-access-6vnps\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.861243 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hv7v\" (UniqueName: \"kubernetes.io/projected/18db329a-84bc-4bb2-94a4-00053cc542e7-kube-api-access-6hv7v\") on node 
\"crc\" DevicePath \"\"" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.922719 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e48cb944-8d0d-4169-aa48-947c2654df5a" (UID: "e48cb944-8d0d-4169-aa48-947c2654df5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:44:39 crc kubenswrapper[4939]: I0318 15:44:39.962550 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48cb944-8d0d-4169-aa48-947c2654df5a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.104153 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6p9v6"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.454566 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cq6w" event={"ID":"5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464","Type":"ContainerDied","Data":"b55a5310b7b6c708a39b2804933f27cd4e133cdc8fc42b35d0d09021ea05e894"} Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.454676 4939 scope.go:117] "RemoveContainer" containerID="872c45e71dd57ffa689269ba07b582501ffd82f73b9c1b6656179ab3f808ba23" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.454597 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cq6w" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.455874 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" event={"ID":"00562936-861a-4e78-b01d-35b9ae9a8b2a","Type":"ContainerStarted","Data":"98b0466c244d233d4943d2d3ec1e60fe2de1f0197923a6b6e8e648721c64ef42"} Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.455915 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" event={"ID":"00562936-861a-4e78-b01d-35b9ae9a8b2a","Type":"ContainerStarted","Data":"1807621dc73d275b1727f3122ef4de5ab420109dd355b10d597148ad6562f58e"} Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.456106 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.457534 4939 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6p9v6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.457577 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" podUID="00562936-861a-4e78-b01d-35b9ae9a8b2a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.461191 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qrxf" 
event={"ID":"18db329a-84bc-4bb2-94a4-00053cc542e7","Type":"ContainerDied","Data":"3592f60084dc912f5e9d91de141ec648ae05360d9f9fdcfea7caf0debf6c0b33"} Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.461319 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qrxf" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.477755 4939 scope.go:117] "RemoveContainer" containerID="01a5091740014a14cf9305e910c86f71f2e8880ca532c040a09ea606595e9399" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.479292 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldms8" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.479399 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldms8" event={"ID":"e48cb944-8d0d-4169-aa48-947c2654df5a","Type":"ContainerDied","Data":"4a5b80c8de5a70f4ae3f18b2984b7a9c48be07636a76e19472624ad5dd969d9c"} Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.492332 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thb6s" event={"ID":"c9a596c4-2674-4c46-ab00-c8167b950bc9","Type":"ContainerDied","Data":"c30addb0291ddbb3d26f067f476d1b5fee21cc02e8d0cbc60387394b90f76bea"} Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.492479 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thb6s" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.493187 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" podStartSLOduration=1.493164623 podStartE2EDuration="1.493164623s" podCreationTimestamp="2026-03-18 15:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:44:40.483733537 +0000 UTC m=+445.082921168" watchObservedRunningTime="2026-03-18 15:44:40.493164623 +0000 UTC m=+445.092352264" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.494839 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" event={"ID":"73a63cbc-ba9e-44d7-97c7-c15c9c809cdb","Type":"ContainerDied","Data":"20150ca69b9c0ccb699e1f8b44e5a322aecb7f29049f4584f40f083961deb443"} Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.494891 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qv8l5" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.511232 4939 scope.go:117] "RemoveContainer" containerID="f8b7508186e9d83bbea7c614274b0fae9803d08b92eab35edc2baac9a4771ad6" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.514693 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cq6w"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.520391 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4cq6w"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.533151 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldms8"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.540841 4939 scope.go:117] "RemoveContainer" containerID="fced34e9b98f2f294b56f8f82e7b4a47572a846eb794f12cab4a66405ce6d3a8" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.545900 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ldms8"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.548934 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qrxf"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.551933 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qrxf"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.559486 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qv8l5"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.563361 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qv8l5"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.571922 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-thb6s"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.572119 4939 scope.go:117] "RemoveContainer" containerID="bb9223b64b2b3a13d35dc8b8ee61373d612afbfc8ae39a754dee7646378e840f" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.576592 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-thb6s"] Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.588831 4939 scope.go:117] "RemoveContainer" containerID="0e781de8f192733cf4a8681bb266763ea069074c3740d97245f4f629b3b003ba" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.605627 4939 scope.go:117] "RemoveContainer" containerID="52976223ba826460bf2666c07bb9d6d8a2922a5bd33bde71b40ac2215e76310a" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.631332 4939 scope.go:117] "RemoveContainer" containerID="b4acbe31a06ea3f4279b05cf8d7be6cf36033b487140c804ab4bbf2b73e435d0" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.649708 4939 scope.go:117] "RemoveContainer" containerID="0f5377b1391c25f9593e81559613cd1811d1776d73020ac9ae33e99f0154a19e" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.666939 4939 scope.go:117] "RemoveContainer" containerID="0bd1b5dec78048038ae61e2a17241b88b17b4c37bff8553ab039fd7b89151357" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.685137 4939 scope.go:117] "RemoveContainer" containerID="817c1dc723c488a3a50f64c6f4c2aae7334ad8dfd652641a2f7f503f0db26bab" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.706055 4939 scope.go:117] "RemoveContainer" 
containerID="cfd232f01b3b09566807d15d6078096dbdd5837c9630b7a2e14a3bb726912579" Mar 18 15:44:40 crc kubenswrapper[4939]: I0318 15:44:40.723754 4939 scope.go:117] "RemoveContainer" containerID="e9ff04c83cd653b53989319fd004339ad535d8431a625501b94947f4613d3229" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181107 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nn6nx"] Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181334 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181348 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181359 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181366 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181376 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181385 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181392 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181398 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181408 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181416 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181429 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181436 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181446 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181453 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181461 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181467 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181479 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181486 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="extract-content" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181521 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181532 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181546 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181554 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181564 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181571 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181580 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181587 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="extract-utilities" Mar 18 15:44:41 crc kubenswrapper[4939]: E0318 15:44:41.181729 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181737 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181866 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181877 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181886 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" containerName="marketplace-operator" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181895 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.181909 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: 
I0318 15:44:41.181917 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" containerName="registry-server" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.182741 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.185796 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.194455 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nn6nx"] Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.280539 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-utilities\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.280617 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxqm\" (UniqueName: \"kubernetes.io/projected/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-kube-api-access-sfxqm\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.280690 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-catalog-content\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.381594 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-utilities\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.381678 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxqm\" (UniqueName: \"kubernetes.io/projected/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-kube-api-access-sfxqm\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.381905 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-catalog-content\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.382379 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-catalog-content\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 
15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.382384 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-utilities\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.414584 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxqm\" (UniqueName: \"kubernetes.io/projected/5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b-kube-api-access-sfxqm\") pod \"certified-operators-nn6nx\" (UID: \"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b\") " pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.497416 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:41 crc kubenswrapper[4939]: I0318 15:44:41.549915 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6p9v6" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.749573 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nn6nx"] Mar 18 15:44:42 crc kubenswrapper[4939]: W0318 15:44:41.755447 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2f4fd1_4ecf_4deb_9a6c_1de07faf8a2b.slice/crio-89b3347c47a743f3bfa944ee9b4cc3d168a18c333683a0656ad27afb98ca2098 WatchSource:0}: Error finding container 89b3347c47a743f3bfa944ee9b4cc3d168a18c333683a0656ad27afb98ca2098: Status 404 returned error can't find the container with id 89b3347c47a743f3bfa944ee9b4cc3d168a18c333683a0656ad27afb98ca2098 Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.779383 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ch9n8"] Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.780637 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.784744 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch9n8"] Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.784883 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.889531 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144cb12d-acd2-4981-ac55-e3ae8682cec6-catalog-content\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.889597 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144cb12d-acd2-4981-ac55-e3ae8682cec6-utilities\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.889643 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78p8t\" (UniqueName: \"kubernetes.io/projected/144cb12d-acd2-4981-ac55-e3ae8682cec6-kube-api-access-78p8t\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.990746 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144cb12d-acd2-4981-ac55-e3ae8682cec6-catalog-content\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.990818 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144cb12d-acd2-4981-ac55-e3ae8682cec6-utilities\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.990864 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78p8t\" (UniqueName: \"kubernetes.io/projected/144cb12d-acd2-4981-ac55-e3ae8682cec6-kube-api-access-78p8t\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.991395 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144cb12d-acd2-4981-ac55-e3ae8682cec6-catalog-content\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:41.991472 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144cb12d-acd2-4981-ac55-e3ae8682cec6-utilities\") pod \"community-operators-ch9n8\" (UID: 
\"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.011047 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78p8t\" (UniqueName: \"kubernetes.io/projected/144cb12d-acd2-4981-ac55-e3ae8682cec6-kube-api-access-78p8t\") pod \"community-operators-ch9n8\" (UID: \"144cb12d-acd2-4981-ac55-e3ae8682cec6\") " pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.102169 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.140423 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18db329a-84bc-4bb2-94a4-00053cc542e7" path="/var/lib/kubelet/pods/18db329a-84bc-4bb2-94a4-00053cc542e7/volumes" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.141217 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464" path="/var/lib/kubelet/pods/5e5bcb6d-89cd-4f7d-84b4-4b5f9079e464/volumes" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.141958 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a63cbc-ba9e-44d7-97c7-c15c9c809cdb" path="/var/lib/kubelet/pods/73a63cbc-ba9e-44d7-97c7-c15c9c809cdb/volumes" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.143090 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a596c4-2674-4c46-ab00-c8167b950bc9" path="/var/lib/kubelet/pods/c9a596c4-2674-4c46-ab00-c8167b950bc9/volumes" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.143772 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48cb944-8d0d-4169-aa48-947c2654df5a" path="/var/lib/kubelet/pods/e48cb944-8d0d-4169-aa48-947c2654df5a/volumes" Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.437066 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch9n8"] Mar 18 15:44:42 crc kubenswrapper[4939]: W0318 15:44:42.443701 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod144cb12d_acd2_4981_ac55_e3ae8682cec6.slice/crio-3703b4e3238debb9737c34c034f69d0b60dc41ce216cf24923bc53a849c171b1 WatchSource:0}: Error finding container 3703b4e3238debb9737c34c034f69d0b60dc41ce216cf24923bc53a849c171b1: Status 404 returned error can't find the container with id 3703b4e3238debb9737c34c034f69d0b60dc41ce216cf24923bc53a849c171b1 Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.555220 4939 generic.go:334] "Generic (PLEG): container finished" podID="5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b" containerID="34b4f7b552703b7a7adf7e0d92334c1ecbdd83ada05a1ae5d0af499625ce2403" exitCode=0 Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.555376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn6nx" event={"ID":"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b","Type":"ContainerDied","Data":"34b4f7b552703b7a7adf7e0d92334c1ecbdd83ada05a1ae5d0af499625ce2403"} Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.555407 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn6nx" event={"ID":"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b","Type":"ContainerStarted","Data":"89b3347c47a743f3bfa944ee9b4cc3d168a18c333683a0656ad27afb98ca2098"} 
Mar 18 15:44:42 crc kubenswrapper[4939]: I0318 15:44:42.557839 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch9n8" event={"ID":"144cb12d-acd2-4981-ac55-e3ae8682cec6","Type":"ContainerStarted","Data":"3703b4e3238debb9737c34c034f69d0b60dc41ce216cf24923bc53a849c171b1"}
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.571868 4939 generic.go:334] "Generic (PLEG): container finished" podID="5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b" containerID="ea1396ed30c60b5c71af95ca41348cf99a8ea2499e74c3eeef39afdc97977847" exitCode=0
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.571919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn6nx" event={"ID":"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b","Type":"ContainerDied","Data":"ea1396ed30c60b5c71af95ca41348cf99a8ea2499e74c3eeef39afdc97977847"}
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.581608 4939 generic.go:334] "Generic (PLEG): container finished" podID="144cb12d-acd2-4981-ac55-e3ae8682cec6" containerID="0c7c6999dc7de839d9e433ee9306ecb25a283ca089da6bdd1de1a052b21a642c" exitCode=0
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.581658 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch9n8" event={"ID":"144cb12d-acd2-4981-ac55-e3ae8682cec6","Type":"ContainerDied","Data":"0c7c6999dc7de839d9e433ee9306ecb25a283ca089da6bdd1de1a052b21a642c"}
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.583186 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d2cwz"]
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.586126 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.591841 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2cwz"]
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.594780 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.717764 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrfpj\" (UniqueName: \"kubernetes.io/projected/1264055e-c4dc-4675-a79e-2b158edd8733-kube-api-access-nrfpj\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.717814 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264055e-c4dc-4675-a79e-2b158edd8733-catalog-content\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.717841 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264055e-c4dc-4675-a79e-2b158edd8733-utilities\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.819553 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrfpj\" (UniqueName: \"kubernetes.io/projected/1264055e-c4dc-4675-a79e-2b158edd8733-kube-api-access-nrfpj\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.820103 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264055e-c4dc-4675-a79e-2b158edd8733-catalog-content\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.820313 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264055e-c4dc-4675-a79e-2b158edd8733-utilities\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.820746 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264055e-c4dc-4675-a79e-2b158edd8733-catalog-content\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.821239 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264055e-c4dc-4675-a79e-2b158edd8733-utilities\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.843236 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrfpj\" (UniqueName: \"kubernetes.io/projected/1264055e-c4dc-4675-a79e-2b158edd8733-kube-api-access-nrfpj\") pod \"redhat-marketplace-d2cwz\" (UID: \"1264055e-c4dc-4675-a79e-2b158edd8733\") " pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:43 crc kubenswrapper[4939]: I0318 15:44:43.928144 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2cwz"
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.177246 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r86xf"]
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.178519 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r86xf"
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.185473 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.215613 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r86xf"]
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.221106 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2cwz"]
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.326975 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-utilities\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf"
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.327333 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-catalog-content\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf"
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.327370 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6qms\" (UniqueName: \"kubernetes.io/projected/da34fc0b-1bbf-40ea-aa12-536963bcac3a-kube-api-access-t6qms\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf"
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.372971 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fkt4l"]
Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.373558 4939 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.389168 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fkt4l"] Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.428617 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-utilities\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.428696 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-catalog-content\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.428730 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6qms\" (UniqueName: \"kubernetes.io/projected/da34fc0b-1bbf-40ea-aa12-536963bcac3a-kube-api-access-t6qms\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.429193 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-utilities\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.429240 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-catalog-content\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.452652 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6qms\" (UniqueName: \"kubernetes.io/projected/da34fc0b-1bbf-40ea-aa12-536963bcac3a-kube-api-access-t6qms\") pod \"redhat-operators-r86xf\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.529713 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde6b933-b707-4720-abe0-3d90f9e03108-registry-certificates\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.529776 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde6b933-b707-4720-abe0-3d90f9e03108-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.529804 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgbdx\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-kube-api-access-lgbdx\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.529831 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde6b933-b707-4720-abe0-3d90f9e03108-trusted-ca\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.529909 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-registry-tls\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.529937 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bde6b933-b707-4720-abe0-3d90f9e03108-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.529959 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-bound-sa-token\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.530024 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.546730 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.558285 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.590382 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch9n8" event={"ID":"144cb12d-acd2-4981-ac55-e3ae8682cec6","Type":"ContainerStarted","Data":"291c62f2fffbda8247b0151af5aa2d50d109a2b72f403c70cccbef0e17963db9"} Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.594403 4939 generic.go:334] "Generic (PLEG): container finished" podID="1264055e-c4dc-4675-a79e-2b158edd8733" containerID="28fafb8693f5e6603e26d7dda9b2359c8b270f3d064bd0734edb51813f0ff836" exitCode=0 Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.594455 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2cwz" event={"ID":"1264055e-c4dc-4675-a79e-2b158edd8733","Type":"ContainerDied","Data":"28fafb8693f5e6603e26d7dda9b2359c8b270f3d064bd0734edb51813f0ff836"} Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.594475 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2cwz" event={"ID":"1264055e-c4dc-4675-a79e-2b158edd8733","Type":"ContainerStarted","Data":"b5e8706a42764ab288f115c0692c3fc56efc32fdfe122b0714689b6ab8eac6a5"} Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.598409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn6nx" event={"ID":"5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b","Type":"ContainerStarted","Data":"3819dd396cd58f4ab0a795cd836ae69f210bb623b501d2aa64b88e95f66f6682"} Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.631653 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde6b933-b707-4720-abe0-3d90f9e03108-registry-certificates\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.631695 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde6b933-b707-4720-abe0-3d90f9e03108-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.631730 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgbdx\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-kube-api-access-lgbdx\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.631767 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bde6b933-b707-4720-abe0-3d90f9e03108-trusted-ca\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.631796 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-registry-tls\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.631824 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bde6b933-b707-4720-abe0-3d90f9e03108-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.631842 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-bound-sa-token\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.633334 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde6b933-b707-4720-abe0-3d90f9e03108-trusted-ca\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.633923 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bde6b933-b707-4720-abe0-3d90f9e03108-registry-certificates\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.634549 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bde6b933-b707-4720-abe0-3d90f9e03108-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.634817 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nn6nx" podStartSLOduration=2.218772597 podStartE2EDuration="3.634791666s" podCreationTimestamp="2026-03-18 15:44:41 +0000 UTC" firstStartedPulling="2026-03-18 15:44:42.556718742 +0000 UTC m=+447.155906363" lastFinishedPulling="2026-03-18 15:44:43.972737811 +0000 UTC m=+448.571925432" observedRunningTime="2026-03-18 15:44:44.631315515 +0000 UTC m=+449.230503156" watchObservedRunningTime="2026-03-18 15:44:44.634791666 +0000 UTC m=+449.233979287" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.637733 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-registry-tls\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.638450 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bde6b933-b707-4720-abe0-3d90f9e03108-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.663709 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-bound-sa-token\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.672860 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgbdx\" (UniqueName: \"kubernetes.io/projected/bde6b933-b707-4720-abe0-3d90f9e03108-kube-api-access-lgbdx\") pod \"image-registry-66df7c8f76-fkt4l\" (UID: \"bde6b933-b707-4720-abe0-3d90f9e03108\") " pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.704844 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.910849 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fkt4l"] Mar 18 15:44:44 crc kubenswrapper[4939]: W0318 15:44:44.920962 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde6b933_b707_4720_abe0_3d90f9e03108.slice/crio-0a1594d02cd15cbec56c8becc41ffffd95be5438fa5786084f06480a06ac813c WatchSource:0}: Error finding container 0a1594d02cd15cbec56c8becc41ffffd95be5438fa5786084f06480a06ac813c: Status 404 returned error can't find the container with id 0a1594d02cd15cbec56c8becc41ffffd95be5438fa5786084f06480a06ac813c Mar 18 15:44:44 crc kubenswrapper[4939]: I0318 15:44:44.951943 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r86xf"] Mar 18 15:44:44 crc kubenswrapper[4939]: W0318 15:44:44.959164 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda34fc0b_1bbf_40ea_aa12_536963bcac3a.slice/crio-b25b784dba148374c46c7dfe9ddedc09c48a138a24383360ea3b3c6d05e12df3 WatchSource:0}: Error finding container b25b784dba148374c46c7dfe9ddedc09c48a138a24383360ea3b3c6d05e12df3: Status 404 returned error can't find the container with id b25b784dba148374c46c7dfe9ddedc09c48a138a24383360ea3b3c6d05e12df3 Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.606120 4939 generic.go:334] "Generic (PLEG): container finished" podID="144cb12d-acd2-4981-ac55-e3ae8682cec6" containerID="291c62f2fffbda8247b0151af5aa2d50d109a2b72f403c70cccbef0e17963db9" exitCode=0 Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.606537 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch9n8" 
event={"ID":"144cb12d-acd2-4981-ac55-e3ae8682cec6","Type":"ContainerDied","Data":"291c62f2fffbda8247b0151af5aa2d50d109a2b72f403c70cccbef0e17963db9"} Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.616334 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2cwz" event={"ID":"1264055e-c4dc-4675-a79e-2b158edd8733","Type":"ContainerStarted","Data":"093da67f87f1b1ac6199ef76850c5a10cf3c4e236e61720ec5b8949e3214adaa"} Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.621824 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" event={"ID":"bde6b933-b707-4720-abe0-3d90f9e03108","Type":"ContainerStarted","Data":"50e11989217690fa23d30ed65b1601fb1ad290afa83b94707c879aab837b36a3"} Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.621875 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" event={"ID":"bde6b933-b707-4720-abe0-3d90f9e03108","Type":"ContainerStarted","Data":"0a1594d02cd15cbec56c8becc41ffffd95be5438fa5786084f06480a06ac813c"} Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.621967 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.628757 4939 generic.go:334] "Generic (PLEG): container finished" podID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerID="fa3f687c31e512fe3b23789960ec8fa000d2d83b2140a9a744d079de5cabdbae" exitCode=0 Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.628820 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r86xf" event={"ID":"da34fc0b-1bbf-40ea-aa12-536963bcac3a","Type":"ContainerDied","Data":"fa3f687c31e512fe3b23789960ec8fa000d2d83b2140a9a744d079de5cabdbae"} Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.628866 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r86xf" event={"ID":"da34fc0b-1bbf-40ea-aa12-536963bcac3a","Type":"ContainerStarted","Data":"b25b784dba148374c46c7dfe9ddedc09c48a138a24383360ea3b3c6d05e12df3"} Mar 18 15:44:45 crc kubenswrapper[4939]: I0318 15:44:45.669072 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" podStartSLOduration=1.6690577279999999 podStartE2EDuration="1.669057728s" podCreationTimestamp="2026-03-18 15:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:44:45.666120363 +0000 UTC m=+450.265308004" watchObservedRunningTime="2026-03-18 15:44:45.669057728 +0000 UTC m=+450.268245349" Mar 18 15:44:46 crc kubenswrapper[4939]: I0318 15:44:46.636324 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch9n8" event={"ID":"144cb12d-acd2-4981-ac55-e3ae8682cec6","Type":"ContainerStarted","Data":"fbdb6994a0736e491867df75663b9126f4b77c65f93a7c7aae6911fb524c7b10"} Mar 18 15:44:46 crc kubenswrapper[4939]: I0318 15:44:46.638830 4939 generic.go:334] "Generic (PLEG): container finished" podID="1264055e-c4dc-4675-a79e-2b158edd8733" containerID="093da67f87f1b1ac6199ef76850c5a10cf3c4e236e61720ec5b8949e3214adaa" exitCode=0 Mar 18 15:44:46 crc kubenswrapper[4939]: I0318 15:44:46.638957 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-d2cwz" event={"ID":"1264055e-c4dc-4675-a79e-2b158edd8733","Type":"ContainerDied","Data":"093da67f87f1b1ac6199ef76850c5a10cf3c4e236e61720ec5b8949e3214adaa"} Mar 18 15:44:46 crc kubenswrapper[4939]: I0318 15:44:46.639012 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2cwz" event={"ID":"1264055e-c4dc-4675-a79e-2b158edd8733","Type":"ContainerStarted","Data":"76c73f2c41295a0cc02952f8f736ece222f450cb9c30ddfe96a6c36da6c4d717"} Mar 18 15:44:46 crc kubenswrapper[4939]: I0318 15:44:46.656399 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ch9n8" podStartSLOduration=3.142098676 podStartE2EDuration="5.656375129s" podCreationTimestamp="2026-03-18 15:44:41 +0000 UTC" firstStartedPulling="2026-03-18 15:44:43.583201107 +0000 UTC m=+448.182388758" lastFinishedPulling="2026-03-18 15:44:46.09747759 +0000 UTC m=+450.696665211" observedRunningTime="2026-03-18 15:44:46.655362939 +0000 UTC m=+451.254550570" watchObservedRunningTime="2026-03-18 15:44:46.656375129 +0000 UTC m=+451.255562760" Mar 18 15:44:46 crc kubenswrapper[4939]: I0318 15:44:46.677339 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d2cwz" podStartSLOduration=2.11968983 podStartE2EDuration="3.677319011s" podCreationTimestamp="2026-03-18 15:44:43 +0000 UTC" firstStartedPulling="2026-03-18 15:44:44.597064323 +0000 UTC m=+449.196251944" lastFinishedPulling="2026-03-18 15:44:46.154693474 +0000 UTC m=+450.753881125" observedRunningTime="2026-03-18 15:44:46.674648033 +0000 UTC m=+451.273835684" watchObservedRunningTime="2026-03-18 15:44:46.677319011 +0000 UTC m=+451.276506632" Mar 18 15:44:47 crc kubenswrapper[4939]: I0318 15:44:47.658213 4939 generic.go:334] "Generic (PLEG): container finished" podID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerID="2958f486e59c88a81b55051d15a9fdeaf1fb6b4f4b2635c9ad6b3647671e82eb" exitCode=0 Mar 18 15:44:47 crc kubenswrapper[4939]: I0318 15:44:47.658287 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r86xf" event={"ID":"da34fc0b-1bbf-40ea-aa12-536963bcac3a","Type":"ContainerDied","Data":"2958f486e59c88a81b55051d15a9fdeaf1fb6b4f4b2635c9ad6b3647671e82eb"} Mar 18 15:44:48 crc kubenswrapper[4939]: I0318 15:44:48.668958 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r86xf" event={"ID":"da34fc0b-1bbf-40ea-aa12-536963bcac3a","Type":"ContainerStarted","Data":"aac46ba8f8dbb47c1ec077b7f7afe2a19dd05cbe13287723809b110361994f0d"} Mar 18 15:44:48 crc kubenswrapper[4939]: I0318 15:44:48.686157 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r86xf" podStartSLOduration=2.27596803 podStartE2EDuration="4.686142s" podCreationTimestamp="2026-03-18 15:44:44 +0000 UTC" firstStartedPulling="2026-03-18 15:44:45.635139916 +0000 UTC m=+450.234327537" lastFinishedPulling="2026-03-18 15:44:48.045313876 +0000 UTC m=+452.644501507" observedRunningTime="2026-03-18 15:44:48.683975766 +0000 UTC m=+453.283163437" watchObservedRunningTime="2026-03-18 15:44:48.686142 +0000 UTC m=+453.285329631" Mar 18 15:44:51 crc kubenswrapper[4939]: I0318 15:44:51.497957 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:51 crc kubenswrapper[4939]: I0318 
15:44:51.498022 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:51 crc kubenswrapper[4939]: I0318 15:44:51.546814 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:51 crc kubenswrapper[4939]: I0318 15:44:51.739379 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nn6nx" Mar 18 15:44:52 crc kubenswrapper[4939]: I0318 15:44:52.103022 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:52 crc kubenswrapper[4939]: I0318 15:44:52.103079 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:52 crc kubenswrapper[4939]: I0318 15:44:52.149372 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:52 crc kubenswrapper[4939]: I0318 15:44:52.758628 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ch9n8" Mar 18 15:44:53 crc kubenswrapper[4939]: I0318 15:44:53.688076 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:44:53 crc kubenswrapper[4939]: I0318 15:44:53.688188 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:44:53 crc kubenswrapper[4939]: I0318 15:44:53.928960 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d2cwz" Mar 18 15:44:53 crc kubenswrapper[4939]: I0318 15:44:53.929079 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d2cwz" Mar 18 15:44:53 crc kubenswrapper[4939]: I0318 15:44:53.983029 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d2cwz" Mar 18 15:44:54 crc kubenswrapper[4939]: I0318 15:44:54.547034 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:54 crc kubenswrapper[4939]: I0318 15:44:54.547338 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:44:54 crc kubenswrapper[4939]: I0318 15:44:54.755176 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d2cwz" Mar 18 15:44:55 crc kubenswrapper[4939]: I0318 15:44:55.590117 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r86xf" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="registry-server" probeResult="failure" output=< Mar 18 15:44:55 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 15:44:55 crc 
kubenswrapper[4939]: > Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.149210 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2"] Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.151021 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.154037 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.156726 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.186703 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2"] Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.244787 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b727364-2e38-4dd3-9a89-fc571104ebe9-secret-volume\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.244916 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/5b727364-2e38-4dd3-9a89-fc571104ebe9-kube-api-access-5rmkm\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.245023 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b727364-2e38-4dd3-9a89-fc571104ebe9-config-volume\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.346087 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/5b727364-2e38-4dd3-9a89-fc571104ebe9-kube-api-access-5rmkm\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.346180 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b727364-2e38-4dd3-9a89-fc571104ebe9-config-volume\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.346252 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b727364-2e38-4dd3-9a89-fc571104ebe9-secret-volume\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.347647 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b727364-2e38-4dd3-9a89-fc571104ebe9-config-volume\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.355034 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b727364-2e38-4dd3-9a89-fc571104ebe9-secret-volume\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.366526 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/5b727364-2e38-4dd3-9a89-fc571104ebe9-kube-api-access-5rmkm\") pod \"collect-profiles-29564145-w5hx2\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.499624 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:00 crc kubenswrapper[4939]: I0318 15:45:00.930127 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2"] Mar 18 15:45:00 crc kubenswrapper[4939]: W0318 15:45:00.935702 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b727364_2e38_4dd3_9a89_fc571104ebe9.slice/crio-73369cd1e363c7a193c17e9c00c604646eb4d130f9cca3db14beaf769af345e5 WatchSource:0}: Error finding container 73369cd1e363c7a193c17e9c00c604646eb4d130f9cca3db14beaf769af345e5: Status 404 returned error can't find the container with id 73369cd1e363c7a193c17e9c00c604646eb4d130f9cca3db14beaf769af345e5 Mar 18 15:45:01 crc kubenswrapper[4939]: I0318 15:45:01.749726 4939 generic.go:334] "Generic (PLEG): container finished" podID="5b727364-2e38-4dd3-9a89-fc571104ebe9" containerID="f588d6951f7649ae8a119e35d427a2cf8840360e83fcde7cefdbd2455cb7316c" exitCode=0 Mar 18 15:45:01 crc kubenswrapper[4939]: I0318 15:45:01.749803 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" event={"ID":"5b727364-2e38-4dd3-9a89-fc571104ebe9","Type":"ContainerDied","Data":"f588d6951f7649ae8a119e35d427a2cf8840360e83fcde7cefdbd2455cb7316c"} Mar 18 15:45:01 crc kubenswrapper[4939]: I0318 15:45:01.750147 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" event={"ID":"5b727364-2e38-4dd3-9a89-fc571104ebe9","Type":"ContainerStarted","Data":"73369cd1e363c7a193c17e9c00c604646eb4d130f9cca3db14beaf769af345e5"} Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.056483 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.197009 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b727364-2e38-4dd3-9a89-fc571104ebe9-secret-volume\") pod \"5b727364-2e38-4dd3-9a89-fc571104ebe9\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.197094 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b727364-2e38-4dd3-9a89-fc571104ebe9-config-volume\") pod \"5b727364-2e38-4dd3-9a89-fc571104ebe9\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.197123 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/5b727364-2e38-4dd3-9a89-fc571104ebe9-kube-api-access-5rmkm\") pod \"5b727364-2e38-4dd3-9a89-fc571104ebe9\" (UID: \"5b727364-2e38-4dd3-9a89-fc571104ebe9\") " Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.197973 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b727364-2e38-4dd3-9a89-fc571104ebe9-config-volume" (OuterVolumeSpecName: "config-volume") pod "5b727364-2e38-4dd3-9a89-fc571104ebe9" (UID: "5b727364-2e38-4dd3-9a89-fc571104ebe9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.205925 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b727364-2e38-4dd3-9a89-fc571104ebe9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5b727364-2e38-4dd3-9a89-fc571104ebe9" (UID: "5b727364-2e38-4dd3-9a89-fc571104ebe9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.206377 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b727364-2e38-4dd3-9a89-fc571104ebe9-kube-api-access-5rmkm" (OuterVolumeSpecName: "kube-api-access-5rmkm") pod "5b727364-2e38-4dd3-9a89-fc571104ebe9" (UID: "5b727364-2e38-4dd3-9a89-fc571104ebe9"). InnerVolumeSpecName "kube-api-access-5rmkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.298087 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b727364-2e38-4dd3-9a89-fc571104ebe9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.298136 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b727364-2e38-4dd3-9a89-fc571104ebe9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.298148 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rmkm\" (UniqueName: \"kubernetes.io/projected/5b727364-2e38-4dd3-9a89-fc571104ebe9-kube-api-access-5rmkm\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.766041 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" event={"ID":"5b727364-2e38-4dd3-9a89-fc571104ebe9","Type":"ContainerDied","Data":"73369cd1e363c7a193c17e9c00c604646eb4d130f9cca3db14beaf769af345e5"} Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.766077 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73369cd1e363c7a193c17e9c00c604646eb4d130f9cca3db14beaf769af345e5" Mar 18 15:45:03 crc kubenswrapper[4939]: I0318 15:45:03.766117 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2" Mar 18 15:45:04 crc kubenswrapper[4939]: I0318 15:45:04.622678 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:45:04 crc kubenswrapper[4939]: I0318 15:45:04.679866 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 15:45:04 crc kubenswrapper[4939]: I0318 15:45:04.716204 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fkt4l" Mar 18 15:45:04 crc kubenswrapper[4939]: I0318 15:45:04.771068 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t62m5"] Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.687206 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.687915 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.687983 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.688877 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"25c45d0482bcfb57b4acc9de3abe36c8d204cdc1c2752c823face6bf93abc88b"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.688992 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://25c45d0482bcfb57b4acc9de3abe36c8d204cdc1c2752c823face6bf93abc88b" gracePeriod=600 Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.891913 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="25c45d0482bcfb57b4acc9de3abe36c8d204cdc1c2752c823face6bf93abc88b" exitCode=0 Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.891975 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"25c45d0482bcfb57b4acc9de3abe36c8d204cdc1c2752c823face6bf93abc88b"} Mar 18 15:45:23 crc kubenswrapper[4939]: I0318 15:45:23.892035 4939 scope.go:117] "RemoveContainer" containerID="c7e7a75f3983599f383fe21540117c2acdd02f19d35ade744e354589c23d9999" Mar 18 15:45:24 crc kubenswrapper[4939]: I0318 15:45:24.901834 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"f8d46afefccd1cc408a266166601816a4ee5a3355e3992c3936cd8a9ae1e06fd"} Mar 18 15:45:29 crc kubenswrapper[4939]: I0318 15:45:29.813494 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" podUID="0048288d-ec58-4cf8-a68a-b73b98db9d01" containerName="registry" containerID="cri-o://c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827" gracePeriod=30 Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.248866 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.394364 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-certificates\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.394761 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7dpx\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-kube-api-access-d7dpx\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.394820 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0048288d-ec58-4cf8-a68a-b73b98db9d01-ca-trust-extracted\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.394861 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0048288d-ec58-4cf8-a68a-b73b98db9d01-installation-pull-secrets\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.394897 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-tls\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.394930 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-bound-sa-token\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.395057 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.395091 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-trusted-ca\") pod \"0048288d-ec58-4cf8-a68a-b73b98db9d01\" (UID: \"0048288d-ec58-4cf8-a68a-b73b98db9d01\") " Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.396105 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.396262 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.401767 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0048288d-ec58-4cf8-a68a-b73b98db9d01-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.401779 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.402480 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-kube-api-access-d7dpx" (OuterVolumeSpecName: "kube-api-access-d7dpx") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "kube-api-access-d7dpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.411972 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.413583 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0048288d-ec58-4cf8-a68a-b73b98db9d01-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.413843 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0048288d-ec58-4cf8-a68a-b73b98db9d01" (UID: "0048288d-ec58-4cf8-a68a-b73b98db9d01"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.496768 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7dpx\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-kube-api-access-d7dpx\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.496815 4939 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0048288d-ec58-4cf8-a68a-b73b98db9d01-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.496829 4939 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0048288d-ec58-4cf8-a68a-b73b98db9d01-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.496842 4939 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.496854 4939 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0048288d-ec58-4cf8-a68a-b73b98db9d01-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.496866 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.496879 4939 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0048288d-ec58-4cf8-a68a-b73b98db9d01-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.945382 4939 generic.go:334] "Generic (PLEG): container finished" podID="0048288d-ec58-4cf8-a68a-b73b98db9d01" containerID="c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827" exitCode=0 Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.945448 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" event={"ID":"0048288d-ec58-4cf8-a68a-b73b98db9d01","Type":"ContainerDied","Data":"c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827"} Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.945473 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.945555 4939 scope.go:117] "RemoveContainer" containerID="c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.945491 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t62m5" event={"ID":"0048288d-ec58-4cf8-a68a-b73b98db9d01","Type":"ContainerDied","Data":"9bfef898c511e896be4628e78b58fe79b36a8faaf782f80593b690a2d69e8d67"} Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.975420 4939 scope.go:117] "RemoveContainer" containerID="c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827" Mar 18 15:45:30 crc kubenswrapper[4939]: E0318 15:45:30.976200 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827\": container with ID starting with c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827 not found: ID does not exist" containerID="c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.976251 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827"} err="failed to get container status \"c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827\": rpc error: code = NotFound desc = could not find container \"c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827\": container with ID starting with c443c2b3d9d23006c66af955b77259a17801c7a604068ce805f73089e0714827 not found: ID does not exist" Mar 18 15:45:30 crc kubenswrapper[4939]: I0318 15:45:30.998366 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t62m5"] Mar 18 15:45:31 crc kubenswrapper[4939]: I0318 15:45:31.006312 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t62m5"] Mar 18 15:45:32 crc kubenswrapper[4939]: I0318 15:45:32.141436 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0048288d-ec58-4cf8-a68a-b73b98db9d01" path="/var/lib/kubelet/pods/0048288d-ec58-4cf8-a68a-b73b98db9d01/volumes" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.146203 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564146-9mxtc"] Mar 18 15:46:00 crc kubenswrapper[4939]: E0318 15:46:00.147243 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b727364-2e38-4dd3-9a89-fc571104ebe9" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.147267 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b727364-2e38-4dd3-9a89-fc571104ebe9" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4939]: E0318 15:46:00.147302 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0048288d-ec58-4cf8-a68a-b73b98db9d01" containerName="registry" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.147315 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0048288d-ec58-4cf8-a68a-b73b98db9d01" containerName="registry" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.147470 4939 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0048288d-ec58-4cf8-a68a-b73b98db9d01" containerName="registry" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.147493 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b727364-2e38-4dd3-9a89-fc571104ebe9" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.149373 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.153940 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.154433 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.155007 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.160402 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-9mxtc"] Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.313887 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7z2r\" (UniqueName: \"kubernetes.io/projected/f2e4874d-a1a5-4433-aea0-782d689ce0f1-kube-api-access-n7z2r\") pod \"auto-csr-approver-29564146-9mxtc\" (UID: \"f2e4874d-a1a5-4433-aea0-782d689ce0f1\") " pod="openshift-infra/auto-csr-approver-29564146-9mxtc" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.415098 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7z2r\" (UniqueName: \"kubernetes.io/projected/f2e4874d-a1a5-4433-aea0-782d689ce0f1-kube-api-access-n7z2r\") pod \"auto-csr-approver-29564146-9mxtc\" (UID: \"f2e4874d-a1a5-4433-aea0-782d689ce0f1\") " pod="openshift-infra/auto-csr-approver-29564146-9mxtc" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.433855 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7z2r\" (UniqueName: \"kubernetes.io/projected/f2e4874d-a1a5-4433-aea0-782d689ce0f1-kube-api-access-n7z2r\") pod \"auto-csr-approver-29564146-9mxtc\" (UID: \"f2e4874d-a1a5-4433-aea0-782d689ce0f1\") " pod="openshift-infra/auto-csr-approver-29564146-9mxtc" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.487761 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.730710 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-9mxtc"] Mar 18 15:46:00 crc kubenswrapper[4939]: I0318 15:46:00.738459 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:46:01 crc kubenswrapper[4939]: I0318 15:46:01.163039 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" event={"ID":"f2e4874d-a1a5-4433-aea0-782d689ce0f1","Type":"ContainerStarted","Data":"bed6ad43f409a12f3e6febdd83a513ba5a9bdf2d876f4fcdff01789234f17256"} Mar 18 15:46:02 crc kubenswrapper[4939]: I0318 15:46:02.171089 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" event={"ID":"f2e4874d-a1a5-4433-aea0-782d689ce0f1","Type":"ContainerStarted","Data":"4421ac7535cc87a2236360b8433184f5ebb164b2c5dde8fd8ef44e910b5c4f86"} Mar 18 15:46:02 crc kubenswrapper[4939]: I0318 15:46:02.186024 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" podStartSLOduration=1.074490064 podStartE2EDuration="2.186009289s" podCreationTimestamp="2026-03-18 15:46:00 +0000 UTC" firstStartedPulling="2026-03-18 15:46:00.738274865 +0000 UTC m=+525.337462486" lastFinishedPulling="2026-03-18 15:46:01.84979406 +0000 UTC m=+526.448981711" observedRunningTime="2026-03-18 15:46:02.181742303 +0000 UTC m=+526.780929934" watchObservedRunningTime="2026-03-18 15:46:02.186009289 +0000 UTC m=+526.785196920" Mar 18 15:46:03 crc kubenswrapper[4939]: I0318 15:46:03.180396 4939 generic.go:334] "Generic (PLEG): container finished" podID="f2e4874d-a1a5-4433-aea0-782d689ce0f1" containerID="4421ac7535cc87a2236360b8433184f5ebb164b2c5dde8fd8ef44e910b5c4f86" exitCode=0 Mar 18 15:46:03 crc kubenswrapper[4939]: I0318 15:46:03.180450 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" event={"ID":"f2e4874d-a1a5-4433-aea0-782d689ce0f1","Type":"ContainerDied","Data":"4421ac7535cc87a2236360b8433184f5ebb164b2c5dde8fd8ef44e910b5c4f86"} Mar 18 15:46:04 crc kubenswrapper[4939]: I0318 15:46:04.402079 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" Mar 18 15:46:04 crc kubenswrapper[4939]: I0318 15:46:04.568211 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7z2r\" (UniqueName: \"kubernetes.io/projected/f2e4874d-a1a5-4433-aea0-782d689ce0f1-kube-api-access-n7z2r\") pod \"f2e4874d-a1a5-4433-aea0-782d689ce0f1\" (UID: \"f2e4874d-a1a5-4433-aea0-782d689ce0f1\") " Mar 18 15:46:04 crc kubenswrapper[4939]: I0318 15:46:04.579824 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e4874d-a1a5-4433-aea0-782d689ce0f1-kube-api-access-n7z2r" (OuterVolumeSpecName: "kube-api-access-n7z2r") pod "f2e4874d-a1a5-4433-aea0-782d689ce0f1" (UID: "f2e4874d-a1a5-4433-aea0-782d689ce0f1"). InnerVolumeSpecName "kube-api-access-n7z2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:46:04 crc kubenswrapper[4939]: I0318 15:46:04.669464 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7z2r\" (UniqueName: \"kubernetes.io/projected/f2e4874d-a1a5-4433-aea0-782d689ce0f1-kube-api-access-n7z2r\") on node \"crc\" DevicePath \"\"" Mar 18 15:46:05 crc kubenswrapper[4939]: I0318 15:46:05.196547 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" event={"ID":"f2e4874d-a1a5-4433-aea0-782d689ce0f1","Type":"ContainerDied","Data":"bed6ad43f409a12f3e6febdd83a513ba5a9bdf2d876f4fcdff01789234f17256"} Mar 18 15:46:05 crc kubenswrapper[4939]: I0318 15:46:05.196596 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bed6ad43f409a12f3e6febdd83a513ba5a9bdf2d876f4fcdff01789234f17256" Mar 18 15:46:05 crc kubenswrapper[4939]: I0318 15:46:05.196650 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-9mxtc" Mar 18 15:46:05 crc kubenswrapper[4939]: I0318 15:46:05.258419 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vqqr2"] Mar 18 15:46:05 crc kubenswrapper[4939]: I0318 15:46:05.267343 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-vqqr2"] Mar 18 15:46:06 crc kubenswrapper[4939]: I0318 15:46:06.144376 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330c585e-3a67-4502-b800-7401df959334" path="/var/lib/kubelet/pods/330c585e-3a67-4502-b800-7401df959334/volumes" Mar 18 15:47:23 crc kubenswrapper[4939]: I0318 15:47:23.687686 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:47:23 crc kubenswrapper[4939]: I0318 15:47:23.688652 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:47:53 crc kubenswrapper[4939]: I0318 15:47:53.687907 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:47:53 crc kubenswrapper[4939]: I0318 15:47:53.688634 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.130886 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564148-w7npf"] Mar 18 15:48:00 crc kubenswrapper[4939]: E0318 15:48:00.131697 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e4874d-a1a5-4433-aea0-782d689ce0f1" containerName="oc" Mar 18 15:48:00 crc 
kubenswrapper[4939]: I0318 15:48:00.131711 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e4874d-a1a5-4433-aea0-782d689ce0f1" containerName="oc" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.131799 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e4874d-a1a5-4433-aea0-782d689ce0f1" containerName="oc" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.132401 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-w7npf" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.136357 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjbrx\" (UniqueName: \"kubernetes.io/projected/49714dd1-eccd-480b-836b-29c1e2c6eb83-kube-api-access-bjbrx\") pod \"auto-csr-approver-29564148-w7npf\" (UID: \"49714dd1-eccd-480b-836b-29c1e2c6eb83\") " pod="openshift-infra/auto-csr-approver-29564148-w7npf" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.137998 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.138074 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.138170 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.140625 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-w7npf"] Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.237698 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjbrx\" (UniqueName: \"kubernetes.io/projected/49714dd1-eccd-480b-836b-29c1e2c6eb83-kube-api-access-bjbrx\") pod \"auto-csr-approver-29564148-w7npf\" (UID: \"49714dd1-eccd-480b-836b-29c1e2c6eb83\") " pod="openshift-infra/auto-csr-approver-29564148-w7npf" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.264196 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjbrx\" (UniqueName: \"kubernetes.io/projected/49714dd1-eccd-480b-836b-29c1e2c6eb83-kube-api-access-bjbrx\") pod \"auto-csr-approver-29564148-w7npf\" (UID: \"49714dd1-eccd-480b-836b-29c1e2c6eb83\") " pod="openshift-infra/auto-csr-approver-29564148-w7npf" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.455801 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-w7npf" Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.651183 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-w7npf"] Mar 18 15:48:00 crc kubenswrapper[4939]: I0318 15:48:00.912347 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-w7npf" event={"ID":"49714dd1-eccd-480b-836b-29c1e2c6eb83","Type":"ContainerStarted","Data":"433889c4fb0842a31961c1dca31bbc578adb79f8cf9e1dfafeec67d170f885d2"} Mar 18 15:48:01 crc kubenswrapper[4939]: I0318 15:48:01.932448 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-w7npf" event={"ID":"49714dd1-eccd-480b-836b-29c1e2c6eb83","Type":"ContainerStarted","Data":"ad94198ca87be1d20d9c756385962f09f0ca08a813c461d3b89ff528e7b56e0f"} Mar 18 15:48:01 crc kubenswrapper[4939]: I0318 15:48:01.947553 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564148-w7npf" podStartSLOduration=1.036279058 podStartE2EDuration="1.947530532s" podCreationTimestamp="2026-03-18 15:48:00 +0000 UTC" firstStartedPulling="2026-03-18 15:48:00.662389109 +0000 UTC m=+645.261576730" lastFinishedPulling="2026-03-18 15:48:01.573640583 +0000 UTC m=+646.172828204" observedRunningTime="2026-03-18 15:48:01.946681417 +0000 UTC m=+646.545869038" watchObservedRunningTime="2026-03-18 15:48:01.947530532 +0000 UTC m=+646.546718173" Mar 18 15:48:02 crc kubenswrapper[4939]: I0318 15:48:02.941450 4939 generic.go:334] "Generic (PLEG): container finished" podID="49714dd1-eccd-480b-836b-29c1e2c6eb83" containerID="ad94198ca87be1d20d9c756385962f09f0ca08a813c461d3b89ff528e7b56e0f" exitCode=0 Mar 18 15:48:02 crc kubenswrapper[4939]: I0318 15:48:02.941767 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-w7npf" event={"ID":"49714dd1-eccd-480b-836b-29c1e2c6eb83","Type":"ContainerDied","Data":"ad94198ca87be1d20d9c756385962f09f0ca08a813c461d3b89ff528e7b56e0f"} Mar 18 15:48:04 crc kubenswrapper[4939]: I0318 15:48:04.177234 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-w7npf" Mar 18 15:48:04 crc kubenswrapper[4939]: I0318 15:48:04.193097 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjbrx\" (UniqueName: \"kubernetes.io/projected/49714dd1-eccd-480b-836b-29c1e2c6eb83-kube-api-access-bjbrx\") pod \"49714dd1-eccd-480b-836b-29c1e2c6eb83\" (UID: \"49714dd1-eccd-480b-836b-29c1e2c6eb83\") " Mar 18 15:48:04 crc kubenswrapper[4939]: I0318 15:48:04.202109 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49714dd1-eccd-480b-836b-29c1e2c6eb83-kube-api-access-bjbrx" (OuterVolumeSpecName: "kube-api-access-bjbrx") pod "49714dd1-eccd-480b-836b-29c1e2c6eb83" (UID: "49714dd1-eccd-480b-836b-29c1e2c6eb83"). InnerVolumeSpecName "kube-api-access-bjbrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:48:04 crc kubenswrapper[4939]: I0318 15:48:04.293925 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjbrx\" (UniqueName: \"kubernetes.io/projected/49714dd1-eccd-480b-836b-29c1e2c6eb83-kube-api-access-bjbrx\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:04 crc kubenswrapper[4939]: I0318 15:48:04.956580 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-w7npf" event={"ID":"49714dd1-eccd-480b-836b-29c1e2c6eb83","Type":"ContainerDied","Data":"433889c4fb0842a31961c1dca31bbc578adb79f8cf9e1dfafeec67d170f885d2"} Mar 18 15:48:04 crc kubenswrapper[4939]: I0318 15:48:04.956633 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="433889c4fb0842a31961c1dca31bbc578adb79f8cf9e1dfafeec67d170f885d2" Mar 18 15:48:04 crc kubenswrapper[4939]: I0318 15:48:04.956637 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-w7npf" Mar 18 15:48:05 crc kubenswrapper[4939]: I0318 15:48:05.004459 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-9rlv8"] Mar 18 15:48:05 crc kubenswrapper[4939]: I0318 15:48:05.009381 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-9rlv8"] Mar 18 15:48:06 crc kubenswrapper[4939]: I0318 15:48:06.140158 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6573991b-28f7-4030-8a1b-734a3a8e37a7" path="/var/lib/kubelet/pods/6573991b-28f7-4030-8a1b-734a3a8e37a7/volumes" Mar 18 15:48:23 crc kubenswrapper[4939]: I0318 15:48:23.687556 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:48:23 crc kubenswrapper[4939]: I0318 15:48:23.688067 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:48:23 crc kubenswrapper[4939]: I0318 15:48:23.688111 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:48:23 crc kubenswrapper[4939]: I0318 15:48:23.688730 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8d46afefccd1cc408a266166601816a4ee5a3355e3992c3936cd8a9ae1e06fd"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:48:23 crc kubenswrapper[4939]: I0318 15:48:23.688801 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://f8d46afefccd1cc408a266166601816a4ee5a3355e3992c3936cd8a9ae1e06fd" gracePeriod=600 Mar 18 15:48:24 crc kubenswrapper[4939]: I0318 15:48:24.074939 4939 generic.go:334] "Generic 
(PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="f8d46afefccd1cc408a266166601816a4ee5a3355e3992c3936cd8a9ae1e06fd" exitCode=0 Mar 18 15:48:24 crc kubenswrapper[4939]: I0318 15:48:24.074987 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"f8d46afefccd1cc408a266166601816a4ee5a3355e3992c3936cd8a9ae1e06fd"} Mar 18 15:48:24 crc kubenswrapper[4939]: I0318 15:48:24.075031 4939 scope.go:117] "RemoveContainer" containerID="25c45d0482bcfb57b4acc9de3abe36c8d204cdc1c2752c823face6bf93abc88b" Mar 18 15:48:25 crc kubenswrapper[4939]: I0318 15:48:25.085189 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"72b32c6e644eabdabe4212d9986a992938a1f92c3ed813591886f43325ae5cf0"} Mar 18 15:48:37 crc kubenswrapper[4939]: I0318 15:48:37.450243 4939 scope.go:117] "RemoveContainer" containerID="9a2ab29fc786c69f3ca74afa9bf789ad1ca8a360eaed26608a83aab7091a3faf" Mar 18 15:48:37 crc kubenswrapper[4939]: I0318 15:48:37.499610 4939 scope.go:117] "RemoveContainer" containerID="0cc2ec072a65103eaf5e64c00a21cb6506b62b3c10a0d3c877ca1d99851042fa" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.149332 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564150-7xdb7"] Mar 18 15:50:00 crc kubenswrapper[4939]: E0318 15:50:00.150270 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49714dd1-eccd-480b-836b-29c1e2c6eb83" containerName="oc" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.150288 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="49714dd1-eccd-480b-836b-29c1e2c6eb83" containerName="oc" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.150465 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="49714dd1-eccd-480b-836b-29c1e2c6eb83" containerName="oc" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.150999 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-7xdb7"] Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.151161 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-7xdb7" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.155805 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.156304 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.163957 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.230061 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwv9f\" (UniqueName: \"kubernetes.io/projected/d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b-kube-api-access-mwv9f\") pod \"auto-csr-approver-29564150-7xdb7\" (UID: \"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b\") " pod="openshift-infra/auto-csr-approver-29564150-7xdb7" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.331222 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwv9f\" (UniqueName: \"kubernetes.io/projected/d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b-kube-api-access-mwv9f\") pod \"auto-csr-approver-29564150-7xdb7\" (UID: \"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b\") " pod="openshift-infra/auto-csr-approver-29564150-7xdb7" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.365569 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwv9f\" (UniqueName: \"kubernetes.io/projected/d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b-kube-api-access-mwv9f\") pod \"auto-csr-approver-29564150-7xdb7\" (UID: \"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b\") " pod="openshift-infra/auto-csr-approver-29564150-7xdb7" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.471142 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-7xdb7" Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.664579 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-7xdb7"] Mar 18 15:50:00 crc kubenswrapper[4939]: I0318 15:50:00.692412 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-7xdb7" event={"ID":"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b","Type":"ContainerStarted","Data":"d1b4f7056c4a8e7bd9752449fbaad4aa55165e740ef243f8943388c7990957f2"} Mar 18 15:50:02 crc kubenswrapper[4939]: I0318 15:50:02.705740 4939 generic.go:334] "Generic (PLEG): container finished" podID="d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b" containerID="8e6393f06ecf9a139081fb39cd6def0d532087077dae2ddd46636cfd01d83183" exitCode=0 Mar 18 15:50:02 crc kubenswrapper[4939]: I0318 15:50:02.705810 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-7xdb7" event={"ID":"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b","Type":"ContainerDied","Data":"8e6393f06ecf9a139081fb39cd6def0d532087077dae2ddd46636cfd01d83183"} Mar 18 15:50:04 crc kubenswrapper[4939]: I0318 15:50:04.013548 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-7xdb7" Mar 18 15:50:04 crc kubenswrapper[4939]: I0318 15:50:04.079659 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwv9f\" (UniqueName: \"kubernetes.io/projected/d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b-kube-api-access-mwv9f\") pod \"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b\" (UID: \"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b\") " Mar 18 15:50:04 crc kubenswrapper[4939]: I0318 15:50:04.085694 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b-kube-api-access-mwv9f" (OuterVolumeSpecName: "kube-api-access-mwv9f") pod "d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b" (UID: "d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b"). InnerVolumeSpecName "kube-api-access-mwv9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:04 crc kubenswrapper[4939]: I0318 15:50:04.180782 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwv9f\" (UniqueName: \"kubernetes.io/projected/d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b-kube-api-access-mwv9f\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:04 crc kubenswrapper[4939]: I0318 15:50:04.721214 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-7xdb7" event={"ID":"d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b","Type":"ContainerDied","Data":"d1b4f7056c4a8e7bd9752449fbaad4aa55165e740ef243f8943388c7990957f2"} Mar 18 15:50:04 crc kubenswrapper[4939]: I0318 15:50:04.721257 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1b4f7056c4a8e7bd9752449fbaad4aa55165e740ef243f8943388c7990957f2" Mar 18 15:50:04 crc kubenswrapper[4939]: I0318 15:50:04.721311 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-7xdb7" Mar 18 15:50:05 crc kubenswrapper[4939]: I0318 15:50:05.071067 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-c79jl"] Mar 18 15:50:05 crc kubenswrapper[4939]: I0318 15:50:05.074644 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-c79jl"] Mar 18 15:50:06 crc kubenswrapper[4939]: I0318 15:50:06.139285 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f333da50-173a-479e-a341-32342a2c1673" path="/var/lib/kubelet/pods/f333da50-173a-479e-a341-32342a2c1673/volumes" Mar 18 15:50:37 crc kubenswrapper[4939]: I0318 15:50:37.573575 4939 scope.go:117] "RemoveContainer" containerID="9a2b8a76b324f748f6ac3af22b41359c9ce113f99008a17fd5d22177d43f596d" Mar 18 15:50:53 crc kubenswrapper[4939]: I0318 15:50:53.687347 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:50:53 crc kubenswrapper[4939]: I0318 15:50:53.687821 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:51:03 crc kubenswrapper[4939]: I0318 15:51:03.806243 4939 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 15:51:23 crc kubenswrapper[4939]: I0318 15:51:23.687725 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:51:23 crc kubenswrapper[4939]: I0318 15:51:23.688334 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:51:53 crc kubenswrapper[4939]: I0318 15:51:53.687807 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:51:53 crc kubenswrapper[4939]: I0318 15:51:53.688259 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:51:53 crc kubenswrapper[4939]: I0318 15:51:53.688301 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:51:53 crc kubenswrapper[4939]: I0318 
Mar 18 15:51:53 crc kubenswrapper[4939]: I0318 15:51:53.688884 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://72b32c6e644eabdabe4212d9986a992938a1f92c3ed813591886f43325ae5cf0" gracePeriod=600
Mar 18 15:51:54 crc kubenswrapper[4939]: I0318 15:51:54.143870 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="72b32c6e644eabdabe4212d9986a992938a1f92c3ed813591886f43325ae5cf0" exitCode=0
Mar 18 15:51:54 crc kubenswrapper[4939]: I0318 15:51:54.143917 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"72b32c6e644eabdabe4212d9986a992938a1f92c3ed813591886f43325ae5cf0"}
Mar 18 15:51:54 crc kubenswrapper[4939]: I0318 15:51:54.144303 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"3e0c9746e62b2cdabcce5b37ee8d4d9dea82e474357d37ce8b290ec31b2fa0e2"}
Mar 18 15:51:54 crc kubenswrapper[4939]: I0318 15:51:54.144335 4939 scope.go:117] "RemoveContainer" containerID="f8d46afefccd1cc408a266166601816a4ee5a3355e3992c3936cd8a9ae1e06fd"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.147668 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564152-7cfwq"]
Mar 18 15:52:00 crc kubenswrapper[4939]: E0318 15:52:00.148407 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b" containerName="oc"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.148421 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b" containerName="oc"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.148552 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b" containerName="oc"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.148950 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-7cfwq"]
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.148983 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-7cfwq"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.150957 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.155031 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.162681 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.260109 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9bt\" (UniqueName: \"kubernetes.io/projected/bd11a960-b0cf-4dee-b4bd-46ef351172d1-kube-api-access-hq9bt\") pod \"auto-csr-approver-29564152-7cfwq\" (UID: \"bd11a960-b0cf-4dee-b4bd-46ef351172d1\") " pod="openshift-infra/auto-csr-approver-29564152-7cfwq"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.361403 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9bt\" (UniqueName: \"kubernetes.io/projected/bd11a960-b0cf-4dee-b4bd-46ef351172d1-kube-api-access-hq9bt\") pod \"auto-csr-approver-29564152-7cfwq\" (UID: \"bd11a960-b0cf-4dee-b4bd-46ef351172d1\") " pod="openshift-infra/auto-csr-approver-29564152-7cfwq"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.384370 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9bt\" (UniqueName: \"kubernetes.io/projected/bd11a960-b0cf-4dee-b4bd-46ef351172d1-kube-api-access-hq9bt\") pod \"auto-csr-approver-29564152-7cfwq\" (UID: \"bd11a960-b0cf-4dee-b4bd-46ef351172d1\") " pod="openshift-infra/auto-csr-approver-29564152-7cfwq"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.464625 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-7cfwq"
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.717155 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-7cfwq"]
Mar 18 15:52:00 crc kubenswrapper[4939]: I0318 15:52:00.726856 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 15:52:01 crc kubenswrapper[4939]: I0318 15:52:01.194226 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-7cfwq" event={"ID":"bd11a960-b0cf-4dee-b4bd-46ef351172d1","Type":"ContainerStarted","Data":"bde0e33b7c6f4b3d5aa0c1b9c185893fee9cbff3c47ceb7d75b9ec9bf03e3c57"}
Mar 18 15:52:02 crc kubenswrapper[4939]: I0318 15:52:02.201485 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-7cfwq" event={"ID":"bd11a960-b0cf-4dee-b4bd-46ef351172d1","Type":"ContainerStarted","Data":"dfcc60cf5eb04f85f77cdd1a79a89dc2f8ceac0b67bb2ded8f01075774c5ff17"}
Mar 18 15:52:02 crc kubenswrapper[4939]: I0318 15:52:02.222607 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564152-7cfwq" podStartSLOduration=1.164898776 podStartE2EDuration="2.222590231s" podCreationTimestamp="2026-03-18 15:52:00 +0000 UTC" firstStartedPulling="2026-03-18 15:52:00.726452835 +0000 UTC m=+885.325640496" lastFinishedPulling="2026-03-18 15:52:01.78414432 +0000 UTC m=+886.383331951" observedRunningTime="2026-03-18 15:52:02.21670058 +0000 UTC m=+886.815888211" watchObservedRunningTime="2026-03-18 15:52:02.222590231 +0000 UTC m=+886.821777852"
Mar 18 15:52:03 crc kubenswrapper[4939]: I0318 15:52:03.210485 4939 generic.go:334] "Generic (PLEG): container finished" podID="bd11a960-b0cf-4dee-b4bd-46ef351172d1" containerID="dfcc60cf5eb04f85f77cdd1a79a89dc2f8ceac0b67bb2ded8f01075774c5ff17" exitCode=0
Mar 18 15:52:03 crc kubenswrapper[4939]: I0318 15:52:03.210728 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-7cfwq" event={"ID":"bd11a960-b0cf-4dee-b4bd-46ef351172d1","Type":"ContainerDied","Data":"dfcc60cf5eb04f85f77cdd1a79a89dc2f8ceac0b67bb2ded8f01075774c5ff17"}
Mar 18 15:52:04 crc kubenswrapper[4939]: I0318 15:52:04.463014 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-7cfwq"
Mar 18 15:52:04 crc kubenswrapper[4939]: I0318 15:52:04.616077 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq9bt\" (UniqueName: \"kubernetes.io/projected/bd11a960-b0cf-4dee-b4bd-46ef351172d1-kube-api-access-hq9bt\") pod \"bd11a960-b0cf-4dee-b4bd-46ef351172d1\" (UID: \"bd11a960-b0cf-4dee-b4bd-46ef351172d1\") "
Mar 18 15:52:04 crc kubenswrapper[4939]: I0318 15:52:04.622097 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd11a960-b0cf-4dee-b4bd-46ef351172d1-kube-api-access-hq9bt" (OuterVolumeSpecName: "kube-api-access-hq9bt") pod "bd11a960-b0cf-4dee-b4bd-46ef351172d1" (UID: "bd11a960-b0cf-4dee-b4bd-46ef351172d1"). InnerVolumeSpecName "kube-api-access-hq9bt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:52:04 crc kubenswrapper[4939]: I0318 15:52:04.729746 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq9bt\" (UniqueName: \"kubernetes.io/projected/bd11a960-b0cf-4dee-b4bd-46ef351172d1-kube-api-access-hq9bt\") on node \"crc\" DevicePath \"\""
Mar 18 15:52:05 crc kubenswrapper[4939]: I0318 15:52:05.232011 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-7cfwq" event={"ID":"bd11a960-b0cf-4dee-b4bd-46ef351172d1","Type":"ContainerDied","Data":"bde0e33b7c6f4b3d5aa0c1b9c185893fee9cbff3c47ceb7d75b9ec9bf03e3c57"}
Mar 18 15:52:05 crc kubenswrapper[4939]: I0318 15:52:05.232347 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde0e33b7c6f4b3d5aa0c1b9c185893fee9cbff3c47ceb7d75b9ec9bf03e3c57"
Mar 18 15:52:05 crc kubenswrapper[4939]: I0318 15:52:05.232075 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-7cfwq"
Mar 18 15:52:05 crc kubenswrapper[4939]: I0318 15:52:05.270668 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-9mxtc"]
Mar 18 15:52:05 crc kubenswrapper[4939]: I0318 15:52:05.273492 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-9mxtc"]
Mar 18 15:52:06 crc kubenswrapper[4939]: I0318 15:52:06.140723 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e4874d-a1a5-4433-aea0-782d689ce0f1" path="/var/lib/kubelet/pods/f2e4874d-a1a5-4433-aea0-782d689ce0f1/volumes"
Mar 18 15:52:37 crc kubenswrapper[4939]: I0318 15:52:37.639144 4939 scope.go:117] "RemoveContainer" containerID="4421ac7535cc87a2236360b8433184f5ebb164b2c5dde8fd8ef44e910b5c4f86"
Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.193069 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-xgz5n"]
Mar 18 15:53:24 crc kubenswrapper[4939]: E0318 15:53:24.195691 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd11a960-b0cf-4dee-b4bd-46ef351172d1" containerName="oc"
Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.195921 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd11a960-b0cf-4dee-b4bd-46ef351172d1" containerName="oc"
Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.196339 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd11a960-b0cf-4dee-b4bd-46ef351172d1" containerName="oc"
Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.197259 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xgz5n"
Need to start a new one" pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.200799 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.201412 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.201493 4939 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-44r9v" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.202374 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.220229 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xgz5n"] Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.281132 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4af2ff35-a774-4708-b902-00fc5dad9c9e-crc-storage\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.281211 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4af2ff35-a774-4708-b902-00fc5dad9c9e-node-mnt\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.281258 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gfvt\" (UniqueName: \"kubernetes.io/projected/4af2ff35-a774-4708-b902-00fc5dad9c9e-kube-api-access-8gfvt\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.499467 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4af2ff35-a774-4708-b902-00fc5dad9c9e-crc-storage\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.499574 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4af2ff35-a774-4708-b902-00fc5dad9c9e-node-mnt\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.499629 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gfvt\" (UniqueName: \"kubernetes.io/projected/4af2ff35-a774-4708-b902-00fc5dad9c9e-kube-api-access-8gfvt\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.499939 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4af2ff35-a774-4708-b902-00fc5dad9c9e-node-mnt\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " 
pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.501035 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4af2ff35-a774-4708-b902-00fc5dad9c9e-crc-storage\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.528160 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gfvt\" (UniqueName: \"kubernetes.io/projected/4af2ff35-a774-4708-b902-00fc5dad9c9e-kube-api-access-8gfvt\") pod \"crc-storage-crc-xgz5n\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.533364 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:24 crc kubenswrapper[4939]: I0318 15:53:24.770938 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xgz5n"] Mar 18 15:53:25 crc kubenswrapper[4939]: I0318 15:53:25.775345 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xgz5n" event={"ID":"4af2ff35-a774-4708-b902-00fc5dad9c9e","Type":"ContainerStarted","Data":"1628304936cf66f10fd73e9a769dd9549bb0fc4bd2161e9597abfcdaf66fdb7d"} Mar 18 15:53:26 crc kubenswrapper[4939]: I0318 15:53:26.783167 4939 generic.go:334] "Generic (PLEG): container finished" podID="4af2ff35-a774-4708-b902-00fc5dad9c9e" containerID="1bf5f1e0a8e3cdd552733f46ed0538c55bd5977361fa8954d8df52e1800b7f9b" exitCode=0 Mar 18 15:53:26 crc kubenswrapper[4939]: I0318 15:53:26.783468 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xgz5n" event={"ID":"4af2ff35-a774-4708-b902-00fc5dad9c9e","Type":"ContainerDied","Data":"1bf5f1e0a8e3cdd552733f46ed0538c55bd5977361fa8954d8df52e1800b7f9b"} Mar 18 15:53:27 crc kubenswrapper[4939]: I0318 15:53:27.978604 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.146004 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4af2ff35-a774-4708-b902-00fc5dad9c9e-crc-storage\") pod \"4af2ff35-a774-4708-b902-00fc5dad9c9e\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.146394 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gfvt\" (UniqueName: \"kubernetes.io/projected/4af2ff35-a774-4708-b902-00fc5dad9c9e-kube-api-access-8gfvt\") pod \"4af2ff35-a774-4708-b902-00fc5dad9c9e\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.146442 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4af2ff35-a774-4708-b902-00fc5dad9c9e-node-mnt\") pod \"4af2ff35-a774-4708-b902-00fc5dad9c9e\" (UID: \"4af2ff35-a774-4708-b902-00fc5dad9c9e\") " Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.146643 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4af2ff35-a774-4708-b902-00fc5dad9c9e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "4af2ff35-a774-4708-b902-00fc5dad9c9e" (UID: "4af2ff35-a774-4708-b902-00fc5dad9c9e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.152702 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af2ff35-a774-4708-b902-00fc5dad9c9e-kube-api-access-8gfvt" (OuterVolumeSpecName: "kube-api-access-8gfvt") pod "4af2ff35-a774-4708-b902-00fc5dad9c9e" (UID: "4af2ff35-a774-4708-b902-00fc5dad9c9e"). InnerVolumeSpecName "kube-api-access-8gfvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.168084 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4af2ff35-a774-4708-b902-00fc5dad9c9e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "4af2ff35-a774-4708-b902-00fc5dad9c9e" (UID: "4af2ff35-a774-4708-b902-00fc5dad9c9e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.247339 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gfvt\" (UniqueName: \"kubernetes.io/projected/4af2ff35-a774-4708-b902-00fc5dad9c9e-kube-api-access-8gfvt\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.247397 4939 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4af2ff35-a774-4708-b902-00fc5dad9c9e-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.247412 4939 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4af2ff35-a774-4708-b902-00fc5dad9c9e-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.793833 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xgz5n" event={"ID":"4af2ff35-a774-4708-b902-00fc5dad9c9e","Type":"ContainerDied","Data":"1628304936cf66f10fd73e9a769dd9549bb0fc4bd2161e9597abfcdaf66fdb7d"} Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.793871 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1628304936cf66f10fd73e9a769dd9549bb0fc4bd2161e9597abfcdaf66fdb7d" Mar 18 15:53:28 crc kubenswrapper[4939]: I0318 15:53:28.793939 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xgz5n" Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.931959 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l79pv"] Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.932405 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-controller" containerID="cri-o://ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" gracePeriod=30 Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.932476 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="nbdb" containerID="cri-o://dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" gracePeriod=30 Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.932544 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="sbdb" containerID="cri-o://785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" gracePeriod=30 Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.932681 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-node" containerID="cri-o://4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" gracePeriod=30 Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.932733 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" 
Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.932677 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-acl-logging" containerID="cri-o://ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" gracePeriod=30
Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.932732 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="northd" containerID="cri-o://bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" gracePeriod=30
Mar 18 15:53:31 crc kubenswrapper[4939]: I0318 15:53:31.985778 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller" containerID="cri-o://fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" gracePeriod=30
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.301119 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/3.log"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.304181 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovn-acl-logging/0.log"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.304701 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovn-controller/0.log"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.305146 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.358808 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ds2wh"]
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359055 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="nbdb"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359070 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="nbdb"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359083 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359092 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359104 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-acl-logging"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359112 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-acl-logging"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359127 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359136 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359150 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af2ff35-a774-4708-b902-00fc5dad9c9e" containerName="storage"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359157 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af2ff35-a774-4708-b902-00fc5dad9c9e" containerName="storage"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359171 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359179 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359190 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kubecfg-setup"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359197 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kubecfg-setup"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359208 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359216 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359226 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="sbdb"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359234 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="sbdb"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359244 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359252 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359263 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359271 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359283 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="northd"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359291 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="northd"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359302 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-node"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359310 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-node"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359424 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359437 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359448 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-acl-logging"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359461 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359472 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovn-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359485 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359495 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="kube-rbac-proxy-node"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359524 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af2ff35-a774-4708-b902-00fc5dad9c9e" containerName="storage"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359537 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="sbdb"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359548 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="nbdb"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359559 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="northd"
Mar 18 15:53:32 crc kubenswrapper[4939]: E0318 15:53:32.359692 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359703 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.359807 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.360019 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" containerName="ovnkube-controller"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.361767 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh"
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.502780 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-kubelet\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") "
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.502891 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-config\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") "
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.502922 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.502943 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-bin\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") "
Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503030 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "host-cni-bin".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503043 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-openvswitch\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503086 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-script-lib\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503120 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-netns\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503144 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-node-log\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503168 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-var-lib-openvswitch\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503173 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503215 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503226 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-node-log" (OuterVolumeSpecName: "node-log") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503194 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503245 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-ovn-kubernetes\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503265 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503274 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503303 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503373 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p47dg\" (UniqueName: \"kubernetes.io/projected/acafcc67-568f-415b-b907-c1de4c851fa7-kube-api-access-p47dg\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503419 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acafcc67-568f-415b-b907-c1de4c851fa7-ovn-node-metrics-cert\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503458 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-systemd-units\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503490 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-log-socket\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503565 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-env-overrides\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503617 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-netd\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503624 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503647 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-systemd\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503671 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503685 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503694 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-etc-openvswitch\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503732 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503740 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-slash\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503763 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-slash" (OuterVolumeSpecName: "host-slash") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503807 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-log-socket" (OuterVolumeSpecName: "log-socket") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503805 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-ovn\") pod \"acafcc67-568f-415b-b907-c1de4c851fa7\" (UID: \"acafcc67-568f-415b-b907-c1de4c851fa7\") " Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503840 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503881 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503910 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503940 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/197fd38a-5fe9-4cd8-989c-53c242101570-ovn-node-metrics-cert\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503969 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b6zm\" (UniqueName: \"kubernetes.io/projected/197fd38a-5fe9-4cd8-989c-53c242101570-kube-api-access-7b6zm\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.503994 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-ovn\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504016 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-node-log\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504036 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-ovnkube-script-lib\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504216 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-run-netns\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504253 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-systemd-units\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504273 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-ovnkube-config\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504303 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-cni-netd\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504339 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-systemd\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504350 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504366 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-run-ovn-kubernetes\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504399 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-cni-bin\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504571 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-slash\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504659 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-var-lib-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504711 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-kubelet\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504807 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-log-socket\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504842 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504866 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-env-overrides\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504903 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-etc-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504965 4939 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504985 4939 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.504998 4939 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505012 4939 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505110 4939 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505165 4939 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505191 4939 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505217 4939 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505245 4939 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505271 4939 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505296 4939 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505323 4939 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505356 4939 reconciler_common.go:293] "Volume detached for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505422 4939 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505454 4939 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505480 4939 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.505532 4939 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/acafcc67-568f-415b-b907-c1de4c851fa7-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.508912 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acafcc67-568f-415b-b907-c1de4c851fa7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.513106 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acafcc67-568f-415b-b907-c1de4c851fa7-kube-api-access-p47dg" (OuterVolumeSpecName: "kube-api-access-p47dg") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "kube-api-access-p47dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.516809 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "acafcc67-568f-415b-b907-c1de4c851fa7" (UID: "acafcc67-568f-415b-b907-c1de4c851fa7"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606137 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-slash\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606242 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-var-lib-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606288 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-kubelet\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606341 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-log-socket\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606340 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-slash\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606378 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606408 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-env-overrides\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606438 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-kubelet\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606447 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-etc-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc 
kubenswrapper[4939]: I0318 15:53:32.606465 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-var-lib-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606493 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606571 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-log-socket\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606609 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/197fd38a-5fe9-4cd8-989c-53c242101570-ovn-node-metrics-cert\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606486 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606659 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606668 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b6zm\" (UniqueName: \"kubernetes.io/projected/197fd38a-5fe9-4cd8-989c-53c242101570-kube-api-access-7b6zm\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606612 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-etc-openvswitch\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606721 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-ovn\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606764 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-node-log\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606809 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-ovnkube-script-lib\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606857 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-node-log\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606854 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-ovn\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606864 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-run-netns\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.606921 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-run-netns\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607027 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-systemd-units\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607078 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-ovnkube-config\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607134 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-cni-netd\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607214 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-systemd\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607276 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-cni-netd\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607296 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-run-systemd\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607306 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-run-ovn-kubernetes\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607360 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-run-ovn-kubernetes\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607393 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-cni-bin\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607480 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p47dg\" (UniqueName: \"kubernetes.io/projected/acafcc67-568f-415b-b907-c1de4c851fa7-kube-api-access-p47dg\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607547 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acafcc67-568f-415b-b907-c1de4c851fa7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607575 4939 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acafcc67-568f-415b-b907-c1de4c851fa7-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607549 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-env-overrides\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.607602 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-host-cni-bin\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.608349 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-ovnkube-script-lib\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.608444 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/197fd38a-5fe9-4cd8-989c-53c242101570-ovnkube-config\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.608491 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/197fd38a-5fe9-4cd8-989c-53c242101570-systemd-units\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.612681 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/197fd38a-5fe9-4cd8-989c-53c242101570-ovn-node-metrics-cert\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.625725 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b6zm\" (UniqueName: \"kubernetes.io/projected/197fd38a-5fe9-4cd8-989c-53c242101570-kube-api-access-7b6zm\") pod \"ovnkube-node-ds2wh\" (UID: \"197fd38a-5fe9-4cd8-989c-53c242101570\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.680305 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.823221 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovnkube-controller/3.log" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.830383 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovn-acl-logging/0.log" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.831216 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l79pv_acafcc67-568f-415b-b907-c1de4c851fa7/ovn-controller/0.log" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832059 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" exitCode=0 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832100 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" exitCode=0 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832116 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" exitCode=0 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832132 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" exitCode=0 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832146 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" exitCode=0 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832159 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" exitCode=0 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832172 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" exitCode=143 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832185 4939 generic.go:334] "Generic (PLEG): container finished" podID="acafcc67-568f-415b-b907-c1de4c851fa7" containerID="ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" exitCode=143 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832254 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832299 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832320 4939 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832341 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832359 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832395 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832411 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832424 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832436 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832447 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832460 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832470 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832482 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832494 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832547 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832571 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832585 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832596 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832607 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832618 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832642 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832653 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832664 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832675 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832686 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832701 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832709 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832768 4939 scope.go:117] "RemoveContainer" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.832717 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833043 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833075 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833087 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833098 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833109 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833121 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833132 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833142 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833154 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833186 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l79pv" event={"ID":"acafcc67-568f-415b-b907-c1de4c851fa7","Type":"ContainerDied","Data":"832f7543688668aa2a1e8e74acdb998d00385834fa22b85119476e6485cda5b9"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833221 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833234 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833245 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833256 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833265 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833274 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833283 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833293 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833302 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.833312 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.835163 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/2.log" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.836077 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/1.log" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.836153 4939 generic.go:334] "Generic (PLEG): container finished" podID="6693c593-9b18-435e-8a3a-91d3e33c3c51" containerID="8c43b687c4784f3290b1295a4c03e01915b1b5c21440cdee5e547995c7fc926a" exitCode=2 Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.836241 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerDied","Data":"8c43b687c4784f3290b1295a4c03e01915b1b5c21440cdee5e547995c7fc926a"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.836291 4939 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.839669 4939 scope.go:117] "RemoveContainer" containerID="8c43b687c4784f3290b1295a4c03e01915b1b5c21440cdee5e547995c7fc926a" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 
15:53:32.844437 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"a5ca6d00de19f2148dfb6d083bf8d50db7a150548e8403465dda627bcda93a68"} Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.903963 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.951987 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l79pv"] Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.955751 4939 scope.go:117] "RemoveContainer" containerID="785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.956799 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l79pv"] Mar 18 15:53:32 crc kubenswrapper[4939]: I0318 15:53:32.981292 4939 scope.go:117] "RemoveContainer" containerID="dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.004354 4939 scope.go:117] "RemoveContainer" containerID="bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.020379 4939 scope.go:117] "RemoveContainer" containerID="189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.035453 4939 scope.go:117] "RemoveContainer" containerID="4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.050448 4939 scope.go:117] "RemoveContainer" containerID="ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.064741 4939 scope.go:117] "RemoveContainer" containerID="ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.081100 4939 scope.go:117] "RemoveContainer" containerID="24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.129351 4939 scope.go:117] "RemoveContainer" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.130001 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": container with ID starting with fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a not found: ID does not exist" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.130054 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} err="failed to get container status \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": rpc error: code = NotFound desc = could not find container \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": container with ID starting with fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.130092 4939 scope.go:117] 
"RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.130649 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": container with ID starting with 6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412 not found: ID does not exist" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.130680 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} err="failed to get container status \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": rpc error: code = NotFound desc = could not find container \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": container with ID starting with 6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.130698 4939 scope.go:117] "RemoveContainer" containerID="785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.132881 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": container with ID starting with 785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03 not found: ID does not exist" containerID="785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.132916 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} err="failed to get container status \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": rpc error: code = NotFound desc = could not find container \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": container with ID starting with 785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.132934 4939 scope.go:117] "RemoveContainer" containerID="dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.133192 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": container with ID starting with dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4 not found: ID does not exist" containerID="dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.133223 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} err="failed to get container status \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": rpc error: code = NotFound desc = could not find container \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": container with ID starting with 
dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.133241 4939 scope.go:117] "RemoveContainer" containerID="bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.133622 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": container with ID starting with bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2 not found: ID does not exist" containerID="bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.133652 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} err="failed to get container status \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": rpc error: code = NotFound desc = could not find container \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": container with ID starting with bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.133670 4939 scope.go:117] "RemoveContainer" containerID="189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.134091 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": container with ID starting with 189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7 not found: ID does not exist" containerID="189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.134134 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} err="failed to get container status \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": rpc error: code = NotFound desc = could not find container \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": container with ID starting with 189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.134153 4939 scope.go:117] "RemoveContainer" containerID="4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.134591 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": container with ID starting with 4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a not found: ID does not exist" containerID="4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.134615 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} err="failed to get container status \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": rpc 
error: code = NotFound desc = could not find container \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": container with ID starting with 4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.134629 4939 scope.go:117] "RemoveContainer" containerID="ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.135075 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": container with ID starting with ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a not found: ID does not exist" containerID="ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.135151 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} err="failed to get container status \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": rpc error: code = NotFound desc = could not find container \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": container with ID starting with ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.135218 4939 scope.go:117] "RemoveContainer" containerID="ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.135741 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": container with ID starting with ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336 not found: ID does not exist" containerID="ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.135775 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} err="failed to get container status \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": rpc error: code = NotFound desc = could not find container \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": container with ID starting with ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.135795 4939 scope.go:117] "RemoveContainer" containerID="24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef" Mar 18 15:53:33 crc kubenswrapper[4939]: E0318 15:53:33.136265 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": container with ID starting with 24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef not found: ID does not exist" containerID="24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.136321 4939 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} err="failed to get container status \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": rpc error: code = NotFound desc = could not find container \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": container with ID starting with 24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.136353 4939 scope.go:117] "RemoveContainer" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.136714 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} err="failed to get container status \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": rpc error: code = NotFound desc = could not find container \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": container with ID starting with fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.136813 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.137162 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} err="failed to get container status \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": rpc error: code = NotFound desc = could not find container \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": container with ID starting with 6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.137216 4939 scope.go:117] "RemoveContainer" containerID="785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.137495 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} err="failed to get container status \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": rpc error: code = NotFound desc = could not find container \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": container with ID starting with 785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.137532 4939 scope.go:117] "RemoveContainer" containerID="dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.137756 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} err="failed to get container status \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": rpc error: code = NotFound desc = could not find container \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": container with ID starting with dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4 not found: ID does not exist" Mar 
18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.137788 4939 scope.go:117] "RemoveContainer" containerID="bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138006 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} err="failed to get container status \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": rpc error: code = NotFound desc = could not find container \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": container with ID starting with bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138034 4939 scope.go:117] "RemoveContainer" containerID="189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138256 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} err="failed to get container status \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": rpc error: code = NotFound desc = could not find container \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": container with ID starting with 189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138285 4939 scope.go:117] "RemoveContainer" containerID="4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138540 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} err="failed to get container status \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": rpc error: code = NotFound desc = could not find container \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": container with ID starting with 4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138570 4939 scope.go:117] "RemoveContainer" containerID="ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138788 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} err="failed to get container status \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": rpc error: code = NotFound desc = could not find container \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": container with ID starting with ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.138817 4939 scope.go:117] "RemoveContainer" containerID="ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139090 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} err="failed to get container status 
\"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": rpc error: code = NotFound desc = could not find container \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": container with ID starting with ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139120 4939 scope.go:117] "RemoveContainer" containerID="24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139339 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} err="failed to get container status \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": rpc error: code = NotFound desc = could not find container \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": container with ID starting with 24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139366 4939 scope.go:117] "RemoveContainer" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139602 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} err="failed to get container status \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": rpc error: code = NotFound desc = could not find container \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": container with ID starting with fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139624 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139796 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} err="failed to get container status \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": rpc error: code = NotFound desc = could not find container \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": container with ID starting with 6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.139814 4939 scope.go:117] "RemoveContainer" containerID="785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140029 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} err="failed to get container status \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": rpc error: code = NotFound desc = could not find container \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": container with ID starting with 785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140061 4939 scope.go:117] "RemoveContainer" 
containerID="dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140313 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} err="failed to get container status \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": rpc error: code = NotFound desc = could not find container \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": container with ID starting with dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140335 4939 scope.go:117] "RemoveContainer" containerID="bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140598 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} err="failed to get container status \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": rpc error: code = NotFound desc = could not find container \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": container with ID starting with bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140615 4939 scope.go:117] "RemoveContainer" containerID="189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140919 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} err="failed to get container status \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": rpc error: code = NotFound desc = could not find container \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": container with ID starting with 189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.140952 4939 scope.go:117] "RemoveContainer" containerID="4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.141232 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} err="failed to get container status \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": rpc error: code = NotFound desc = could not find container \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": container with ID starting with 4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.141264 4939 scope.go:117] "RemoveContainer" containerID="ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.141553 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} err="failed to get container status \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": rpc error: code = NotFound desc = could not find 
container \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": container with ID starting with ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.141588 4939 scope.go:117] "RemoveContainer" containerID="ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.141836 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} err="failed to get container status \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": rpc error: code = NotFound desc = could not find container \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": container with ID starting with ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.141859 4939 scope.go:117] "RemoveContainer" containerID="24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.142106 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} err="failed to get container status \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": rpc error: code = NotFound desc = could not find container \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": container with ID starting with 24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.142135 4939 scope.go:117] "RemoveContainer" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.142422 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} err="failed to get container status \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": rpc error: code = NotFound desc = could not find container \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": container with ID starting with fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.142455 4939 scope.go:117] "RemoveContainer" containerID="6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.142880 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412"} err="failed to get container status \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": rpc error: code = NotFound desc = could not find container \"6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412\": container with ID starting with 6d16be56b0c7b1980f01e490ef677f639e6ff19899a7d332de5b1bd27b0a8412 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.142968 4939 scope.go:117] "RemoveContainer" containerID="785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.143427 4939 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03"} err="failed to get container status \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": rpc error: code = NotFound desc = could not find container \"785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03\": container with ID starting with 785f9abec82049b5998be1d205c3840f59fe6ec66ae76c849f6048081c4e7a03 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.143462 4939 scope.go:117] "RemoveContainer" containerID="dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.143791 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4"} err="failed to get container status \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": rpc error: code = NotFound desc = could not find container \"dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4\": container with ID starting with dab2391f66534836283d072cc67a5fc997d090761f90aae2116160af35fd19e4 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.143844 4939 scope.go:117] "RemoveContainer" containerID="bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.144200 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2"} err="failed to get container status \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": rpc error: code = NotFound desc = could not find container \"bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2\": container with ID starting with bce3afcd194361b59132bd5dc927a95aa905259d8be4a01200948e71335dd1e2 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.144233 4939 scope.go:117] "RemoveContainer" containerID="189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.144543 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7"} err="failed to get container status \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": rpc error: code = NotFound desc = could not find container \"189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7\": container with ID starting with 189afeeb6101ac9e98cadf8b55a96227e00e30ab9ed6230288ee6fc20e7299a7 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.144567 4939 scope.go:117] "RemoveContainer" containerID="4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.144809 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a"} err="failed to get container status \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": rpc error: code = NotFound desc = could not find container \"4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a\": container with ID starting with 
4438a6bbcac694751ca06b2f9287641ad058e2702af90fd294159f6a99f1738a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.144848 4939 scope.go:117] "RemoveContainer" containerID="ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.145185 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a"} err="failed to get container status \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": rpc error: code = NotFound desc = could not find container \"ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a\": container with ID starting with ef2f0533801363aee2a6e85300bad5fd9de5d2035ebb6b403242f140fdbd7b2a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.145221 4939 scope.go:117] "RemoveContainer" containerID="ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.145570 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336"} err="failed to get container status \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": rpc error: code = NotFound desc = could not find container \"ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336\": container with ID starting with ac47646684320f6cad08e129d9810b5222a8c3d708cbf0e1296dd1643e66d336 not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.145612 4939 scope.go:117] "RemoveContainer" containerID="24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.145913 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef"} err="failed to get container status \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": rpc error: code = NotFound desc = could not find container \"24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef\": container with ID starting with 24cd17deaadb2433cf790a5f5f5505eaeb25d5348f0ace5f6d703b163f4a2bef not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.145949 4939 scope.go:117] "RemoveContainer" containerID="fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.146240 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a"} err="failed to get container status \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": rpc error: code = NotFound desc = could not find container \"fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a\": container with ID starting with fa7cfca710a635259df1b2aa1cc2a9dc204a55cace60db1532372d89eae1405a not found: ID does not exist" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.857333 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/2.log" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.858270 4939 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/1.log" Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.858422 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmzwg" event={"ID":"6693c593-9b18-435e-8a3a-91d3e33c3c51","Type":"ContainerStarted","Data":"2a457aa5748aacaa1484cba24cd0559c3549c09c80528db924803bce1f5432c3"} Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.862301 4939 generic.go:334] "Generic (PLEG): container finished" podID="197fd38a-5fe9-4cd8-989c-53c242101570" containerID="d7a2791b4c28bdf92d8070807669f19117d2c4b96b82f25f9e996a61c67e6747" exitCode=0 Mar 18 15:53:33 crc kubenswrapper[4939]: I0318 15:53:33.862396 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerDied","Data":"d7a2791b4c28bdf92d8070807669f19117d2c4b96b82f25f9e996a61c67e6747"} Mar 18 15:53:34 crc kubenswrapper[4939]: I0318 15:53:34.140739 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acafcc67-568f-415b-b907-c1de4c851fa7" path="/var/lib/kubelet/pods/acafcc67-568f-415b-b907-c1de4c851fa7/volumes" Mar 18 15:53:34 crc kubenswrapper[4939]: I0318 15:53:34.874330 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"3b8dd8c95950ea56a58802c5a9a83286e77094174d2a5e46b3b5f17cb9c87b89"} Mar 18 15:53:34 crc kubenswrapper[4939]: I0318 15:53:34.874628 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"b16581d5c56270cfce7a5671c0a7c1123cecddf4acb3bc85f250462261d36f03"} Mar 18 15:53:34 crc kubenswrapper[4939]: I0318 15:53:34.874639 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"8adab8c3dc7e0ca61defa1c9c5f5b08cff4da2f7a2f3f9d28f632c7503d8c647"} Mar 18 15:53:34 crc kubenswrapper[4939]: I0318 15:53:34.874649 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"45be446df7a2bdaf271962e897abcbdcb64e8cd24d2964ac2b163e4578983a44"} Mar 18 15:53:34 crc kubenswrapper[4939]: I0318 15:53:34.874662 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"3a6e0838e7ea1d414343604705310003afcc4493707c7e277d4c2f1a93359af1"} Mar 18 15:53:34 crc kubenswrapper[4939]: I0318 15:53:34.874672 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"8bdbdf30d4f52965e081a10d6901396c41c758fb14f0fc773f1a33a9f68c2711"} Mar 18 15:53:35 crc kubenswrapper[4939]: I0318 15:53:35.835590 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn"] Mar 18 15:53:35 crc kubenswrapper[4939]: I0318 15:53:35.837560 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:35 crc kubenswrapper[4939]: I0318 15:53:35.839344 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:53:35 crc kubenswrapper[4939]: I0318 15:53:35.947034 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:35 crc kubenswrapper[4939]: I0318 15:53:35.947305 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgf2\" (UniqueName: \"kubernetes.io/projected/fed45129-6967-491d-9458-9480359e655d-kube-api-access-xsgf2\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:35 crc kubenswrapper[4939]: I0318 15:53:35.947385 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: I0318 15:53:36.048402 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: I0318 15:53:36.048637 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgf2\" (UniqueName: \"kubernetes.io/projected/fed45129-6967-491d-9458-9480359e655d-kube-api-access-xsgf2\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: I0318 15:53:36.048767 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: I0318 15:53:36.049470 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: I0318 15:53:36.049591 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: I0318 15:53:36.075899 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsgf2\" (UniqueName: \"kubernetes.io/projected/fed45129-6967-491d-9458-9480359e655d-kube-api-access-xsgf2\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: I0318 15:53:36.151015 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: E0318 15:53:36.187896 4939 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(97c6377ec14c55abc0711c30612615a437d98f53e18a60214dfb6b3f3ac22a08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:53:36 crc kubenswrapper[4939]: E0318 15:53:36.188427 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(97c6377ec14c55abc0711c30612615a437d98f53e18a60214dfb6b3f3ac22a08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: E0318 15:53:36.188479 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(97c6377ec14c55abc0711c30612615a437d98f53e18a60214dfb6b3f3ac22a08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:36 crc kubenswrapper[4939]: E0318 15:53:36.188719 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace(fed45129-6967-491d-9458-9480359e655d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace(fed45129-6967-491d-9458-9480359e655d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(97c6377ec14c55abc0711c30612615a437d98f53e18a60214dfb6b3f3ac22a08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" podUID="fed45129-6967-491d-9458-9480359e655d" Mar 18 15:53:37 crc kubenswrapper[4939]: I0318 15:53:37.714912 4939 scope.go:117] "RemoveContainer" containerID="afe692d71a6976c377b7b13878502a87ffe8ce46dbb5846de3ceac3abc705b2e" Mar 18 15:53:37 crc kubenswrapper[4939]: I0318 15:53:37.895687 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmzwg_6693c593-9b18-435e-8a3a-91d3e33c3c51/kube-multus/2.log" Mar 18 15:53:37 crc kubenswrapper[4939]: I0318 15:53:37.901246 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"4eb9747c8d185f0db261452c5eac15eb1b03a9cecb67369ac2e6356f5f10c0a2"} Mar 18 15:53:37 crc kubenswrapper[4939]: I0318 15:53:37.937227 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzq4c"] Mar 18 15:53:37 crc kubenswrapper[4939]: I0318 15:53:37.938747 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.075972 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9td\" (UniqueName: \"kubernetes.io/projected/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-kube-api-access-5w9td\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.076108 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-utilities\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.076330 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-catalog-content\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.177854 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-catalog-content\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.177933 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9td\" (UniqueName: \"kubernetes.io/projected/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-kube-api-access-5w9td\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.177975 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-utilities\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.178388 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-utilities\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.178676 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-catalog-content\") pod \"redhat-operators-bzq4c\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.203449 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9td\" (UniqueName: \"kubernetes.io/projected/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-kube-api-access-5w9td\") pod \"redhat-operators-bzq4c\" (UID: 
\"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: I0318 15:53:38.267448 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: E0318 15:53:38.294314 4939 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(860bbf22c5f72e3e46c219cf4fc82a631c9b71fd5761c89e42fee1273df94ea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:53:38 crc kubenswrapper[4939]: E0318 15:53:38.294382 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(860bbf22c5f72e3e46c219cf4fc82a631c9b71fd5761c89e42fee1273df94ea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: E0318 15:53:38.294403 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(860bbf22c5f72e3e46c219cf4fc82a631c9b71fd5761c89e42fee1273df94ea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:38 crc kubenswrapper[4939]: E0318 15:53:38.294458 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-bzq4c_openshift-marketplace(306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-operators-bzq4c_openshift-marketplace(306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(860bbf22c5f72e3e46c219cf4fc82a631c9b71fd5761c89e42fee1273df94ea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/redhat-operators-bzq4c" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" Mar 18 15:53:39 crc kubenswrapper[4939]: I0318 15:53:39.921316 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" event={"ID":"197fd38a-5fe9-4cd8-989c-53c242101570","Type":"ContainerStarted","Data":"2c2238398b5ae4525514c5f8f372630beaa5e46c50f7e6ee3904608211f40a64"} Mar 18 15:53:39 crc kubenswrapper[4939]: I0318 15:53:39.922079 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:39 crc kubenswrapper[4939]: I0318 15:53:39.922144 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:39 crc kubenswrapper[4939]: I0318 15:53:39.922163 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:39 crc kubenswrapper[4939]: I0318 15:53:39.967331 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:39 crc kubenswrapper[4939]: I0318 15:53:39.969231 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:53:39 crc kubenswrapper[4939]: I0318 15:53:39.979867 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" podStartSLOduration=7.979840523 podStartE2EDuration="7.979840523s" podCreationTimestamp="2026-03-18 15:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:53:39.975065275 +0000 UTC m=+984.574252936" watchObservedRunningTime="2026-03-18 15:53:39.979840523 +0000 UTC m=+984.579028184" Mar 18 15:53:40 crc kubenswrapper[4939]: I0318 15:53:40.783545 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzq4c"] Mar 18 15:53:40 crc kubenswrapper[4939]: I0318 15:53:40.783884 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:40 crc kubenswrapper[4939]: I0318 15:53:40.784304 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:40 crc kubenswrapper[4939]: I0318 15:53:40.794971 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn"] Mar 18 15:53:40 crc kubenswrapper[4939]: I0318 15:53:40.795163 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:40 crc kubenswrapper[4939]: I0318 15:53:40.796034 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.846227 4939 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(60ea1dd36978cfb5b3285f26655429f7ad1c6ed8a27d3358a256da1e3ba29466): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.846672 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(60ea1dd36978cfb5b3285f26655429f7ad1c6ed8a27d3358a256da1e3ba29466): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.846703 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(60ea1dd36978cfb5b3285f26655429f7ad1c6ed8a27d3358a256da1e3ba29466): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.846775 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-bzq4c_openshift-marketplace(306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-operators-bzq4c_openshift-marketplace(306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-bzq4c_openshift-marketplace_306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb_0(60ea1dd36978cfb5b3285f26655429f7ad1c6ed8a27d3358a256da1e3ba29466): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/redhat-operators-bzq4c" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.874024 4939 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(0c95d63d708937b6a2ff9f6e7aad070ace1fc315023cd9599d63a44e7b8b5f4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.874087 4939 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(0c95d63d708937b6a2ff9f6e7aad070ace1fc315023cd9599d63a44e7b8b5f4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.874107 4939 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(0c95d63d708937b6a2ff9f6e7aad070ace1fc315023cd9599d63a44e7b8b5f4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:40 crc kubenswrapper[4939]: E0318 15:53:40.874153 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace(fed45129-6967-491d-9458-9480359e655d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace(fed45129-6967-491d-9458-9480359e655d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_openshift-marketplace_fed45129-6967-491d-9458-9480359e655d_0(0c95d63d708937b6a2ff9f6e7aad070ace1fc315023cd9599d63a44e7b8b5f4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" podUID="fed45129-6967-491d-9458-9480359e655d" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.752938 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q6kcd"] Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.755675 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.762270 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q6kcd"] Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.810154 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-catalog-content\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.810240 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-utilities\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.810356 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrhxr\" (UniqueName: \"kubernetes.io/projected/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-kube-api-access-jrhxr\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.911458 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-utilities\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.911872 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrhxr\" (UniqueName: \"kubernetes.io/projected/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-kube-api-access-jrhxr\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.912025 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-catalog-content\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.912683 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-catalog-content\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.912905 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-utilities\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:46 crc kubenswrapper[4939]: I0318 15:53:46.937120 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jrhxr\" (UniqueName: \"kubernetes.io/projected/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-kube-api-access-jrhxr\") pod \"certified-operators-q6kcd\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:47 crc kubenswrapper[4939]: I0318 15:53:47.093458 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:47 crc kubenswrapper[4939]: I0318 15:53:47.317472 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q6kcd"] Mar 18 15:53:47 crc kubenswrapper[4939]: I0318 15:53:47.969599 4939 generic.go:334] "Generic (PLEG): container finished" podID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerID="bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799" exitCode=0 Mar 18 15:53:47 crc kubenswrapper[4939]: I0318 15:53:47.969644 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6kcd" event={"ID":"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6","Type":"ContainerDied","Data":"bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799"} Mar 18 15:53:47 crc kubenswrapper[4939]: I0318 15:53:47.969674 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6kcd" event={"ID":"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6","Type":"ContainerStarted","Data":"10b800ac9d246de2d131614c35ed2e57450cca639603d0bbcabb303fa1a2bc3d"} Mar 18 15:53:48 crc kubenswrapper[4939]: I0318 15:53:48.979021 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6kcd" event={"ID":"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6","Type":"ContainerStarted","Data":"2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831"} Mar 18 15:53:49 crc kubenswrapper[4939]: I0318 15:53:49.989703 4939 generic.go:334] "Generic (PLEG): container finished" podID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerID="2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831" exitCode=0 Mar 18 15:53:49 crc kubenswrapper[4939]: I0318 15:53:49.989779 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6kcd" event={"ID":"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6","Type":"ContainerDied","Data":"2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831"} Mar 18 15:53:51 crc kubenswrapper[4939]: I0318 15:53:50.999595 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6kcd" event={"ID":"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6","Type":"ContainerStarted","Data":"b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa"} Mar 18 15:53:51 crc kubenswrapper[4939]: I0318 15:53:51.029433 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q6kcd" podStartSLOduration=2.54655516 podStartE2EDuration="5.029418088s" podCreationTimestamp="2026-03-18 15:53:46 +0000 UTC" firstStartedPulling="2026-03-18 15:53:47.971776415 +0000 UTC m=+992.570964076" lastFinishedPulling="2026-03-18 15:53:50.454639353 +0000 UTC m=+995.053827004" observedRunningTime="2026-03-18 15:53:51.026418661 +0000 UTC m=+995.625606292" watchObservedRunningTime="2026-03-18 15:53:51.029418088 +0000 UTC m=+995.628605699" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.137422 4939 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ff5wj"] Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.139365 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.149119 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ff5wj"] Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.298786 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsfjx\" (UniqueName: \"kubernetes.io/projected/64b02721-4602-47b1-876d-68d8a2ff1209-kube-api-access-jsfjx\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.298868 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-utilities\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.298948 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-catalog-content\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.400497 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsfjx\" (UniqueName: \"kubernetes.io/projected/64b02721-4602-47b1-876d-68d8a2ff1209-kube-api-access-jsfjx\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.400695 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-utilities\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.400778 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-catalog-content\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.401240 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-utilities\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.401858 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-catalog-content\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") 
" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.429391 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsfjx\" (UniqueName: \"kubernetes.io/projected/64b02721-4602-47b1-876d-68d8a2ff1209-kube-api-access-jsfjx\") pod \"redhat-marketplace-ff5wj\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.464549 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:53:53 crc kubenswrapper[4939]: I0318 15:53:53.701885 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ff5wj"] Mar 18 15:53:53 crc kubenswrapper[4939]: W0318 15:53:53.709238 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64b02721_4602_47b1_876d_68d8a2ff1209.slice/crio-bb64618953f767917ca80c3d0e77dd24f92e339d76bfe0bcb5e36e1dc3fe46d0 WatchSource:0}: Error finding container bb64618953f767917ca80c3d0e77dd24f92e339d76bfe0bcb5e36e1dc3fe46d0: Status 404 returned error can't find the container with id bb64618953f767917ca80c3d0e77dd24f92e339d76bfe0bcb5e36e1dc3fe46d0 Mar 18 15:53:54 crc kubenswrapper[4939]: I0318 15:53:54.019095 4939 generic.go:334] "Generic (PLEG): container finished" podID="64b02721-4602-47b1-876d-68d8a2ff1209" containerID="7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943" exitCode=0 Mar 18 15:53:54 crc kubenswrapper[4939]: I0318 15:53:54.019141 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ff5wj" event={"ID":"64b02721-4602-47b1-876d-68d8a2ff1209","Type":"ContainerDied","Data":"7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943"} Mar 18 15:53:54 crc kubenswrapper[4939]: I0318 15:53:54.019167 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ff5wj" event={"ID":"64b02721-4602-47b1-876d-68d8a2ff1209","Type":"ContainerStarted","Data":"bb64618953f767917ca80c3d0e77dd24f92e339d76bfe0bcb5e36e1dc3fe46d0"} Mar 18 15:53:55 crc kubenswrapper[4939]: I0318 15:53:55.038592 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ff5wj" event={"ID":"64b02721-4602-47b1-876d-68d8a2ff1209","Type":"ContainerStarted","Data":"67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae"} Mar 18 15:53:55 crc kubenswrapper[4939]: I0318 15:53:55.132223 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:55 crc kubenswrapper[4939]: I0318 15:53:55.132878 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:53:55 crc kubenswrapper[4939]: I0318 15:53:55.381189 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzq4c"] Mar 18 15:53:55 crc kubenswrapper[4939]: W0318 15:53:55.388035 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod306de6d4_1c3f_4f09_8a9c_8d38e5ce6cdb.slice/crio-b69ec966ede8f804ae44d0de82f53d501f0034e8ba560d1d6c90be61ba164df5 WatchSource:0}: Error finding container b69ec966ede8f804ae44d0de82f53d501f0034e8ba560d1d6c90be61ba164df5: Status 404 returned error can't find the container with id b69ec966ede8f804ae44d0de82f53d501f0034e8ba560d1d6c90be61ba164df5 Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.047369 4939 generic.go:334] "Generic (PLEG): container finished" podID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerID="3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3" exitCode=0 Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.048152 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzq4c" event={"ID":"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb","Type":"ContainerDied","Data":"3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3"} Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.048247 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzq4c" event={"ID":"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb","Type":"ContainerStarted","Data":"b69ec966ede8f804ae44d0de82f53d501f0034e8ba560d1d6c90be61ba164df5"} Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.056012 4939 generic.go:334] "Generic (PLEG): container finished" podID="64b02721-4602-47b1-876d-68d8a2ff1209" containerID="67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae" exitCode=0 Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.056101 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ff5wj" event={"ID":"64b02721-4602-47b1-876d-68d8a2ff1209","Type":"ContainerDied","Data":"67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae"} Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.133103 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.137529 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:53:56 crc kubenswrapper[4939]: I0318 15:53:56.360460 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn"] Mar 18 15:53:56 crc kubenswrapper[4939]: W0318 15:53:56.368582 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed45129_6967_491d_9458_9480359e655d.slice/crio-2e6574c271c3f5c8287a4b087f823b7410172febc776bbda48d40decd496ec8a WatchSource:0}: Error finding container 2e6574c271c3f5c8287a4b087f823b7410172febc776bbda48d40decd496ec8a: Status 404 returned error can't find the container with id 2e6574c271c3f5c8287a4b087f823b7410172febc776bbda48d40decd496ec8a Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.062229 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ff5wj" event={"ID":"64b02721-4602-47b1-876d-68d8a2ff1209","Type":"ContainerStarted","Data":"ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb"} Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.064562 4939 generic.go:334] "Generic (PLEG): container finished" podID="fed45129-6967-491d-9458-9480359e655d" containerID="facfb7f1c2d10085b3116cce7c0794ea8bd169db14663fc9b80e16be09e44a09" exitCode=0 Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.064598 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" event={"ID":"fed45129-6967-491d-9458-9480359e655d","Type":"ContainerDied","Data":"facfb7f1c2d10085b3116cce7c0794ea8bd169db14663fc9b80e16be09e44a09"} Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.064618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" event={"ID":"fed45129-6967-491d-9458-9480359e655d","Type":"ContainerStarted","Data":"2e6574c271c3f5c8287a4b087f823b7410172febc776bbda48d40decd496ec8a"} Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.087318 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ff5wj" podStartSLOduration=1.427929509 podStartE2EDuration="4.087297087s" podCreationTimestamp="2026-03-18 15:53:53 +0000 UTC" firstStartedPulling="2026-03-18 15:53:54.020773617 +0000 UTC m=+998.619961238" lastFinishedPulling="2026-03-18 15:53:56.680141195 +0000 UTC m=+1001.279328816" observedRunningTime="2026-03-18 15:53:57.077638458 +0000 UTC m=+1001.676826099" watchObservedRunningTime="2026-03-18 15:53:57.087297087 +0000 UTC m=+1001.686484708" Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.094281 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.094380 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:57 crc kubenswrapper[4939]: I0318 15:53:57.147534 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:58 crc kubenswrapper[4939]: I0318 15:53:58.072162 4939 generic.go:334] "Generic (PLEG): container finished" podID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" 
containerID="46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e" exitCode=0 Mar 18 15:53:58 crc kubenswrapper[4939]: I0318 15:53:58.072273 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzq4c" event={"ID":"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb","Type":"ContainerDied","Data":"46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e"} Mar 18 15:53:58 crc kubenswrapper[4939]: I0318 15:53:58.117275 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:53:59 crc kubenswrapper[4939]: I0318 15:53:59.082340 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzq4c" event={"ID":"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb","Type":"ContainerStarted","Data":"9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1"} Mar 18 15:53:59 crc kubenswrapper[4939]: I0318 15:53:59.101226 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzq4c" podStartSLOduration=19.589032264 podStartE2EDuration="22.101210718s" podCreationTimestamp="2026-03-18 15:53:37 +0000 UTC" firstStartedPulling="2026-03-18 15:53:56.049568139 +0000 UTC m=+1000.648755760" lastFinishedPulling="2026-03-18 15:53:58.561746603 +0000 UTC m=+1003.160934214" observedRunningTime="2026-03-18 15:53:59.097921793 +0000 UTC m=+1003.697109424" watchObservedRunningTime="2026-03-18 15:53:59.101210718 +0000 UTC m=+1003.700398339" Mar 18 15:53:59 crc kubenswrapper[4939]: I0318 15:53:59.722425 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q6kcd"] Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.091131 4939 generic.go:334] "Generic (PLEG): container finished" podID="fed45129-6967-491d-9458-9480359e655d" containerID="c3d03072ae8932ac67730ed6bca7118265ec600cc8286e82f936fc95038e285a" exitCode=0 Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.091265 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" event={"ID":"fed45129-6967-491d-9458-9480359e655d","Type":"ContainerDied","Data":"c3d03072ae8932ac67730ed6bca7118265ec600cc8286e82f936fc95038e285a"} Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.161259 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564154-7qb2m"] Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.162759 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.166175 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.167888 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.171890 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-7qb2m"] Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.172189 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.296805 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n798r\" (UniqueName: \"kubernetes.io/projected/fe0153aa-53a2-47f6-9aa8-91d1dde946ec-kube-api-access-n798r\") pod \"auto-csr-approver-29564154-7qb2m\" (UID: \"fe0153aa-53a2-47f6-9aa8-91d1dde946ec\") " pod="openshift-infra/auto-csr-approver-29564154-7qb2m" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.397691 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n798r\" (UniqueName: \"kubernetes.io/projected/fe0153aa-53a2-47f6-9aa8-91d1dde946ec-kube-api-access-n798r\") pod \"auto-csr-approver-29564154-7qb2m\" (UID: \"fe0153aa-53a2-47f6-9aa8-91d1dde946ec\") " pod="openshift-infra/auto-csr-approver-29564154-7qb2m" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.415287 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n798r\" (UniqueName: \"kubernetes.io/projected/fe0153aa-53a2-47f6-9aa8-91d1dde946ec-kube-api-access-n798r\") pod \"auto-csr-approver-29564154-7qb2m\" (UID: \"fe0153aa-53a2-47f6-9aa8-91d1dde946ec\") " pod="openshift-infra/auto-csr-approver-29564154-7qb2m" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.510542 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" Mar 18 15:54:00 crc kubenswrapper[4939]: I0318 15:54:00.688349 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-7qb2m"] Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.099382 4939 generic.go:334] "Generic (PLEG): container finished" podID="fed45129-6967-491d-9458-9480359e655d" containerID="61bb434bc0a05bf861eed89c2d8ec93b1e3af944ab4fff16935ee61defe461ea" exitCode=0 Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.099420 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" event={"ID":"fed45129-6967-491d-9458-9480359e655d","Type":"ContainerDied","Data":"61bb434bc0a05bf861eed89c2d8ec93b1e3af944ab4fff16935ee61defe461ea"} Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.100486 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" event={"ID":"fe0153aa-53a2-47f6-9aa8-91d1dde946ec","Type":"ContainerStarted","Data":"7465171069aa12dcf83ff1e7bb1bb002913e42659d94919be12f0e9f33a3517e"} Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.100680 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q6kcd" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="registry-server" containerID="cri-o://b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa" gracePeriod=2 Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.477185 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.611846 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-utilities\") pod \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.612147 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-catalog-content\") pod \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.612195 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrhxr\" (UniqueName: \"kubernetes.io/projected/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-kube-api-access-jrhxr\") pod \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\" (UID: \"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6\") " Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.612736 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-utilities" (OuterVolumeSpecName: "utilities") pod "593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" (UID: "593d40be-7c2a-4c06-8dbf-8d2c78bf66b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.622581 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-kube-api-access-jrhxr" (OuterVolumeSpecName: "kube-api-access-jrhxr") pod "593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" (UID: "593d40be-7c2a-4c06-8dbf-8d2c78bf66b6"). InnerVolumeSpecName "kube-api-access-jrhxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.660171 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" (UID: "593d40be-7c2a-4c06-8dbf-8d2c78bf66b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.713175 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.713209 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:01 crc kubenswrapper[4939]: I0318 15:54:01.713221 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrhxr\" (UniqueName: \"kubernetes.io/projected/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6-kube-api-access-jrhxr\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.108070 4939 generic.go:334] "Generic (PLEG): container finished" podID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerID="b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa" exitCode=0 Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.108116 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q6kcd" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.108151 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6kcd" event={"ID":"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6","Type":"ContainerDied","Data":"b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa"} Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.110051 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6kcd" event={"ID":"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6","Type":"ContainerDied","Data":"10b800ac9d246de2d131614c35ed2e57450cca639603d0bbcabb303fa1a2bc3d"} Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.110071 4939 scope.go:117] "RemoveContainer" containerID="b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.116194 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" event={"ID":"fe0153aa-53a2-47f6-9aa8-91d1dde946ec","Type":"ContainerStarted","Data":"bf9f56148ff4648ecf2f36b938a50e358d0a841cfa292cf1d512d131dcd1e6e7"} Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.133111 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" podStartSLOduration=0.975543456 podStartE2EDuration="2.133097287s" podCreationTimestamp="2026-03-18 15:54:00 +0000 UTC" firstStartedPulling="2026-03-18 15:54:00.693875789 +0000 UTC m=+1005.293063430" lastFinishedPulling="2026-03-18 15:54:01.85142964 +0000 UTC m=+1006.450617261" observedRunningTime="2026-03-18 15:54:02.129927356 +0000 UTC m=+1006.729114977" watchObservedRunningTime="2026-03-18 15:54:02.133097287 +0000 UTC m=+1006.732284908" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.148087 4939 scope.go:117] "RemoveContainer" containerID="2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.158103 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q6kcd"] Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.158134 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q6kcd"] Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.172168 4939 scope.go:117] "RemoveContainer" containerID="bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.199318 4939 scope.go:117] "RemoveContainer" containerID="b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa" Mar 18 15:54:02 crc kubenswrapper[4939]: E0318 15:54:02.199882 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa\": container with ID starting with b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa not found: ID does not exist" containerID="b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.199915 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa"} err="failed to get container status \"b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa\": rpc 
error: code = NotFound desc = could not find container \"b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa\": container with ID starting with b92e80fd79038cc750eed2ec605c66ee8bc4a25446696bbd28dede75eccf4bfa not found: ID does not exist" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.199940 4939 scope.go:117] "RemoveContainer" containerID="2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831" Mar 18 15:54:02 crc kubenswrapper[4939]: E0318 15:54:02.200268 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831\": container with ID starting with 2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831 not found: ID does not exist" containerID="2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.200299 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831"} err="failed to get container status \"2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831\": rpc error: code = NotFound desc = could not find container \"2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831\": container with ID starting with 2fc51eceec49e6396239b32a4e3ba7e7301335d5834cb6ee1be4b1f4f9756831 not found: ID does not exist" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.200315 4939 scope.go:117] "RemoveContainer" containerID="bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799" Mar 18 15:54:02 crc kubenswrapper[4939]: E0318 15:54:02.200514 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799\": container with ID starting with bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799 not found: ID does not exist" containerID="bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.200535 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799"} err="failed to get container status \"bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799\": rpc error: code = NotFound desc = could not find container \"bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799\": container with ID starting with bd6301d562124fd07b94d35d5696a3b0c368a5f592369b82747d2c2f7e741799 not found: ID does not exist" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.332200 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.522055 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-util\") pod \"fed45129-6967-491d-9458-9480359e655d\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.522479 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsgf2\" (UniqueName: \"kubernetes.io/projected/fed45129-6967-491d-9458-9480359e655d-kube-api-access-xsgf2\") pod \"fed45129-6967-491d-9458-9480359e655d\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.522542 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-bundle\") pod \"fed45129-6967-491d-9458-9480359e655d\" (UID: \"fed45129-6967-491d-9458-9480359e655d\") " Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.523094 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-bundle" (OuterVolumeSpecName: "bundle") pod "fed45129-6967-491d-9458-9480359e655d" (UID: "fed45129-6967-491d-9458-9480359e655d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.523361 4939 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.527189 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed45129-6967-491d-9458-9480359e655d-kube-api-access-xsgf2" (OuterVolumeSpecName: "kube-api-access-xsgf2") pod "fed45129-6967-491d-9458-9480359e655d" (UID: "fed45129-6967-491d-9458-9480359e655d"). InnerVolumeSpecName "kube-api-access-xsgf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.532120 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-util" (OuterVolumeSpecName: "util") pod "fed45129-6967-491d-9458-9480359e655d" (UID: "fed45129-6967-491d-9458-9480359e655d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.624279 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsgf2\" (UniqueName: \"kubernetes.io/projected/fed45129-6967-491d-9458-9480359e655d-kube-api-access-xsgf2\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.624337 4939 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed45129-6967-491d-9458-9480359e655d-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:02 crc kubenswrapper[4939]: I0318 15:54:02.711270 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ds2wh" Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.126146 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.126182 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn" event={"ID":"fed45129-6967-491d-9458-9480359e655d","Type":"ContainerDied","Data":"2e6574c271c3f5c8287a4b087f823b7410172febc776bbda48d40decd496ec8a"} Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.126225 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e6574c271c3f5c8287a4b087f823b7410172febc776bbda48d40decd496ec8a" Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.128157 4939 generic.go:334] "Generic (PLEG): container finished" podID="fe0153aa-53a2-47f6-9aa8-91d1dde946ec" containerID="bf9f56148ff4648ecf2f36b938a50e358d0a841cfa292cf1d512d131dcd1e6e7" exitCode=0 Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.128178 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" event={"ID":"fe0153aa-53a2-47f6-9aa8-91d1dde946ec","Type":"ContainerDied","Data":"bf9f56148ff4648ecf2f36b938a50e358d0a841cfa292cf1d512d131dcd1e6e7"} Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.465375 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.465420 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:54:03 crc kubenswrapper[4939]: I0318 15:54:03.520027 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:54:04 crc kubenswrapper[4939]: I0318 15:54:04.146692 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" path="/var/lib/kubelet/pods/593d40be-7c2a-4c06-8dbf-8d2c78bf66b6/volumes" Mar 18 15:54:04 crc kubenswrapper[4939]: I0318 15:54:04.226004 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:54:04 crc kubenswrapper[4939]: I0318 15:54:04.400576 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" Mar 18 15:54:04 crc kubenswrapper[4939]: I0318 15:54:04.446717 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n798r\" (UniqueName: \"kubernetes.io/projected/fe0153aa-53a2-47f6-9aa8-91d1dde946ec-kube-api-access-n798r\") pod \"fe0153aa-53a2-47f6-9aa8-91d1dde946ec\" (UID: \"fe0153aa-53a2-47f6-9aa8-91d1dde946ec\") " Mar 18 15:54:04 crc kubenswrapper[4939]: I0318 15:54:04.454963 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0153aa-53a2-47f6-9aa8-91d1dde946ec-kube-api-access-n798r" (OuterVolumeSpecName: "kube-api-access-n798r") pod "fe0153aa-53a2-47f6-9aa8-91d1dde946ec" (UID: "fe0153aa-53a2-47f6-9aa8-91d1dde946ec"). InnerVolumeSpecName "kube-api-access-n798r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:04 crc kubenswrapper[4939]: I0318 15:54:04.547454 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n798r\" (UniqueName: \"kubernetes.io/projected/fe0153aa-53a2-47f6-9aa8-91d1dde946ec-kube-api-access-n798r\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:05 crc kubenswrapper[4939]: I0318 15:54:05.153821 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" event={"ID":"fe0153aa-53a2-47f6-9aa8-91d1dde946ec","Type":"ContainerDied","Data":"7465171069aa12dcf83ff1e7bb1bb002913e42659d94919be12f0e9f33a3517e"} Mar 18 15:54:05 crc kubenswrapper[4939]: I0318 15:54:05.153888 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7465171069aa12dcf83ff1e7bb1bb002913e42659d94919be12f0e9f33a3517e" Mar 18 15:54:05 crc kubenswrapper[4939]: I0318 15:54:05.153843 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-7qb2m" Mar 18 15:54:05 crc kubenswrapper[4939]: I0318 15:54:05.212055 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-w7npf"] Mar 18 15:54:05 crc kubenswrapper[4939]: I0318 15:54:05.219231 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-w7npf"] Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.131955 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ff5wj"] Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.143568 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49714dd1-eccd-480b-836b-29c1e2c6eb83" path="/var/lib/kubelet/pods/49714dd1-eccd-480b-836b-29c1e2c6eb83/volumes" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.161497 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ff5wj" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="registry-server" containerID="cri-o://ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb" gracePeriod=2 Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.601592 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-24wzk"] Mar 18 15:54:06 crc kubenswrapper[4939]: E0318 15:54:06.601978 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed45129-6967-491d-9458-9480359e655d" containerName="pull" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.601989 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed45129-6967-491d-9458-9480359e655d" containerName="pull" Mar 18 15:54:06 crc kubenswrapper[4939]: E0318 15:54:06.601997 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed45129-6967-491d-9458-9480359e655d" containerName="extract" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602003 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed45129-6967-491d-9458-9480359e655d" containerName="extract" Mar 18 15:54:06 crc kubenswrapper[4939]: E0318 15:54:06.602010 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="registry-server" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602016 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="registry-server" Mar 18 
15:54:06 crc kubenswrapper[4939]: E0318 15:54:06.602030 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="extract-utilities" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602036 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="extract-utilities" Mar 18 15:54:06 crc kubenswrapper[4939]: E0318 15:54:06.602043 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="extract-content" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602049 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="extract-content" Mar 18 15:54:06 crc kubenswrapper[4939]: E0318 15:54:06.602057 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed45129-6967-491d-9458-9480359e655d" containerName="util" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602063 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed45129-6967-491d-9458-9480359e655d" containerName="util" Mar 18 15:54:06 crc kubenswrapper[4939]: E0318 15:54:06.602073 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0153aa-53a2-47f6-9aa8-91d1dde946ec" containerName="oc" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602079 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0153aa-53a2-47f6-9aa8-91d1dde946ec" containerName="oc" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602165 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="593d40be-7c2a-4c06-8dbf-8d2c78bf66b6" containerName="registry-server" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602173 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0153aa-53a2-47f6-9aa8-91d1dde946ec" containerName="oc" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602182 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed45129-6967-491d-9458-9480359e655d" containerName="extract" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.602552 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.604116 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.605150 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.605328 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-s9n9r" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.610892 4939 util.go:48] "No ready sandbox for pod can be found. 
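The paired cpu_manager/state_mem and memory_manager entries above show the resource managers reconciling their checkpointed state when a new pod (nmstate-operator-796d4cfff4-24wzk) is admitted: any CPU or memory assignment still keyed to a pod the kubelet no longer runs is dropped. A minimal sketch of that cleanup, with illustrative data structures rather than the real checkpoint format:

```go
// Illustrative sketch of the RemoveStaleState pass above: on pod admission,
// assignments referencing pods that are no longer active are deleted.
package main

import "fmt"

type key struct{ podUID, container string }

// cpuAssignments: checkpointed (pod, container) -> CPUSet, as strings here.
var cpuAssignments = map[key]string{
	{"fed45129-6967-491d-9458-9480359e655d", "pull"}:    "0-1",
	{"593d40be-7c2a-4c06-8dbf-8d2c78bf66b6", "extract"}: "2-3",
}

func removeStaleState(activePods map[string]bool) {
	for k := range cpuAssignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(cpuAssignments, k) // "Deleted CPUSet assignment"
		}
	}
}

func main() {
	// Only the newly admitted pod is active; both stale entries are dropped.
	removeStaleState(map[string]bool{"e9429399-6579-4e54-a804-31a2fda4e887": true})
}
```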
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.613285 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-24wzk"] Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.771920 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-catalog-content\") pod \"64b02721-4602-47b1-876d-68d8a2ff1209\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.772035 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsfjx\" (UniqueName: \"kubernetes.io/projected/64b02721-4602-47b1-876d-68d8a2ff1209-kube-api-access-jsfjx\") pod \"64b02721-4602-47b1-876d-68d8a2ff1209\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.772170 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-utilities\") pod \"64b02721-4602-47b1-876d-68d8a2ff1209\" (UID: \"64b02721-4602-47b1-876d-68d8a2ff1209\") " Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.772376 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7th8v\" (UniqueName: \"kubernetes.io/projected/e9429399-6579-4e54-a804-31a2fda4e887-kube-api-access-7th8v\") pod \"nmstate-operator-796d4cfff4-24wzk\" (UID: \"e9429399-6579-4e54-a804-31a2fda4e887\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.773138 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-utilities" (OuterVolumeSpecName: "utilities") pod "64b02721-4602-47b1-876d-68d8a2ff1209" (UID: "64b02721-4602-47b1-876d-68d8a2ff1209"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.777869 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b02721-4602-47b1-876d-68d8a2ff1209-kube-api-access-jsfjx" (OuterVolumeSpecName: "kube-api-access-jsfjx") pod "64b02721-4602-47b1-876d-68d8a2ff1209" (UID: "64b02721-4602-47b1-876d-68d8a2ff1209"). InnerVolumeSpecName "kube-api-access-jsfjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.799209 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64b02721-4602-47b1-876d-68d8a2ff1209" (UID: "64b02721-4602-47b1-876d-68d8a2ff1209"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.873374 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7th8v\" (UniqueName: \"kubernetes.io/projected/e9429399-6579-4e54-a804-31a2fda4e887-kube-api-access-7th8v\") pod \"nmstate-operator-796d4cfff4-24wzk\" (UID: \"e9429399-6579-4e54-a804-31a2fda4e887\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.873455 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.873466 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b02721-4602-47b1-876d-68d8a2ff1209-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.873478 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsfjx\" (UniqueName: \"kubernetes.io/projected/64b02721-4602-47b1-876d-68d8a2ff1209-kube-api-access-jsfjx\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.900624 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7th8v\" (UniqueName: \"kubernetes.io/projected/e9429399-6579-4e54-a804-31a2fda4e887-kube-api-access-7th8v\") pod \"nmstate-operator-796d4cfff4-24wzk\" (UID: \"e9429399-6579-4e54-a804-31a2fda4e887\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" Mar 18 15:54:06 crc kubenswrapper[4939]: I0318 15:54:06.921460 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.104078 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-24wzk"] Mar 18 15:54:07 crc kubenswrapper[4939]: W0318 15:54:07.109773 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9429399_6579_4e54_a804_31a2fda4e887.slice/crio-2dbd39132db5bbac84976d006aa842f075ead8366d80b28bb9fca8c4b8e65e94 WatchSource:0}: Error finding container 2dbd39132db5bbac84976d006aa842f075ead8366d80b28bb9fca8c4b8e65e94: Status 404 returned error can't find the container with id 2dbd39132db5bbac84976d006aa842f075ead8366d80b28bb9fca8c4b8e65e94 Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.168179 4939 generic.go:334] "Generic (PLEG): container finished" podID="64b02721-4602-47b1-876d-68d8a2ff1209" containerID="ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb" exitCode=0 Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.168228 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ff5wj" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.168261 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ff5wj" event={"ID":"64b02721-4602-47b1-876d-68d8a2ff1209","Type":"ContainerDied","Data":"ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb"} Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.168309 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ff5wj" event={"ID":"64b02721-4602-47b1-876d-68d8a2ff1209","Type":"ContainerDied","Data":"bb64618953f767917ca80c3d0e77dd24f92e339d76bfe0bcb5e36e1dc3fe46d0"} Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.168348 4939 scope.go:117] "RemoveContainer" containerID="ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.169481 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" event={"ID":"e9429399-6579-4e54-a804-31a2fda4e887","Type":"ContainerStarted","Data":"2dbd39132db5bbac84976d006aa842f075ead8366d80b28bb9fca8c4b8e65e94"} Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.193801 4939 scope.go:117] "RemoveContainer" containerID="67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.203666 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ff5wj"] Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.211871 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ff5wj"] Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.227395 4939 scope.go:117] "RemoveContainer" containerID="7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.241042 4939 scope.go:117] "RemoveContainer" containerID="ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb" Mar 18 15:54:07 crc kubenswrapper[4939]: E0318 15:54:07.241555 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb\": container with ID starting with ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb not found: ID does not exist" containerID="ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.241594 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb"} err="failed to get container status \"ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb\": rpc error: code = NotFound desc = could not find container \"ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb\": container with ID starting with ae30aca76178d77256bd77baae7ab9fa243fcfdd678b635a93c231043baf2fbb not found: ID does not exist" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.241624 4939 scope.go:117] "RemoveContainer" containerID="67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae" Mar 18 15:54:07 crc kubenswrapper[4939]: E0318 15:54:07.242099 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae\": container with ID starting with 67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae not found: ID does not exist" containerID="67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.242154 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae"} err="failed to get container status \"67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae\": rpc error: code = NotFound desc = could not find container \"67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae\": container with ID starting with 67e6900799e4be46db3287a72f6efc05a292849f9426fc86df627a03703690ae not found: ID does not exist" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.242189 4939 scope.go:117] "RemoveContainer" containerID="7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943" Mar 18 15:54:07 crc kubenswrapper[4939]: E0318 15:54:07.242600 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943\": container with ID starting with 7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943 not found: ID does not exist" containerID="7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943" Mar 18 15:54:07 crc kubenswrapper[4939]: I0318 15:54:07.242654 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943"} err="failed to get container status \"7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943\": rpc error: code = NotFound desc = could not find container \"7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943\": container with ID starting with 7e8b8a7291705ac2efe141d30c93e97a5d902d26cf8be27ae6ff79fffba04943 not found: ID does not exist" Mar 18 15:54:08 crc kubenswrapper[4939]: I0318 15:54:08.142687 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" path="/var/lib/kubelet/pods/64b02721-4602-47b1-876d-68d8a2ff1209/volumes" Mar 18 15:54:08 crc kubenswrapper[4939]: I0318 15:54:08.268335 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:54:08 crc kubenswrapper[4939]: I0318 15:54:08.268398 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:54:08 crc kubenswrapper[4939]: I0318 15:54:08.323850 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:54:09 crc kubenswrapper[4939]: I0318 15:54:09.243286 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:54:10 crc kubenswrapper[4939]: I0318 15:54:10.191718 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" event={"ID":"e9429399-6579-4e54-a804-31a2fda4e887","Type":"ContainerStarted","Data":"2992f229a94c9e7280f1c8f17e0152720f4222f346c5ee38581779b3d0bb3679"} Mar 18 15:54:10 crc kubenswrapper[4939]: I0318 15:54:10.213205 4939 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-24wzk" podStartSLOduration=2.244534992 podStartE2EDuration="4.213183096s" podCreationTimestamp="2026-03-18 15:54:06 +0000 UTC" firstStartedPulling="2026-03-18 15:54:07.11361032 +0000 UTC m=+1011.712797951" lastFinishedPulling="2026-03-18 15:54:09.082258434 +0000 UTC m=+1013.681446055" observedRunningTime="2026-03-18 15:54:10.209139079 +0000 UTC m=+1014.808326700" watchObservedRunningTime="2026-03-18 15:54:10.213183096 +0000 UTC m=+1014.812370737" Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.524268 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzq4c"] Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.524960 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzq4c" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="registry-server" containerID="cri-o://9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1" gracePeriod=2 Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.882196 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.944150 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-utilities\") pod \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.944367 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-catalog-content\") pod \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.945612 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w9td\" (UniqueName: \"kubernetes.io/projected/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-kube-api-access-5w9td\") pod \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\" (UID: \"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb\") " Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.945067 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-utilities" (OuterVolumeSpecName: "utilities") pod "306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" (UID: "306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:12 crc kubenswrapper[4939]: I0318 15:54:12.952656 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-kube-api-access-5w9td" (OuterVolumeSpecName: "kube-api-access-5w9td") pod "306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" (UID: "306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb"). InnerVolumeSpecName "kube-api-access-5w9td". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.048282 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.048318 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w9td\" (UniqueName: \"kubernetes.io/projected/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-kube-api-access-5w9td\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.109139 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" (UID: "306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.149016 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.212717 4939 generic.go:334] "Generic (PLEG): container finished" podID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerID="9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1" exitCode=0 Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.212782 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzq4c" event={"ID":"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb","Type":"ContainerDied","Data":"9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1"} Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.212839 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzq4c" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.212867 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzq4c" event={"ID":"306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb","Type":"ContainerDied","Data":"b69ec966ede8f804ae44d0de82f53d501f0034e8ba560d1d6c90be61ba164df5"} Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.212887 4939 scope.go:117] "RemoveContainer" containerID="9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.237479 4939 scope.go:117] "RemoveContainer" containerID="46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.268108 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzq4c"] Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.276335 4939 scope.go:117] "RemoveContainer" containerID="3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.278540 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzq4c"] Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.298293 4939 scope.go:117] "RemoveContainer" containerID="9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1" Mar 18 15:54:13 crc kubenswrapper[4939]: E0318 15:54:13.298948 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1\": container with ID starting with 9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1 not found: ID does not exist" containerID="9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.298985 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1"} err="failed to get container status \"9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1\": rpc error: code = NotFound desc = could not find container \"9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1\": container with ID starting with 9588453f1bf2a19e52589576a350cbeee816acd55711ce6c1e9ded32707996f1 not found: ID does not exist" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.299014 4939 scope.go:117] "RemoveContainer" containerID="46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e" Mar 18 15:54:13 crc kubenswrapper[4939]: E0318 15:54:13.299488 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e\": container with ID starting with 46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e not found: ID does not exist" containerID="46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.299571 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e"} err="failed to get container status \"46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e\": rpc error: code = NotFound desc = could not find container 
\"46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e\": container with ID starting with 46912f8b885e1bc7aa37e1880e03cee25ba4e40913af3fb7a10d6b6cf93a3e9e not found: ID does not exist" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.299608 4939 scope.go:117] "RemoveContainer" containerID="3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3" Mar 18 15:54:13 crc kubenswrapper[4939]: E0318 15:54:13.300185 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3\": container with ID starting with 3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3 not found: ID does not exist" containerID="3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3" Mar 18 15:54:13 crc kubenswrapper[4939]: I0318 15:54:13.300213 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3"} err="failed to get container status \"3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3\": rpc error: code = NotFound desc = could not find container \"3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3\": container with ID starting with 3af3bdb7510e83f7b52a68f13421a81b6a9513247574a5bd7e93b0f2c083b9f3 not found: ID does not exist" Mar 18 15:54:14 crc kubenswrapper[4939]: I0318 15:54:14.142938 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" path="/var/lib/kubelet/pods/306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb/volumes" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.740813 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rsrc5"] Mar 18 15:54:15 crc kubenswrapper[4939]: E0318 15:54:15.741128 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="extract-utilities" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741150 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="extract-utilities" Mar 18 15:54:15 crc kubenswrapper[4939]: E0318 15:54:15.741166 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="extract-content" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741178 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="extract-content" Mar 18 15:54:15 crc kubenswrapper[4939]: E0318 15:54:15.741200 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="registry-server" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741211 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="registry-server" Mar 18 15:54:15 crc kubenswrapper[4939]: E0318 15:54:15.741228 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="registry-server" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741239 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="registry-server" Mar 18 15:54:15 crc kubenswrapper[4939]: E0318 15:54:15.741256 4939 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="extract-utilities" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741269 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="extract-utilities" Mar 18 15:54:15 crc kubenswrapper[4939]: E0318 15:54:15.741283 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="extract-content" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741293 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="extract-content" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741439 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b02721-4602-47b1-876d-68d8a2ff1209" containerName="registry-server" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.741464 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="306de6d4-1c3f-4f09-8a9c-8d38e5ce6cdb" containerName="registry-server" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.742616 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.754342 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rsrc5"] Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.791901 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf4ll\" (UniqueName: \"kubernetes.io/projected/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-kube-api-access-lf4ll\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.792270 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-utilities\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.792315 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-catalog-content\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.893224 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf4ll\" (UniqueName: \"kubernetes.io/projected/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-kube-api-access-lf4ll\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.893279 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-utilities\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 
15:54:15.893309 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-catalog-content\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.893784 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-catalog-content\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.893997 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-utilities\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:15 crc kubenswrapper[4939]: I0318 15:54:15.914888 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf4ll\" (UniqueName: \"kubernetes.io/projected/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-kube-api-access-lf4ll\") pod \"community-operators-rsrc5\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.062785 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.159963 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.160776 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.164363 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-f2564" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.188130 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-k5fml"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.189796 4939 util.go:30] "No sandbox for pod can be found. 
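Two variants of the sandbox message recur throughout this log: util.go:30 logs "No sandbox for pod can be found" for a pod that has never had one (fresh ADDs like the nmstate pods here), while util.go:48 logs "No ready sandbox for pod can be found" when sandboxes exist but none is ready (pods being torn down or restarted). A sketch of that decision, inferred from the two messages:

```go
// Sketch of the decision behind the util.go:30 / util.go:48 messages above.
package main

import "fmt"

type sandbox struct {
	id    string
	ready bool
}

func needsNewSandbox(sandboxes []sandbox) (bool, string) {
	if len(sandboxes) == 0 {
		return true, "No sandbox for pod can be found. Need to start a new one"
	}
	for _, s := range sandboxes {
		if s.ready {
			return false, "reuse " + s.id
		}
	}
	return true, "No ready sandbox for pod can be found. Need to start a new one"
}

func main() {
	_, msg := needsNewSandbox(nil)
	fmt.Println(msg) // brand-new pod, e.g. nmstate-webhook-5f558f5558-k5fml
	_, msg = needsNewSandbox([]sandbox{{"2e6574c2", false}})
	fmt.Println(msg) // pod whose old sandbox has already exited
}
```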
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.200196 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.201282 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdjtv\" (UniqueName: \"kubernetes.io/projected/457e3dd2-84a3-47b0-a023-affddd9bd954-kube-api-access-qdjtv\") pod \"nmstate-metrics-9b8c8685d-q7vcf\" (UID: \"457e3dd2-84a3-47b0-a023-affddd9bd954\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.205444 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.222688 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-k5fml"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.233494 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7jt69"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.242571 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.305448 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c36ff55-800c-42e3-918e-f73cbd98e252-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-k5fml\" (UID: \"9c36ff55-800c-42e3-918e-f73cbd98e252\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.305495 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-nmstate-lock\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.305567 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdjtv\" (UniqueName: \"kubernetes.io/projected/457e3dd2-84a3-47b0-a023-affddd9bd954-kube-api-access-qdjtv\") pod \"nmstate-metrics-9b8c8685d-q7vcf\" (UID: \"457e3dd2-84a3-47b0-a023-affddd9bd954\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.305591 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snz4n\" (UniqueName: \"kubernetes.io/projected/9c36ff55-800c-42e3-918e-f73cbd98e252-kube-api-access-snz4n\") pod \"nmstate-webhook-5f558f5558-k5fml\" (UID: \"9c36ff55-800c-42e3-918e-f73cbd98e252\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.305629 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sntp5\" (UniqueName: \"kubernetes.io/projected/0f293932-41a4-4e41-9fcd-019c097522ff-kube-api-access-sntp5\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.305648 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-ovs-socket\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.305666 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-dbus-socket\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.342451 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdjtv\" (UniqueName: \"kubernetes.io/projected/457e3dd2-84a3-47b0-a023-affddd9bd954-kube-api-access-qdjtv\") pod \"nmstate-metrics-9b8c8685d-q7vcf\" (UID: \"457e3dd2-84a3-47b0-a023-affddd9bd954\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.344057 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.344689 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.348061 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-g6lbp" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.348259 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.348378 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.355386 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407172 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407238 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntp5\" (UniqueName: \"kubernetes.io/projected/0f293932-41a4-4e41-9fcd-019c097522ff-kube-api-access-sntp5\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407265 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-ovs-socket\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407293 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-dbus-socket\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407329 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407367 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c36ff55-800c-42e3-918e-f73cbd98e252-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-k5fml\" (UID: \"9c36ff55-800c-42e3-918e-f73cbd98e252\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407388 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-nmstate-lock\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407416 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqtmd\" (UniqueName: \"kubernetes.io/projected/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-kube-api-access-xqtmd\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407473 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snz4n\" (UniqueName: \"kubernetes.io/projected/9c36ff55-800c-42e3-918e-f73cbd98e252-kube-api-access-snz4n\") pod \"nmstate-webhook-5f558f5558-k5fml\" (UID: \"9c36ff55-800c-42e3-918e-f73cbd98e252\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:16 crc kubenswrapper[4939]: E0318 15:54:16.407899 4939 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 15:54:16 crc kubenswrapper[4939]: E0318 15:54:16.407953 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c36ff55-800c-42e3-918e-f73cbd98e252-tls-key-pair podName:9c36ff55-800c-42e3-918e-f73cbd98e252 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:16.907934308 +0000 UTC m=+1021.507121929 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/9c36ff55-800c-42e3-918e-f73cbd98e252-tls-key-pair") pod "nmstate-webhook-5f558f5558-k5fml" (UID: "9c36ff55-800c-42e3-918e-f73cbd98e252") : secret "openshift-nmstate-webhook" not found Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.407980 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-dbus-socket\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.408134 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-nmstate-lock\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.408191 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0f293932-41a4-4e41-9fcd-019c097522ff-ovs-socket\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.431055 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rsrc5"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.448276 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snz4n\" (UniqueName: \"kubernetes.io/projected/9c36ff55-800c-42e3-918e-f73cbd98e252-kube-api-access-snz4n\") pod \"nmstate-webhook-5f558f5558-k5fml\" (UID: \"9c36ff55-800c-42e3-918e-f73cbd98e252\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.448587 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntp5\" (UniqueName: \"kubernetes.io/projected/0f293932-41a4-4e41-9fcd-019c097522ff-kube-api-access-sntp5\") pod \"nmstate-handler-7jt69\" (UID: \"0f293932-41a4-4e41-9fcd-019c097522ff\") " pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.498775 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.508139 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqtmd\" (UniqueName: \"kubernetes.io/projected/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-kube-api-access-xqtmd\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.508213 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.508243 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: E0318 15:54:16.508348 4939 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 18 15:54:16 crc kubenswrapper[4939]: E0318 15:54:16.508394 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-plugin-serving-cert podName:d5a366f8-31e3-4afd-81c7-bb78a39c5ded nodeName:}" failed. No retries permitted until 2026-03-18 15:54:17.00837925 +0000 UTC m=+1021.607566871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-gxhtz" (UID: "d5a366f8-31e3-4afd-81c7-bb78a39c5ded") : secret "plugin-serving-cert" not found Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.509353 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.529432 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqtmd\" (UniqueName: \"kubernetes.io/projected/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-kube-api-access-xqtmd\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.558411 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c7484d85-nfrz6"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.559067 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.573900 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7jt69" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.574310 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7484d85-nfrz6"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.609232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-console-config\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.609267 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fdc601a-4073-42bd-a050-4e02e25dce10-console-serving-cert\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.609316 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-trusted-ca-bundle\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.609345 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-oauth-serving-cert\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.609364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fdc601a-4073-42bd-a050-4e02e25dce10-console-oauth-config\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.609381 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-service-ca\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.609403 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbs8p\" (UniqueName: \"kubernetes.io/projected/6fdc601a-4073-42bd-a050-4e02e25dce10-kube-api-access-fbs8p\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: W0318 15:54:16.644611 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f293932_41a4_4e41_9fcd_019c097522ff.slice/crio-0d4355caa630786ab353f3f8253992dce6de6eaa9ffad6f66dab3dcf0b8a8eb1 WatchSource:0}: Error finding container 0d4355caa630786ab353f3f8253992dce6de6eaa9ffad6f66dab3dcf0b8a8eb1: Status 404 returned 
error can't find the container with id 0d4355caa630786ab353f3f8253992dce6de6eaa9ffad6f66dab3dcf0b8a8eb1 Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.710739 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-console-config\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.711070 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fdc601a-4073-42bd-a050-4e02e25dce10-console-serving-cert\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.711099 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-trusted-ca-bundle\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.711295 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-oauth-serving-cert\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.711332 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fdc601a-4073-42bd-a050-4e02e25dce10-console-oauth-config\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.711349 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-service-ca\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.711370 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbs8p\" (UniqueName: \"kubernetes.io/projected/6fdc601a-4073-42bd-a050-4e02e25dce10-kube-api-access-fbs8p\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.711910 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-console-config\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.713763 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-service-ca\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " 
pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.713791 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-trusted-ca-bundle\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.715771 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fdc601a-4073-42bd-a050-4e02e25dce10-oauth-serving-cert\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.716591 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fdc601a-4073-42bd-a050-4e02e25dce10-console-oauth-config\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.717263 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fdc601a-4073-42bd-a050-4e02e25dce10-console-serving-cert\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.729175 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbs8p\" (UniqueName: \"kubernetes.io/projected/6fdc601a-4073-42bd-a050-4e02e25dce10-kube-api-access-fbs8p\") pod \"console-6c7484d85-nfrz6\" (UID: \"6fdc601a-4073-42bd-a050-4e02e25dce10\") " pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.733637 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf"] Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.881274 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7484d85-nfrz6" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.913249 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c36ff55-800c-42e3-918e-f73cbd98e252-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-k5fml\" (UID: \"9c36ff55-800c-42e3-918e-f73cbd98e252\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:16 crc kubenswrapper[4939]: I0318 15:54:16.917017 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9c36ff55-800c-42e3-918e-f73cbd98e252-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-k5fml\" (UID: \"9c36ff55-800c-42e3-918e-f73cbd98e252\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.014183 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.017942 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a366f8-31e3-4afd-81c7-bb78a39c5ded-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-gxhtz\" (UID: \"d5a366f8-31e3-4afd-81c7-bb78a39c5ded\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.065574 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7484d85-nfrz6"] Mar 18 15:54:17 crc kubenswrapper[4939]: W0318 15:54:17.073572 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fdc601a_4073_42bd_a050_4e02e25dce10.slice/crio-66cf9c9e8669cf95414bccd61d0d4a00053ea6a19a54569872055c46efc8c20b WatchSource:0}: Error finding container 66cf9c9e8669cf95414bccd61d0d4a00053ea6a19a54569872055c46efc8c20b: Status 404 returned error can't find the container with id 66cf9c9e8669cf95414bccd61d0d4a00053ea6a19a54569872055c46efc8c20b Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.141794 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.266954 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.291489 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7jt69" event={"ID":"0f293932-41a4-4e41-9fcd-019c097522ff","Type":"ContainerStarted","Data":"0d4355caa630786ab353f3f8253992dce6de6eaa9ffad6f66dab3dcf0b8a8eb1"} Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.293142 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7484d85-nfrz6" event={"ID":"6fdc601a-4073-42bd-a050-4e02e25dce10","Type":"ContainerStarted","Data":"ba9451ec7bc0f68222c7b72cc48b89c411858201cd811038eddb3bbd10dc4b24"} Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.293163 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7484d85-nfrz6" event={"ID":"6fdc601a-4073-42bd-a050-4e02e25dce10","Type":"ContainerStarted","Data":"66cf9c9e8669cf95414bccd61d0d4a00053ea6a19a54569872055c46efc8c20b"} Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.294475 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" event={"ID":"457e3dd2-84a3-47b0-a023-affddd9bd954","Type":"ContainerStarted","Data":"ac3ab8a1639d83d5a20773b4bf7ca657efde50864897ea2bf3683512026cb44a"} Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.297823 4939 generic.go:334] "Generic (PLEG): container finished" podID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerID="5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445" exitCode=0 Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.297863 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsrc5" event={"ID":"b967d15a-5688-4ce0-8d43-7e5d9c57be4c","Type":"ContainerDied","Data":"5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445"} Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.297886 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsrc5" event={"ID":"b967d15a-5688-4ce0-8d43-7e5d9c57be4c","Type":"ContainerStarted","Data":"5c5b817b7c672b0f766ec7354133de544c512f6186863cf347d9d1b2b1addd90"} Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.321567 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c7484d85-nfrz6" podStartSLOduration=1.321541232 podStartE2EDuration="1.321541232s" podCreationTimestamp="2026-03-18 15:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:54:17.315379934 +0000 UTC m=+1021.914567575" watchObservedRunningTime="2026-03-18 15:54:17.321541232 +0000 UTC m=+1021.920728883" Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.362522 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-k5fml"] Mar 18 15:54:17 crc kubenswrapper[4939]: I0318 15:54:17.472444 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz"] Mar 18 15:54:17 crc kubenswrapper[4939]: W0318 15:54:17.477380 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a366f8_31e3_4afd_81c7_bb78a39c5ded.slice/crio-4905ea30b461b120574ed35c131bbbc6bf67dba56799791b4752d21df1664204 WatchSource:0}: Error finding container 
Mar 18 15:54:18 crc kubenswrapper[4939]: I0318 15:54:18.305840 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" event={"ID":"9c36ff55-800c-42e3-918e-f73cbd98e252","Type":"ContainerStarted","Data":"f0669d5bad789b5009660b363d0325fed4e73ecab03675d2c6f98eb98c487e61"}
Mar 18 15:54:18 crc kubenswrapper[4939]: I0318 15:54:18.307669 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" event={"ID":"d5a366f8-31e3-4afd-81c7-bb78a39c5ded","Type":"ContainerStarted","Data":"4905ea30b461b120574ed35c131bbbc6bf67dba56799791b4752d21df1664204"}
Mar 18 15:54:18 crc kubenswrapper[4939]: I0318 15:54:18.310679 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsrc5" event={"ID":"b967d15a-5688-4ce0-8d43-7e5d9c57be4c","Type":"ContainerStarted","Data":"651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5"}
Mar 18 15:54:19 crc kubenswrapper[4939]: I0318 15:54:19.321306 4939 generic.go:334] "Generic (PLEG): container finished" podID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerID="651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5" exitCode=0
Mar 18 15:54:19 crc kubenswrapper[4939]: I0318 15:54:19.321384 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsrc5" event={"ID":"b967d15a-5688-4ce0-8d43-7e5d9c57be4c","Type":"ContainerDied","Data":"651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5"}
Mar 18 15:54:20 crc kubenswrapper[4939]: I0318 15:54:20.329266 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7jt69" event={"ID":"0f293932-41a4-4e41-9fcd-019c097522ff","Type":"ContainerStarted","Data":"345d783164ccd2dfc84ba5b756d56f73878d5ec9d39c74020a9d12c9bf8db350"}
Mar 18 15:54:20 crc kubenswrapper[4939]: I0318 15:54:20.329656 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7jt69"
Mar 18 15:54:20 crc kubenswrapper[4939]: I0318 15:54:20.331530 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" event={"ID":"9c36ff55-800c-42e3-918e-f73cbd98e252","Type":"ContainerStarted","Data":"7a317aa6f8dfd6f7d208518a89481504ec02cc32e17ab28da29b97ccbfe04d5a"}
Mar 18 15:54:20 crc kubenswrapper[4939]: I0318 15:54:20.331760 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml"
Mar 18 15:54:20 crc kubenswrapper[4939]: I0318 15:54:20.333286 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" event={"ID":"457e3dd2-84a3-47b0-a023-affddd9bd954","Type":"ContainerStarted","Data":"75710fc970a0cd6e3753741856473f5e1bbd6fad71631089ed9fe33401fe6940"}
Mar 18 15:54:20 crc kubenswrapper[4939]: I0318 15:54:20.354783 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7jt69" podStartSLOduration=1.483247163 podStartE2EDuration="4.354765059s" podCreationTimestamp="2026-03-18 15:54:16 +0000 UTC" firstStartedPulling="2026-03-18 15:54:16.654697207 +0000 UTC m=+1021.253884828" lastFinishedPulling="2026-03-18 15:54:19.526215103 +0000 UTC m=+1024.125402724" observedRunningTime="2026-03-18 15:54:20.344905985 +0000 UTC m=+1024.944093606" watchObservedRunningTime="2026-03-18 15:54:20.354765059 +0000 UTC m=+1024.953952670"
Mar 18 15:54:20 crc kubenswrapper[4939]: I0318 15:54:20.371656 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" podStartSLOduration=2.227184546 podStartE2EDuration="4.371630507s" podCreationTimestamp="2026-03-18 15:54:16 +0000 UTC" firstStartedPulling="2026-03-18 15:54:17.375343266 +0000 UTC m=+1021.974530897" lastFinishedPulling="2026-03-18 15:54:19.519789237 +0000 UTC m=+1024.118976858" observedRunningTime="2026-03-18 15:54:20.364888582 +0000 UTC m=+1024.964076193" watchObservedRunningTime="2026-03-18 15:54:20.371630507 +0000 UTC m=+1024.970818128"
Mar 18 15:54:21 crc kubenswrapper[4939]: I0318 15:54:21.343212 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" event={"ID":"d5a366f8-31e3-4afd-81c7-bb78a39c5ded","Type":"ContainerStarted","Data":"4c4c887c186f447aec28e15954d70765d923409c07ac80193f503c56c2b29906"}
Mar 18 15:54:21 crc kubenswrapper[4939]: I0318 15:54:21.347389 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsrc5" event={"ID":"b967d15a-5688-4ce0-8d43-7e5d9c57be4c","Type":"ContainerStarted","Data":"7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc"}
Mar 18 15:54:21 crc kubenswrapper[4939]: I0318 15:54:21.367202 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gxhtz" podStartSLOduration=2.283968256 podStartE2EDuration="5.367176267s" podCreationTimestamp="2026-03-18 15:54:16 +0000 UTC" firstStartedPulling="2026-03-18 15:54:17.480380481 +0000 UTC m=+1022.079568102" lastFinishedPulling="2026-03-18 15:54:20.563588492 +0000 UTC m=+1025.162776113" observedRunningTime="2026-03-18 15:54:21.356806827 +0000 UTC m=+1025.955994458" watchObservedRunningTime="2026-03-18 15:54:21.367176267 +0000 UTC m=+1025.966363918"
Mar 18 15:54:23 crc kubenswrapper[4939]: I0318 15:54:23.372423 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" event={"ID":"457e3dd2-84a3-47b0-a023-affddd9bd954","Type":"ContainerStarted","Data":"04a59d4d429da7d8f69910b09cd8680b853575a06d2fe39d84389c6397f29269"}
Mar 18 15:54:23 crc kubenswrapper[4939]: I0318 15:54:23.411152 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rsrc5" podStartSLOduration=5.151411775 podStartE2EDuration="8.411127676s" podCreationTimestamp="2026-03-18 15:54:15 +0000 UTC" firstStartedPulling="2026-03-18 15:54:17.301171103 +0000 UTC m=+1021.900358724" lastFinishedPulling="2026-03-18 15:54:20.560886994 +0000 UTC m=+1025.160074625" observedRunningTime="2026-03-18 15:54:21.391773617 +0000 UTC m=+1025.990961248" watchObservedRunningTime="2026-03-18 15:54:23.411127676 +0000 UTC m=+1028.010315307"
Mar 18 15:54:23 crc kubenswrapper[4939]: I0318 15:54:23.413358 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-q7vcf" podStartSLOduration=1.727591552 podStartE2EDuration="7.41334883s" podCreationTimestamp="2026-03-18 15:54:16 +0000 UTC" firstStartedPulling="2026-03-18 15:54:16.738167598 +0000 UTC m=+1021.337355229" lastFinishedPulling="2026-03-18 15:54:22.423924886 +0000 UTC m=+1027.023112507" observedRunningTime="2026-03-18 15:54:23.392974692 +0000 UTC m=+1027.992162323" watchObservedRunningTime="2026-03-18 15:54:23.41334883 +0000 UTC m=+1028.012536461"
Mar 18 15:54:23 crc kubenswrapper[4939]: I0318 15:54:23.687224 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 15:54:23 crc kubenswrapper[4939]: I0318 15:54:23.687295 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.063329 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rsrc5"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.063391 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rsrc5"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.108469 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rsrc5"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.434718 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rsrc5"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.600683 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7jt69"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.881924 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6c7484d85-nfrz6"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.882018 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c7484d85-nfrz6"
Mar 18 15:54:26 crc kubenswrapper[4939]: I0318 15:54:26.889155 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c7484d85-nfrz6"
Mar 18 15:54:27 crc kubenswrapper[4939]: I0318 15:54:27.134032 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rsrc5"]
Mar 18 15:54:27 crc kubenswrapper[4939]: I0318 15:54:27.424708 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c7484d85-nfrz6"
Mar 18 15:54:27 crc kubenswrapper[4939]: I0318 15:54:27.506598 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nqdwc"]
Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.425845 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rsrc5" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="registry-server" containerID="cri-o://7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc" gracePeriod=2
Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.815574 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rsrc5"
Need to start a new one" pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.881765 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf4ll\" (UniqueName: \"kubernetes.io/projected/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-kube-api-access-lf4ll\") pod \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.881862 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-catalog-content\") pod \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.882019 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-utilities\") pod \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\" (UID: \"b967d15a-5688-4ce0-8d43-7e5d9c57be4c\") " Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.883239 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-utilities" (OuterVolumeSpecName: "utilities") pod "b967d15a-5688-4ce0-8d43-7e5d9c57be4c" (UID: "b967d15a-5688-4ce0-8d43-7e5d9c57be4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.890737 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-kube-api-access-lf4ll" (OuterVolumeSpecName: "kube-api-access-lf4ll") pod "b967d15a-5688-4ce0-8d43-7e5d9c57be4c" (UID: "b967d15a-5688-4ce0-8d43-7e5d9c57be4c"). InnerVolumeSpecName "kube-api-access-lf4ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.984474 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf4ll\" (UniqueName: \"kubernetes.io/projected/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-kube-api-access-lf4ll\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:28 crc kubenswrapper[4939]: I0318 15:54:28.984575 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.435893 4939 generic.go:334] "Generic (PLEG): container finished" podID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerID="7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc" exitCode=0 Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.436230 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsrc5" event={"ID":"b967d15a-5688-4ce0-8d43-7e5d9c57be4c","Type":"ContainerDied","Data":"7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc"} Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.436257 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsrc5" event={"ID":"b967d15a-5688-4ce0-8d43-7e5d9c57be4c","Type":"ContainerDied","Data":"5c5b817b7c672b0f766ec7354133de544c512f6186863cf347d9d1b2b1addd90"} Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.436273 4939 scope.go:117] "RemoveContainer" containerID="7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.436372 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rsrc5" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.469609 4939 scope.go:117] "RemoveContainer" containerID="651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.493595 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b967d15a-5688-4ce0-8d43-7e5d9c57be4c" (UID: "b967d15a-5688-4ce0-8d43-7e5d9c57be4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.495488 4939 scope.go:117] "RemoveContainer" containerID="5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.519610 4939 scope.go:117] "RemoveContainer" containerID="7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc" Mar 18 15:54:29 crc kubenswrapper[4939]: E0318 15:54:29.520070 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc\": container with ID starting with 7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc not found: ID does not exist" containerID="7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.520111 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc"} err="failed to get container status \"7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc\": rpc error: code = NotFound desc = could not find container \"7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc\": container with ID starting with 7a9b17c3716cd01aff4f8e9d7cf8ecf1aa6200dc5ae5f8dd00ee08f3a41e4ecc not found: ID does not exist" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.520135 4939 scope.go:117] "RemoveContainer" containerID="651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5" Mar 18 15:54:29 crc kubenswrapper[4939]: E0318 15:54:29.520417 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5\": container with ID starting with 651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5 not found: ID does not exist" containerID="651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.520457 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5"} err="failed to get container status \"651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5\": rpc error: code = NotFound desc = could not find container \"651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5\": container with ID starting with 651be910115ba45229bd646c4aaf8e1938bb15e930e9e853d4e060e5f72ecaf5 not found: ID does not exist" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.520482 4939 scope.go:117] "RemoveContainer" containerID="5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445" Mar 18 15:54:29 crc kubenswrapper[4939]: E0318 15:54:29.520834 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445\": container with ID starting with 5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445 not found: ID does not exist" containerID="5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.520863 4939 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445"} err="failed to get container status \"5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445\": rpc error: code = NotFound desc = could not find container \"5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445\": container with ID starting with 5556db73764f5d6d967c77758b8b6b51365999da2d0c26ceb8c5173361ad9445 not found: ID does not exist" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.594180 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b967d15a-5688-4ce0-8d43-7e5d9c57be4c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.791648 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rsrc5"] Mar 18 15:54:29 crc kubenswrapper[4939]: I0318 15:54:29.796976 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rsrc5"] Mar 18 15:54:30 crc kubenswrapper[4939]: I0318 15:54:30.145160 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" path="/var/lib/kubelet/pods/b967d15a-5688-4ce0-8d43-7e5d9c57be4c/volumes" Mar 18 15:54:37 crc kubenswrapper[4939]: I0318 15:54:37.148216 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-k5fml" Mar 18 15:54:37 crc kubenswrapper[4939]: I0318 15:54:37.793836 4939 scope.go:117] "RemoveContainer" containerID="ad94198ca87be1d20d9c756385962f09f0ca08a813c461d3b89ff528e7b56e0f" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.212913 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq"] Mar 18 15:54:50 crc kubenswrapper[4939]: E0318 15:54:50.213638 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="registry-server" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.213653 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="registry-server" Mar 18 15:54:50 crc kubenswrapper[4939]: E0318 15:54:50.213680 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="extract-utilities" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.213688 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="extract-utilities" Mar 18 15:54:50 crc kubenswrapper[4939]: E0318 15:54:50.213718 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="extract-content" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.213734 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="extract-content" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.213868 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b967d15a-5688-4ce0-8d43-7e5d9c57be4c" containerName="registry-server" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.214784 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.217426 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.224428 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq"] Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.380644 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.380699 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v55mc\" (UniqueName: \"kubernetes.io/projected/b7aa7c31-17c0-4b52-a694-bb74e34749a3-kube-api-access-v55mc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.380796 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.482092 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.482205 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.482276 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v55mc\" (UniqueName: \"kubernetes.io/projected/b7aa7c31-17c0-4b52-a694-bb74e34749a3-kube-api-access-v55mc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.482978 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.483059 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.522254 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v55mc\" (UniqueName: \"kubernetes.io/projected/b7aa7c31-17c0-4b52-a694-bb74e34749a3-kube-api-access-v55mc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:50 crc kubenswrapper[4939]: I0318 15:54:50.574104 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:51 crc kubenswrapper[4939]: I0318 15:54:51.011573 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq"] Mar 18 15:54:51 crc kubenswrapper[4939]: I0318 15:54:51.596533 4939 generic.go:334] "Generic (PLEG): container finished" podID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerID="990f3d2807c3955843988d418a3d8019ba7095bdf5b85e3e0f28c9bf64552945" exitCode=0 Mar 18 15:54:51 crc kubenswrapper[4939]: I0318 15:54:51.596740 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" event={"ID":"b7aa7c31-17c0-4b52-a694-bb74e34749a3","Type":"ContainerDied","Data":"990f3d2807c3955843988d418a3d8019ba7095bdf5b85e3e0f28c9bf64552945"} Mar 18 15:54:51 crc kubenswrapper[4939]: I0318 15:54:51.597769 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" event={"ID":"b7aa7c31-17c0-4b52-a694-bb74e34749a3","Type":"ContainerStarted","Data":"a175c21656dcc6d85107484d887639c6a35afc6d66bb4d925ff8adbf70289126"} Mar 18 15:54:52 crc kubenswrapper[4939]: I0318 15:54:52.576264 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nqdwc" podUID="7a2bfef5-fef2-4e27-9749-53ea69f13c0f" containerName="console" containerID="cri-o://4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52" gracePeriod=15 Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.011251 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nqdwc_7a2bfef5-fef2-4e27-9749-53ea69f13c0f/console/0.log" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.011686 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116002 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-trusted-ca-bundle\") pod \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116114 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-oauth-serving-cert\") pod \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116187 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-oauth-config\") pod \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116236 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cpzz\" (UniqueName: \"kubernetes.io/projected/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-kube-api-access-8cpzz\") pod \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116288 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-serving-cert\") pod \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116334 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-config\") pod \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116401 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-service-ca\") pod \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\" (UID: \"7a2bfef5-fef2-4e27-9749-53ea69f13c0f\") " Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.116852 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7a2bfef5-fef2-4e27-9749-53ea69f13c0f" (UID: "7a2bfef5-fef2-4e27-9749-53ea69f13c0f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.117521 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7a2bfef5-fef2-4e27-9749-53ea69f13c0f" (UID: "7a2bfef5-fef2-4e27-9749-53ea69f13c0f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.117688 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-config" (OuterVolumeSpecName: "console-config") pod "7a2bfef5-fef2-4e27-9749-53ea69f13c0f" (UID: "7a2bfef5-fef2-4e27-9749-53ea69f13c0f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.117912 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-service-ca" (OuterVolumeSpecName: "service-ca") pod "7a2bfef5-fef2-4e27-9749-53ea69f13c0f" (UID: "7a2bfef5-fef2-4e27-9749-53ea69f13c0f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.122091 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7a2bfef5-fef2-4e27-9749-53ea69f13c0f" (UID: "7a2bfef5-fef2-4e27-9749-53ea69f13c0f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.122688 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7a2bfef5-fef2-4e27-9749-53ea69f13c0f" (UID: "7a2bfef5-fef2-4e27-9749-53ea69f13c0f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.122693 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-kube-api-access-8cpzz" (OuterVolumeSpecName: "kube-api-access-8cpzz") pod "7a2bfef5-fef2-4e27-9749-53ea69f13c0f" (UID: "7a2bfef5-fef2-4e27-9749-53ea69f13c0f"). InnerVolumeSpecName "kube-api-access-8cpzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.218266 4939 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.218430 4939 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.218637 4939 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.218678 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cpzz\" (UniqueName: \"kubernetes.io/projected/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-kube-api-access-8cpzz\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.218703 4939 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.218728 4939 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.218756 4939 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a2bfef5-fef2-4e27-9749-53ea69f13c0f-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.636731 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nqdwc_7a2bfef5-fef2-4e27-9749-53ea69f13c0f/console/0.log" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.636821 4939 generic.go:334] "Generic (PLEG): container finished" podID="7a2bfef5-fef2-4e27-9749-53ea69f13c0f" containerID="4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52" exitCode=2 Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.636917 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nqdwc" event={"ID":"7a2bfef5-fef2-4e27-9749-53ea69f13c0f","Type":"ContainerDied","Data":"4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52"} Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.636942 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nqdwc" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.636978 4939 scope.go:117] "RemoveContainer" containerID="4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.636959 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nqdwc" event={"ID":"7a2bfef5-fef2-4e27-9749-53ea69f13c0f","Type":"ContainerDied","Data":"f83fa5e0600882622f51c8118aad18f38b86beb089f8424f40b48b71e7ce0d15"} Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.647083 4939 generic.go:334] "Generic (PLEG): container finished" podID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerID="13f62d99da66a5e0ee5d75e1dad14ca7651296115b5117a1f86279e404851302" exitCode=0 Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.647163 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" event={"ID":"b7aa7c31-17c0-4b52-a694-bb74e34749a3","Type":"ContainerDied","Data":"13f62d99da66a5e0ee5d75e1dad14ca7651296115b5117a1f86279e404851302"} Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.671983 4939 scope.go:117] "RemoveContainer" containerID="4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52" Mar 18 15:54:53 crc kubenswrapper[4939]: E0318 15:54:53.673123 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52\": container with ID starting with 4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52 not found: ID does not exist" containerID="4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.673212 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52"} err="failed to get container status \"4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52\": rpc error: code = NotFound desc = could not find container \"4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52\": container with ID starting with 4e3d0ae3f7b8c595d338dee1c7be9200e60b63a29ab5ee15fdcaf50d135c3e52 not found: ID does not exist" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.687214 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.687278 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.700706 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nqdwc"] Mar 18 15:54:53 crc kubenswrapper[4939]: I0318 15:54:53.709128 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nqdwc"] Mar 18 15:54:54 crc kubenswrapper[4939]: I0318 
15:54:54.143352 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2bfef5-fef2-4e27-9749-53ea69f13c0f" path="/var/lib/kubelet/pods/7a2bfef5-fef2-4e27-9749-53ea69f13c0f/volumes" Mar 18 15:54:54 crc kubenswrapper[4939]: I0318 15:54:54.658385 4939 generic.go:334] "Generic (PLEG): container finished" podID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerID="280edf6860c3868eb8c1470ed9df264ef04af621be05d68a4ad600c6343ebbe6" exitCode=0 Mar 18 15:54:54 crc kubenswrapper[4939]: I0318 15:54:54.658419 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" event={"ID":"b7aa7c31-17c0-4b52-a694-bb74e34749a3","Type":"ContainerDied","Data":"280edf6860c3868eb8c1470ed9df264ef04af621be05d68a4ad600c6343ebbe6"} Mar 18 15:54:55 crc kubenswrapper[4939]: I0318 15:54:55.913872 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.057976 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-bundle\") pod \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.058236 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-util\") pod \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.058434 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v55mc\" (UniqueName: \"kubernetes.io/projected/b7aa7c31-17c0-4b52-a694-bb74e34749a3-kube-api-access-v55mc\") pod \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\" (UID: \"b7aa7c31-17c0-4b52-a694-bb74e34749a3\") " Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.059672 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-bundle" (OuterVolumeSpecName: "bundle") pod "b7aa7c31-17c0-4b52-a694-bb74e34749a3" (UID: "b7aa7c31-17c0-4b52-a694-bb74e34749a3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.064563 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7aa7c31-17c0-4b52-a694-bb74e34749a3-kube-api-access-v55mc" (OuterVolumeSpecName: "kube-api-access-v55mc") pod "b7aa7c31-17c0-4b52-a694-bb74e34749a3" (UID: "b7aa7c31-17c0-4b52-a694-bb74e34749a3"). InnerVolumeSpecName "kube-api-access-v55mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.071727 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-util" (OuterVolumeSpecName: "util") pod "b7aa7c31-17c0-4b52-a694-bb74e34749a3" (UID: "b7aa7c31-17c0-4b52-a694-bb74e34749a3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.160161 4939 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.160194 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v55mc\" (UniqueName: \"kubernetes.io/projected/b7aa7c31-17c0-4b52-a694-bb74e34749a3-kube-api-access-v55mc\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.160204 4939 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7aa7c31-17c0-4b52-a694-bb74e34749a3-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.674746 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" event={"ID":"b7aa7c31-17c0-4b52-a694-bb74e34749a3","Type":"ContainerDied","Data":"a175c21656dcc6d85107484d887639c6a35afc6d66bb4d925ff8adbf70289126"} Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.674795 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a175c21656dcc6d85107484d887639c6a35afc6d66bb4d925ff8adbf70289126" Mar 18 15:54:56 crc kubenswrapper[4939]: I0318 15:54:56.674815 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.518080 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl"] Mar 18 15:55:05 crc kubenswrapper[4939]: E0318 15:55:05.518747 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2bfef5-fef2-4e27-9749-53ea69f13c0f" containerName="console" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.518759 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2bfef5-fef2-4e27-9749-53ea69f13c0f" containerName="console" Mar 18 15:55:05 crc kubenswrapper[4939]: E0318 15:55:05.518767 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerName="extract" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.518772 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerName="extract" Mar 18 15:55:05 crc kubenswrapper[4939]: E0318 15:55:05.518786 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerName="util" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.518792 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerName="util" Mar 18 15:55:05 crc kubenswrapper[4939]: E0318 15:55:05.518804 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerName="pull" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.518809 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerName="pull" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.518911 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7aa7c31-17c0-4b52-a694-bb74e34749a3" containerName="extract" Mar 
18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.518925 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2bfef5-fef2-4e27-9749-53ea69f13c0f" containerName="console" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.519323 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.524686 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.525028 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.525668 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.525676 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.525743 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5bh54" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.542237 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl"] Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.608927 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4txx\" (UniqueName: \"kubernetes.io/projected/50ba9fec-3821-4c3d-b45d-560924327333-kube-api-access-h4txx\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.608984 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50ba9fec-3821-4c3d-b45d-560924327333-webhook-cert\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.609033 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50ba9fec-3821-4c3d-b45d-560924327333-apiservice-cert\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.710265 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4txx\" (UniqueName: \"kubernetes.io/projected/50ba9fec-3821-4c3d-b45d-560924327333-kube-api-access-h4txx\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.710550 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/50ba9fec-3821-4c3d-b45d-560924327333-webhook-cert\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.710667 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50ba9fec-3821-4c3d-b45d-560924327333-apiservice-cert\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.717420 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50ba9fec-3821-4c3d-b45d-560924327333-apiservice-cert\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.727986 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4txx\" (UniqueName: \"kubernetes.io/projected/50ba9fec-3821-4c3d-b45d-560924327333-kube-api-access-h4txx\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.729602 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50ba9fec-3821-4c3d-b45d-560924327333-webhook-cert\") pod \"metallb-operator-controller-manager-5f4f95c589-d4mxl\" (UID: \"50ba9fec-3821-4c3d-b45d-560924327333\") " pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.735751 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks"] Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.736719 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.741066 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.741118 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.742235 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5wndp" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.755354 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks"] Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.811402 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-apiservice-cert\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.811459 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-webhook-cert\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.811491 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pblqd\" (UniqueName: \"kubernetes.io/projected/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-kube-api-access-pblqd\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.832848 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.912921 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-apiservice-cert\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.913281 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-webhook-cert\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.913321 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pblqd\" (UniqueName: \"kubernetes.io/projected/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-kube-api-access-pblqd\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.916982 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-apiservice-cert\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.921007 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-webhook-cert\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:05 crc kubenswrapper[4939]: I0318 15:55:05.933938 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pblqd\" (UniqueName: \"kubernetes.io/projected/cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7-kube-api-access-pblqd\") pod \"metallb-operator-webhook-server-6c79d7b54b-47sks\" (UID: \"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7\") " pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:06 crc kubenswrapper[4939]: I0318 15:55:06.023444 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl"] Mar 18 15:55:06 crc kubenswrapper[4939]: I0318 15:55:06.080173 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:06 crc kubenswrapper[4939]: I0318 15:55:06.326437 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks"] Mar 18 15:55:06 crc kubenswrapper[4939]: W0318 15:55:06.332244 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcff68ac1_e2a3_4bbe_b617_0f8bf923e0e7.slice/crio-50c31768fd5eb785f7d1c1c793f0f0ab120f73cf4ef7f64174225afcc2a7cdde WatchSource:0}: Error finding container 50c31768fd5eb785f7d1c1c793f0f0ab120f73cf4ef7f64174225afcc2a7cdde: Status 404 returned error can't find the container with id 50c31768fd5eb785f7d1c1c793f0f0ab120f73cf4ef7f64174225afcc2a7cdde Mar 18 15:55:06 crc kubenswrapper[4939]: I0318 15:55:06.729960 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" event={"ID":"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7","Type":"ContainerStarted","Data":"50c31768fd5eb785f7d1c1c793f0f0ab120f73cf4ef7f64174225afcc2a7cdde"} Mar 18 15:55:06 crc kubenswrapper[4939]: I0318 15:55:06.731972 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" event={"ID":"50ba9fec-3821-4c3d-b45d-560924327333","Type":"ContainerStarted","Data":"501946c6f27046e34cfa0cb910b71c5bcd8deda0548c46432cbb47ec1561dfd5"} Mar 18 15:55:12 crc kubenswrapper[4939]: I0318 15:55:12.768166 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" event={"ID":"50ba9fec-3821-4c3d-b45d-560924327333","Type":"ContainerStarted","Data":"6005643e6f9cee6cadc6160ae73ab2dd7b091d0ceecbca015da6f06af334bb79"} Mar 18 15:55:12 crc kubenswrapper[4939]: I0318 15:55:12.768901 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:12 crc kubenswrapper[4939]: I0318 15:55:12.770220 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" event={"ID":"cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7","Type":"ContainerStarted","Data":"9b733a57e42942e7d580701c350802329ba42ef954004f3d3d3b8cd1dfe05a19"} Mar 18 15:55:12 crc kubenswrapper[4939]: I0318 15:55:12.770537 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:12 crc kubenswrapper[4939]: I0318 15:55:12.803776 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" podStartSLOduration=2.05074648 podStartE2EDuration="7.80374416s" podCreationTimestamp="2026-03-18 15:55:05 +0000 UTC" firstStartedPulling="2026-03-18 15:55:06.033641916 +0000 UTC m=+1070.632829537" lastFinishedPulling="2026-03-18 15:55:11.786639596 +0000 UTC m=+1076.385827217" observedRunningTime="2026-03-18 15:55:12.795079889 +0000 UTC m=+1077.394267540" watchObservedRunningTime="2026-03-18 15:55:12.80374416 +0000 UTC m=+1077.402931801" Mar 18 15:55:12 crc kubenswrapper[4939]: I0318 15:55:12.831155 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" podStartSLOduration=2.37439451 podStartE2EDuration="7.831136161s" 
podCreationTimestamp="2026-03-18 15:55:05 +0000 UTC" firstStartedPulling="2026-03-18 15:55:06.336333761 +0000 UTC m=+1070.935521392" lastFinishedPulling="2026-03-18 15:55:11.793075432 +0000 UTC m=+1076.392263043" observedRunningTime="2026-03-18 15:55:12.830456441 +0000 UTC m=+1077.429644082" watchObservedRunningTime="2026-03-18 15:55:12.831136161 +0000 UTC m=+1077.430323782" Mar 18 15:55:23 crc kubenswrapper[4939]: I0318 15:55:23.687342 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:55:23 crc kubenswrapper[4939]: I0318 15:55:23.687718 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:55:23 crc kubenswrapper[4939]: I0318 15:55:23.687770 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:55:23 crc kubenswrapper[4939]: I0318 15:55:23.688293 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e0c9746e62b2cdabcce5b37ee8d4d9dea82e474357d37ce8b290ec31b2fa0e2"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:55:23 crc kubenswrapper[4939]: I0318 15:55:23.688351 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://3e0c9746e62b2cdabcce5b37ee8d4d9dea82e474357d37ce8b290ec31b2fa0e2" gracePeriod=600 Mar 18 15:55:24 crc kubenswrapper[4939]: I0318 15:55:24.995483 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="3e0c9746e62b2cdabcce5b37ee8d4d9dea82e474357d37ce8b290ec31b2fa0e2" exitCode=0 Mar 18 15:55:24 crc kubenswrapper[4939]: I0318 15:55:24.995548 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"3e0c9746e62b2cdabcce5b37ee8d4d9dea82e474357d37ce8b290ec31b2fa0e2"} Mar 18 15:55:24 crc kubenswrapper[4939]: I0318 15:55:24.995991 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"9b2c0c563781371d0ba976cd9bb94c7c42e1c32c9519396b89c4802c0c2c6efa"} Mar 18 15:55:24 crc kubenswrapper[4939]: I0318 15:55:24.996018 4939 scope.go:117] "RemoveContainer" containerID="72b32c6e644eabdabe4212d9986a992938a1f92c3ed813591886f43325ae5cf0" Mar 18 15:55:26 crc kubenswrapper[4939]: I0318 15:55:26.085264 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6c79d7b54b-47sks" Mar 18 15:55:45 crc kubenswrapper[4939]: I0318 15:55:45.836645 4939 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5f4f95c589-d4mxl" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.514798 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-62xrf"] Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.517966 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.519077 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf"] Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.519752 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.519896 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.521853 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.522025 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jc4nk" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.522563 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.543344 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf"] Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.592254 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jn4nh"] Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.593289 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.595704 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.595782 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gzhrz" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.595890 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.596054 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.621082 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-2mcg4"] Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.622156 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.626183 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.633074 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-2mcg4"] Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688014 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bjh9\" (UniqueName: \"kubernetes.io/projected/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-kube-api-access-9bjh9\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688088 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-metallb-excludel2\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688118 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-sockets\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688136 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-startup\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688164 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dffd9132-ad62-4775-926e-c604222dab03-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-mfplf\" (UID: \"dffd9132-ad62-4775-926e-c604222dab03\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688321 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688368 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-metrics-certs\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688389 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg9ft\" (UniqueName: \"kubernetes.io/projected/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-kube-api-access-lg9ft\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: 
I0318 15:55:46.688412 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-conf\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688426 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x652l\" (UniqueName: \"kubernetes.io/projected/dffd9132-ad62-4775-926e-c604222dab03-kube-api-access-x652l\") pod \"frr-k8s-webhook-server-bcc4b6f68-mfplf\" (UID: \"dffd9132-ad62-4775-926e-c604222dab03\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688454 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-metrics-certs\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688522 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-metrics\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.688553 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-reloader\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789319 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dffd9132-ad62-4775-926e-c604222dab03-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-mfplf\" (UID: \"dffd9132-ad62-4775-926e-c604222dab03\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789394 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bddd9413-cc2e-4e90-86b1-132dac143d2f-metrics-certs\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789437 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789459 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-metrics-certs\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789474 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lg9ft\" (UniqueName: \"kubernetes.io/projected/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-kube-api-access-lg9ft\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789492 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x652l\" (UniqueName: \"kubernetes.io/projected/dffd9132-ad62-4775-926e-c604222dab03-kube-api-access-x652l\") pod \"frr-k8s-webhook-server-bcc4b6f68-mfplf\" (UID: \"dffd9132-ad62-4775-926e-c604222dab03\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789522 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-conf\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789543 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-metrics-certs\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789566 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bddd9413-cc2e-4e90-86b1-132dac143d2f-cert\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789585 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-metrics\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789605 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-reloader\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789623 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9sbf\" (UniqueName: \"kubernetes.io/projected/bddd9413-cc2e-4e90-86b1-132dac143d2f-kube-api-access-x9sbf\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789641 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bjh9\" (UniqueName: \"kubernetes.io/projected/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-kube-api-access-9bjh9\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789656 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-metallb-excludel2\") pod \"speaker-jn4nh\" 
(UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789670 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-sockets\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.789684 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-startup\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.790530 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-startup\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.791590 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-metrics\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.791833 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-reloader\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: E0318 15:55:46.791904 4939 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 15:55:46 crc kubenswrapper[4939]: E0318 15:55:46.791945 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist podName:aa0441fc-cc4b-4e64-ad17-79b9760c51cb nodeName:}" failed. No retries permitted until 2026-03-18 15:55:47.291933113 +0000 UTC m=+1111.891120734 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist") pod "speaker-jn4nh" (UID: "aa0441fc-cc4b-4e64-ad17-79b9760c51cb") : secret "metallb-memberlist" not found Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.792282 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-metallb-excludel2\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.792327 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-sockets\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.792786 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-frr-conf\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.796141 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-metrics-certs\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.796180 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-metrics-certs\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.797373 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dffd9132-ad62-4775-926e-c604222dab03-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-mfplf\" (UID: \"dffd9132-ad62-4775-926e-c604222dab03\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.809350 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg9ft\" (UniqueName: \"kubernetes.io/projected/06ab61f3-deb8-4e62-b8a2-f5b879ab398b-kube-api-access-lg9ft\") pod \"frr-k8s-62xrf\" (UID: \"06ab61f3-deb8-4e62-b8a2-f5b879ab398b\") " pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.814125 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bjh9\" (UniqueName: \"kubernetes.io/projected/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-kube-api-access-9bjh9\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.814397 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x652l\" (UniqueName: \"kubernetes.io/projected/dffd9132-ad62-4775-926e-c604222dab03-kube-api-access-x652l\") pod \"frr-k8s-webhook-server-bcc4b6f68-mfplf\" (UID: \"dffd9132-ad62-4775-926e-c604222dab03\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 
15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.841764 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.862246 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.892867 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bddd9413-cc2e-4e90-86b1-132dac143d2f-cert\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.892931 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9sbf\" (UniqueName: \"kubernetes.io/projected/bddd9413-cc2e-4e90-86b1-132dac143d2f-kube-api-access-x9sbf\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.892964 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bddd9413-cc2e-4e90-86b1-132dac143d2f-metrics-certs\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.895631 4939 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.896320 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bddd9413-cc2e-4e90-86b1-132dac143d2f-metrics-certs\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.908994 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bddd9413-cc2e-4e90-86b1-132dac143d2f-cert\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.914294 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9sbf\" (UniqueName: \"kubernetes.io/projected/bddd9413-cc2e-4e90-86b1-132dac143d2f-kube-api-access-x9sbf\") pod \"controller-7bb4cc7c98-2mcg4\" (UID: \"bddd9413-cc2e-4e90-86b1-132dac143d2f\") " pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:46 crc kubenswrapper[4939]: I0318 15:55:46.941017 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:47 crc kubenswrapper[4939]: I0318 15:55:47.148610 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-2mcg4"] Mar 18 15:55:47 crc kubenswrapper[4939]: W0318 15:55:47.157880 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbddd9413_cc2e_4e90_86b1_132dac143d2f.slice/crio-8b92368a94e65fe066bce57d0d906f828bb0b0a3531d39b57af181ee7b8df00a WatchSource:0}: Error finding container 8b92368a94e65fe066bce57d0d906f828bb0b0a3531d39b57af181ee7b8df00a: Status 404 returned error can't find the container with id 8b92368a94e65fe066bce57d0d906f828bb0b0a3531d39b57af181ee7b8df00a Mar 18 15:55:47 crc kubenswrapper[4939]: I0318 15:55:47.158745 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerStarted","Data":"90af3a3f8982adf119bca1beb65223d5735181d138d7d5e282dd3b723c079c74"} Mar 18 15:55:47 crc kubenswrapper[4939]: I0318 15:55:47.257766 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf"] Mar 18 15:55:47 crc kubenswrapper[4939]: W0318 15:55:47.260037 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddffd9132_ad62_4775_926e_c604222dab03.slice/crio-946c3d67dde90086ef32aa1306bd7d735fce09949c775fef8c7c25b4533fa278 WatchSource:0}: Error finding container 946c3d67dde90086ef32aa1306bd7d735fce09949c775fef8c7c25b4533fa278: Status 404 returned error can't find the container with id 946c3d67dde90086ef32aa1306bd7d735fce09949c775fef8c7c25b4533fa278 Mar 18 15:55:47 crc kubenswrapper[4939]: I0318 15:55:47.298655 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:47 crc kubenswrapper[4939]: E0318 15:55:47.298820 4939 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 15:55:47 crc kubenswrapper[4939]: E0318 15:55:47.298890 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist podName:aa0441fc-cc4b-4e64-ad17-79b9760c51cb nodeName:}" failed. No retries permitted until 2026-03-18 15:55:48.298875024 +0000 UTC m=+1112.898062645 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist") pod "speaker-jn4nh" (UID: "aa0441fc-cc4b-4e64-ad17-79b9760c51cb") : secret "metallb-memberlist" not found Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.167983 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2mcg4" event={"ID":"bddd9413-cc2e-4e90-86b1-132dac143d2f","Type":"ContainerStarted","Data":"31192f1326f024c5794288e5e73eea562fa4624116839c731ae1cbdbaaf90cf8"} Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.168237 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-2mcg4" Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.168250 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2mcg4" event={"ID":"bddd9413-cc2e-4e90-86b1-132dac143d2f","Type":"ContainerStarted","Data":"f102e9486909662ecbcf7e8ab871c5566244f0e4751b10ac284260d33fdab818"} Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.168263 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2mcg4" event={"ID":"bddd9413-cc2e-4e90-86b1-132dac143d2f","Type":"ContainerStarted","Data":"8b92368a94e65fe066bce57d0d906f828bb0b0a3531d39b57af181ee7b8df00a"} Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.169992 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" event={"ID":"dffd9132-ad62-4775-926e-c604222dab03","Type":"ContainerStarted","Data":"946c3d67dde90086ef32aa1306bd7d735fce09949c775fef8c7c25b4533fa278"} Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.312382 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.324088 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa0441fc-cc4b-4e64-ad17-79b9760c51cb-memberlist\") pod \"speaker-jn4nh\" (UID: \"aa0441fc-cc4b-4e64-ad17-79b9760c51cb\") " pod="metallb-system/speaker-jn4nh" Mar 18 15:55:48 crc kubenswrapper[4939]: I0318 15:55:48.417940 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jn4nh" Mar 18 15:55:48 crc kubenswrapper[4939]: W0318 15:55:48.447187 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0441fc_cc4b_4e64_ad17_79b9760c51cb.slice/crio-07ec781153f9dc9ca44ab4fed7232fa9ec73d3c67a44a007a429b8df50e293fe WatchSource:0}: Error finding container 07ec781153f9dc9ca44ab4fed7232fa9ec73d3c67a44a007a429b8df50e293fe: Status 404 returned error can't find the container with id 07ec781153f9dc9ca44ab4fed7232fa9ec73d3c67a44a007a429b8df50e293fe Mar 18 15:55:49 crc kubenswrapper[4939]: I0318 15:55:49.183247 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jn4nh" event={"ID":"aa0441fc-cc4b-4e64-ad17-79b9760c51cb","Type":"ContainerStarted","Data":"3e169c70c448c26bcd128149f2146da21ec6b5ca06d4f7b7f0f26b49c4d567f0"} Mar 18 15:55:49 crc kubenswrapper[4939]: I0318 15:55:49.183312 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jn4nh" event={"ID":"aa0441fc-cc4b-4e64-ad17-79b9760c51cb","Type":"ContainerStarted","Data":"379088c7e3d99ce2d19c89478a43a00cf628343512e5bfb66eab9d298f2eee40"} Mar 18 15:55:49 crc kubenswrapper[4939]: I0318 15:55:49.183324 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jn4nh" event={"ID":"aa0441fc-cc4b-4e64-ad17-79b9760c51cb","Type":"ContainerStarted","Data":"07ec781153f9dc9ca44ab4fed7232fa9ec73d3c67a44a007a429b8df50e293fe"} Mar 18 15:55:49 crc kubenswrapper[4939]: I0318 15:55:49.183554 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jn4nh" Mar 18 15:55:49 crc kubenswrapper[4939]: I0318 15:55:49.202882 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-2mcg4" podStartSLOduration=3.20286467 podStartE2EDuration="3.20286467s" podCreationTimestamp="2026-03-18 15:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:55:48.186413032 +0000 UTC m=+1112.785600693" watchObservedRunningTime="2026-03-18 15:55:49.20286467 +0000 UTC m=+1113.802052291" Mar 18 15:55:55 crc kubenswrapper[4939]: I0318 15:55:55.225644 4939 generic.go:334] "Generic (PLEG): container finished" podID="06ab61f3-deb8-4e62-b8a2-f5b879ab398b" containerID="a4e2d0b32e3a4eb6b3de01f9214518997881a20a19dfc15368abc525a7195614" exitCode=0 Mar 18 15:55:55 crc kubenswrapper[4939]: I0318 15:55:55.225707 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerDied","Data":"a4e2d0b32e3a4eb6b3de01f9214518997881a20a19dfc15368abc525a7195614"} Mar 18 15:55:55 crc kubenswrapper[4939]: I0318 15:55:55.227924 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" event={"ID":"dffd9132-ad62-4775-926e-c604222dab03","Type":"ContainerStarted","Data":"34bf84b342706aae228fc70945dd463b1d31a8283f364c3931e7315b80a98e1b"} Mar 18 15:55:55 crc kubenswrapper[4939]: I0318 15:55:55.228096 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" Mar 18 15:55:55 crc kubenswrapper[4939]: I0318 15:55:55.269473 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jn4nh" podStartSLOduration=9.269449482 
podStartE2EDuration="9.269449482s" podCreationTimestamp="2026-03-18 15:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:55:49.207275406 +0000 UTC m=+1113.806463037" watchObservedRunningTime="2026-03-18 15:55:55.269449482 +0000 UTC m=+1119.868637143" Mar 18 15:55:55 crc kubenswrapper[4939]: I0318 15:55:55.278019 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf" podStartSLOduration=1.884200248 podStartE2EDuration="9.278003325s" podCreationTimestamp="2026-03-18 15:55:46 +0000 UTC" firstStartedPulling="2026-03-18 15:55:47.263874318 +0000 UTC m=+1111.863061939" lastFinishedPulling="2026-03-18 15:55:54.657677375 +0000 UTC m=+1119.256865016" observedRunningTime="2026-03-18 15:55:55.2774655 +0000 UTC m=+1119.876653161" watchObservedRunningTime="2026-03-18 15:55:55.278003325 +0000 UTC m=+1119.877190946" Mar 18 15:55:56 crc kubenswrapper[4939]: I0318 15:55:56.238340 4939 generic.go:334] "Generic (PLEG): container finished" podID="06ab61f3-deb8-4e62-b8a2-f5b879ab398b" containerID="b8303f55287d9af4a73e9f18387f22654e01b3cfb2eef376e5c36d2c6bdd029c" exitCode=0 Mar 18 15:55:56 crc kubenswrapper[4939]: I0318 15:55:56.238567 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerDied","Data":"b8303f55287d9af4a73e9f18387f22654e01b3cfb2eef376e5c36d2c6bdd029c"} Mar 18 15:55:57 crc kubenswrapper[4939]: I0318 15:55:57.246920 4939 generic.go:334] "Generic (PLEG): container finished" podID="06ab61f3-deb8-4e62-b8a2-f5b879ab398b" containerID="4b45b3b0f527f17783c4035106c9a323ec339faa36fd23125d96bbf765073962" exitCode=0 Mar 18 15:55:57 crc kubenswrapper[4939]: I0318 15:55:57.247015 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerDied","Data":"4b45b3b0f527f17783c4035106c9a323ec339faa36fd23125d96bbf765073962"} Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.260718 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerStarted","Data":"a02f0e93f959b5c6dacd8cacd197e6b1a35c7a8424918dfae07cab6ff9a4002f"} Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.261296 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerStarted","Data":"71204c43a8bee64b8aca4e4d072f57cfceb2e9def37880b11e400a6531a6d1df"} Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.261311 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerStarted","Data":"bdbe7d527595ec4b8c82177e4c2384cbe1604e92231fcac73d6d6722935bf43c"} Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.261321 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerStarted","Data":"958c1cfe7200b0fd403ce8e4b9bfd0ef1f4149e72d74f1f5ad698b8bc39e81f6"} Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.261330 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" 
event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerStarted","Data":"e9697a24c3370f9a69530646ac3ec9cc147fd494a5f6885c2ef260431173e27c"} Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.261339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62xrf" event={"ID":"06ab61f3-deb8-4e62-b8a2-f5b879ab398b","Type":"ContainerStarted","Data":"19e9694444fb2eb1386a6c09d501cf56b1213dea8bc752bfd8e473f9c85d3df5"} Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.261367 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-62xrf" Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.280777 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-62xrf" podStartSLOduration=4.640100948 podStartE2EDuration="12.280763022s" podCreationTimestamp="2026-03-18 15:55:46 +0000 UTC" firstStartedPulling="2026-03-18 15:55:46.996946899 +0000 UTC m=+1111.596134540" lastFinishedPulling="2026-03-18 15:55:54.637608973 +0000 UTC m=+1119.236796614" observedRunningTime="2026-03-18 15:55:58.27859852 +0000 UTC m=+1122.877786161" watchObservedRunningTime="2026-03-18 15:55:58.280763022 +0000 UTC m=+1122.879950643" Mar 18 15:55:58 crc kubenswrapper[4939]: I0318 15:55:58.422147 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jn4nh" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.002655 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp"] Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.004222 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.006735 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.017868 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzcd7\" (UniqueName: \"kubernetes.io/projected/0460d1a9-3710-482c-89c5-00dd5c28da89-kube-api-access-fzcd7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.017956 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.017997 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.025871 4939 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp"] Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.118878 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.118952 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.119229 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzcd7\" (UniqueName: \"kubernetes.io/projected/0460d1a9-3710-482c-89c5-00dd5c28da89-kube-api-access-fzcd7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.119538 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.119833 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.157979 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8zqqn"] Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.161321 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzcd7\" (UniqueName: \"kubernetes.io/projected/0460d1a9-3710-482c-89c5-00dd5c28da89-kube-api-access-fzcd7\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.178871 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8zqqn"] Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.178958 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8zqqn" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.187712 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.187938 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.188025 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.220936 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5klm\" (UniqueName: \"kubernetes.io/projected/489dd963-1de2-4977-9d76-d9635ee4dc24-kube-api-access-j5klm\") pod \"auto-csr-approver-29564156-8zqqn\" (UID: \"489dd963-1de2-4977-9d76-d9635ee4dc24\") " pod="openshift-infra/auto-csr-approver-29564156-8zqqn" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.322339 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5klm\" (UniqueName: \"kubernetes.io/projected/489dd963-1de2-4977-9d76-d9635ee4dc24-kube-api-access-j5klm\") pod \"auto-csr-approver-29564156-8zqqn\" (UID: \"489dd963-1de2-4977-9d76-d9635ee4dc24\") " pod="openshift-infra/auto-csr-approver-29564156-8zqqn" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.331257 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.341038 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5klm\" (UniqueName: \"kubernetes.io/projected/489dd963-1de2-4977-9d76-d9635ee4dc24-kube-api-access-j5klm\") pod \"auto-csr-approver-29564156-8zqqn\" (UID: \"489dd963-1de2-4977-9d76-d9635ee4dc24\") " pod="openshift-infra/auto-csr-approver-29564156-8zqqn" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.519109 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8zqqn" Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.733042 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8zqqn"] Mar 18 15:56:00 crc kubenswrapper[4939]: W0318 15:56:00.739624 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod489dd963_1de2_4977_9d76_d9635ee4dc24.slice/crio-55336df594071074e86ef5c1b8ace23f05f8ca00f16440903c9f1ac361846d71 WatchSource:0}: Error finding container 55336df594071074e86ef5c1b8ace23f05f8ca00f16440903c9f1ac361846d71: Status 404 returned error can't find the container with id 55336df594071074e86ef5c1b8ace23f05f8ca00f16440903c9f1ac361846d71 Mar 18 15:56:00 crc kubenswrapper[4939]: I0318 15:56:00.747098 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp"] Mar 18 15:56:00 crc kubenswrapper[4939]: W0318 15:56:00.753710 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0460d1a9_3710_482c_89c5_00dd5c28da89.slice/crio-58c8d7940f4ea38984784fe104c3e1c729a65871f51032ed9223d0e69db21f3e WatchSource:0}: Error finding container 58c8d7940f4ea38984784fe104c3e1c729a65871f51032ed9223d0e69db21f3e: Status 404 returned error can't find the container with id 58c8d7940f4ea38984784fe104c3e1c729a65871f51032ed9223d0e69db21f3e Mar 18 15:56:01 crc kubenswrapper[4939]: I0318 15:56:01.288698 4939 generic.go:334] "Generic (PLEG): container finished" podID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerID="ececc48dd7ad88103d3dc5d7749674ec87f53a6b66baf492fe7bbbe4e35cf488" exitCode=0 Mar 18 15:56:01 crc kubenswrapper[4939]: I0318 15:56:01.288748 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" event={"ID":"0460d1a9-3710-482c-89c5-00dd5c28da89","Type":"ContainerDied","Data":"ececc48dd7ad88103d3dc5d7749674ec87f53a6b66baf492fe7bbbe4e35cf488"} Mar 18 15:56:01 crc kubenswrapper[4939]: I0318 15:56:01.289083 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" event={"ID":"0460d1a9-3710-482c-89c5-00dd5c28da89","Type":"ContainerStarted","Data":"58c8d7940f4ea38984784fe104c3e1c729a65871f51032ed9223d0e69db21f3e"} Mar 18 15:56:01 crc kubenswrapper[4939]: I0318 15:56:01.290052 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8zqqn" event={"ID":"489dd963-1de2-4977-9d76-d9635ee4dc24","Type":"ContainerStarted","Data":"55336df594071074e86ef5c1b8ace23f05f8ca00f16440903c9f1ac361846d71"} Mar 18 15:56:01 crc kubenswrapper[4939]: I0318 15:56:01.842157 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-62xrf" Mar 18 15:56:01 crc kubenswrapper[4939]: I0318 15:56:01.881852 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-62xrf" Mar 18 15:56:02 crc kubenswrapper[4939]: I0318 15:56:02.299494 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8zqqn" event={"ID":"489dd963-1de2-4977-9d76-d9635ee4dc24","Type":"ContainerStarted","Data":"6c0358b9cc3e41dcf12f335fc25d33f18016832b8c3a3c4c40ffe7889b013fed"} Mar 18 15:56:02 crc 
Mar 18 15:56:02 crc kubenswrapper[4939]: I0318 15:56:02.333284 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564156-8zqqn" podStartSLOduration=1.156152893 podStartE2EDuration="2.333265534s" podCreationTimestamp="2026-03-18 15:56:00 +0000 UTC" firstStartedPulling="2026-03-18 15:56:00.74191552 +0000 UTC m=+1125.341103141" lastFinishedPulling="2026-03-18 15:56:01.919028161 +0000 UTC m=+1126.518215782" observedRunningTime="2026-03-18 15:56:02.332568794 +0000 UTC m=+1126.931756415" watchObservedRunningTime="2026-03-18 15:56:02.333265534 +0000 UTC m=+1126.932453155"
Mar 18 15:56:03 crc kubenswrapper[4939]: I0318 15:56:03.306054 4939 generic.go:334] "Generic (PLEG): container finished" podID="489dd963-1de2-4977-9d76-d9635ee4dc24" containerID="6c0358b9cc3e41dcf12f335fc25d33f18016832b8c3a3c4c40ffe7889b013fed" exitCode=0
Mar 18 15:56:03 crc kubenswrapper[4939]: I0318 15:56:03.306137 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8zqqn" event={"ID":"489dd963-1de2-4977-9d76-d9635ee4dc24","Type":"ContainerDied","Data":"6c0358b9cc3e41dcf12f335fc25d33f18016832b8c3a3c4c40ffe7889b013fed"}
Mar 18 15:56:04 crc kubenswrapper[4939]: I0318 15:56:04.631289 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8zqqn"
Mar 18 15:56:04 crc kubenswrapper[4939]: I0318 15:56:04.780251 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5klm\" (UniqueName: \"kubernetes.io/projected/489dd963-1de2-4977-9d76-d9635ee4dc24-kube-api-access-j5klm\") pod \"489dd963-1de2-4977-9d76-d9635ee4dc24\" (UID: \"489dd963-1de2-4977-9d76-d9635ee4dc24\") "
Mar 18 15:56:04 crc kubenswrapper[4939]: I0318 15:56:04.788641 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489dd963-1de2-4977-9d76-d9635ee4dc24-kube-api-access-j5klm" (OuterVolumeSpecName: "kube-api-access-j5klm") pod "489dd963-1de2-4977-9d76-d9635ee4dc24" (UID: "489dd963-1de2-4977-9d76-d9635ee4dc24"). InnerVolumeSpecName "kube-api-access-j5klm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:56:04 crc kubenswrapper[4939]: I0318 15:56:04.881601 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5klm\" (UniqueName: \"kubernetes.io/projected/489dd963-1de2-4977-9d76-d9635ee4dc24-kube-api-access-j5klm\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:05 crc kubenswrapper[4939]: I0318 15:56:05.326926 4939 generic.go:334] "Generic (PLEG): container finished" podID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerID="0170f1b18a9c7f374b16c23648cc0e6987ac0f18e8abfe0201d74ee0e3fde004" exitCode=0
Mar 18 15:56:05 crc kubenswrapper[4939]: I0318 15:56:05.327047 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" event={"ID":"0460d1a9-3710-482c-89c5-00dd5c28da89","Type":"ContainerDied","Data":"0170f1b18a9c7f374b16c23648cc0e6987ac0f18e8abfe0201d74ee0e3fde004"}
Mar 18 15:56:05 crc kubenswrapper[4939]: I0318 15:56:05.329334 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-8zqqn" event={"ID":"489dd963-1de2-4977-9d76-d9635ee4dc24","Type":"ContainerDied","Data":"55336df594071074e86ef5c1b8ace23f05f8ca00f16440903c9f1ac361846d71"}
Mar 18 15:56:05 crc kubenswrapper[4939]: I0318 15:56:05.329376 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55336df594071074e86ef5c1b8ace23f05f8ca00f16440903c9f1ac361846d71"
Mar 18 15:56:05 crc kubenswrapper[4939]: I0318 15:56:05.329411 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-8zqqn"
Mar 18 15:56:05 crc kubenswrapper[4939]: I0318 15:56:05.676088 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-7xdb7"]
Mar 18 15:56:05 crc kubenswrapper[4939]: I0318 15:56:05.681003 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-7xdb7"]
Mar 18 15:56:06 crc kubenswrapper[4939]: I0318 15:56:06.143626 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b" path="/var/lib/kubelet/pods/d34faa20-ad8a-4e51-bc6c-9cf8d3efdf0b/volumes"
Mar 18 15:56:06 crc kubenswrapper[4939]: I0318 15:56:06.342182 4939 generic.go:334] "Generic (PLEG): container finished" podID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerID="c68fdc6a01d28cc77f8d92aa711f093440c69c72a8e791274b2e69a3c9c4fefc" exitCode=0
Mar 18 15:56:06 crc kubenswrapper[4939]: I0318 15:56:06.342226 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" event={"ID":"0460d1a9-3710-482c-89c5-00dd5c28da89","Type":"ContainerDied","Data":"c68fdc6a01d28cc77f8d92aa711f093440c69c72a8e791274b2e69a3c9c4fefc"}
Mar 18 15:56:06 crc kubenswrapper[4939]: I0318 15:56:06.876298 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-mfplf"
Mar 18 15:56:06 crc kubenswrapper[4939]: I0318 15:56:06.945232 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-2mcg4"
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.821284 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-bundle\") pod \"0460d1a9-3710-482c-89c5-00dd5c28da89\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.821381 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-util\") pod \"0460d1a9-3710-482c-89c5-00dd5c28da89\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.821532 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzcd7\" (UniqueName: \"kubernetes.io/projected/0460d1a9-3710-482c-89c5-00dd5c28da89-kube-api-access-fzcd7\") pod \"0460d1a9-3710-482c-89c5-00dd5c28da89\" (UID: \"0460d1a9-3710-482c-89c5-00dd5c28da89\") " Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.826915 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-bundle" (OuterVolumeSpecName: "bundle") pod "0460d1a9-3710-482c-89c5-00dd5c28da89" (UID: "0460d1a9-3710-482c-89c5-00dd5c28da89"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.844746 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0460d1a9-3710-482c-89c5-00dd5c28da89-kube-api-access-fzcd7" (OuterVolumeSpecName: "kube-api-access-fzcd7") pod "0460d1a9-3710-482c-89c5-00dd5c28da89" (UID: "0460d1a9-3710-482c-89c5-00dd5c28da89"). InnerVolumeSpecName "kube-api-access-fzcd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.847062 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-util" (OuterVolumeSpecName: "util") pod "0460d1a9-3710-482c-89c5-00dd5c28da89" (UID: "0460d1a9-3710-482c-89c5-00dd5c28da89"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.923219 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzcd7\" (UniqueName: \"kubernetes.io/projected/0460d1a9-3710-482c-89c5-00dd5c28da89-kube-api-access-fzcd7\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.923251 4939 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:07 crc kubenswrapper[4939]: I0318 15:56:07.923260 4939 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0460d1a9-3710-482c-89c5-00dd5c28da89-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:08 crc kubenswrapper[4939]: I0318 15:56:08.359096 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" event={"ID":"0460d1a9-3710-482c-89c5-00dd5c28da89","Type":"ContainerDied","Data":"58c8d7940f4ea38984784fe104c3e1c729a65871f51032ed9223d0e69db21f3e"} Mar 18 15:56:08 crc kubenswrapper[4939]: I0318 15:56:08.359151 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58c8d7940f4ea38984784fe104c3e1c729a65871f51032ed9223d0e69db21f3e" Mar 18 15:56:08 crc kubenswrapper[4939]: I0318 15:56:08.359240 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp" Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.883532 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"] Mar 18 15:56:12 crc kubenswrapper[4939]: E0318 15:56:12.883994 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerName="pull" Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.884005 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerName="pull" Mar 18 15:56:12 crc kubenswrapper[4939]: E0318 15:56:12.884013 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerName="util" Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.884019 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerName="util" Mar 18 15:56:12 crc kubenswrapper[4939]: E0318 15:56:12.884031 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489dd963-1de2-4977-9d76-d9635ee4dc24" containerName="oc" Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.884037 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="489dd963-1de2-4977-9d76-d9635ee4dc24" containerName="oc" Mar 18 15:56:12 crc kubenswrapper[4939]: E0318 15:56:12.884052 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerName="extract" Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.884058 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerName="extract" Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.884149 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0460d1a9-3710-482c-89c5-00dd5c28da89" containerName="extract" Mar 
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.884168 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="489dd963-1de2-4977-9d76-d9635ee4dc24" containerName="oc"
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.884582 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.888177 4939 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-q572z"
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.888290 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.888988 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.898409 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0553c207-c9e6-4a10-9ae9-afc6ac09b74c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qb52f\" (UID: \"0553c207-c9e6-4a10-9ae9-afc6ac09b74c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.898532 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2lpg\" (UniqueName: \"kubernetes.io/projected/0553c207-c9e6-4a10-9ae9-afc6ac09b74c-kube-api-access-h2lpg\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qb52f\" (UID: \"0553c207-c9e6-4a10-9ae9-afc6ac09b74c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:12 crc kubenswrapper[4939]: I0318 15:56:12.899048 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"]
Mar 18 15:56:13 crc kubenswrapper[4939]: I0318 15:56:13.000034 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0553c207-c9e6-4a10-9ae9-afc6ac09b74c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qb52f\" (UID: \"0553c207-c9e6-4a10-9ae9-afc6ac09b74c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:13 crc kubenswrapper[4939]: I0318 15:56:13.000124 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2lpg\" (UniqueName: \"kubernetes.io/projected/0553c207-c9e6-4a10-9ae9-afc6ac09b74c-kube-api-access-h2lpg\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qb52f\" (UID: \"0553c207-c9e6-4a10-9ae9-afc6ac09b74c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:13 crc kubenswrapper[4939]: I0318 15:56:13.000671 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0553c207-c9e6-4a10-9ae9-afc6ac09b74c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qb52f\" (UID: \"0553c207-c9e6-4a10-9ae9-afc6ac09b74c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:13 crc kubenswrapper[4939]: I0318 15:56:13.020725 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2lpg\" (UniqueName: \"kubernetes.io/projected/0553c207-c9e6-4a10-9ae9-afc6ac09b74c-kube-api-access-h2lpg\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qb52f\" (UID: \"0553c207-c9e6-4a10-9ae9-afc6ac09b74c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:13 crc kubenswrapper[4939]: I0318 15:56:13.199822 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"
Mar 18 15:56:13 crc kubenswrapper[4939]: I0318 15:56:13.614770 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f"]
Mar 18 15:56:13 crc kubenswrapper[4939]: W0318 15:56:13.619232 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0553c207_c9e6_4a10_9ae9_afc6ac09b74c.slice/crio-1333de11749e8edbc1737a9862875cd067ccb8c5d95a0f678d274aeab9877520 WatchSource:0}: Error finding container 1333de11749e8edbc1737a9862875cd067ccb8c5d95a0f678d274aeab9877520: Status 404 returned error can't find the container with id 1333de11749e8edbc1737a9862875cd067ccb8c5d95a0f678d274aeab9877520
Mar 18 15:56:14 crc kubenswrapper[4939]: I0318 15:56:14.408090 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f" event={"ID":"0553c207-c9e6-4a10-9ae9-afc6ac09b74c","Type":"ContainerStarted","Data":"1333de11749e8edbc1737a9862875cd067ccb8c5d95a0f678d274aeab9877520"}
Mar 18 15:56:16 crc kubenswrapper[4939]: I0318 15:56:16.847685 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-62xrf"
Mar 18 15:56:17 crc kubenswrapper[4939]: I0318 15:56:17.429193 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f" event={"ID":"0553c207-c9e6-4a10-9ae9-afc6ac09b74c","Type":"ContainerStarted","Data":"d011b99cf0898cc9800c7afcb8ea1ee5cebd6a1d8def4a7578ce294939ac90f9"}
Mar 18 15:56:17 crc kubenswrapper[4939]: I0318 15:56:17.458365 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qb52f" podStartSLOduration=2.275176552 podStartE2EDuration="5.458341045s" podCreationTimestamp="2026-03-18 15:56:12 +0000 UTC" firstStartedPulling="2026-03-18 15:56:13.622173471 +0000 UTC m=+1138.221361092" lastFinishedPulling="2026-03-18 15:56:16.805337964 +0000 UTC m=+1141.404525585" observedRunningTime="2026-03-18 15:56:17.454165246 +0000 UTC m=+1142.053352877" watchObservedRunningTime="2026-03-18 15:56:17.458341045 +0000 UTC m=+1142.057528666"
Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.311212 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pbm2c"]
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.322025 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.322603 4939 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7cmnh" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.323731 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.333490 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pbm2c"] Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.407390 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5772h\" (UniqueName: \"kubernetes.io/projected/145983f6-1dc1-4d35-88a6-577db0b7bd2b-kube-api-access-5772h\") pod \"cert-manager-webhook-6888856db4-pbm2c\" (UID: \"145983f6-1dc1-4d35-88a6-577db0b7bd2b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.407523 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/145983f6-1dc1-4d35-88a6-577db0b7bd2b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pbm2c\" (UID: \"145983f6-1dc1-4d35-88a6-577db0b7bd2b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.508948 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5772h\" (UniqueName: \"kubernetes.io/projected/145983f6-1dc1-4d35-88a6-577db0b7bd2b-kube-api-access-5772h\") pod \"cert-manager-webhook-6888856db4-pbm2c\" (UID: \"145983f6-1dc1-4d35-88a6-577db0b7bd2b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.509007 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/145983f6-1dc1-4d35-88a6-577db0b7bd2b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pbm2c\" (UID: \"145983f6-1dc1-4d35-88a6-577db0b7bd2b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.529847 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/145983f6-1dc1-4d35-88a6-577db0b7bd2b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pbm2c\" (UID: \"145983f6-1dc1-4d35-88a6-577db0b7bd2b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.530015 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5772h\" (UniqueName: \"kubernetes.io/projected/145983f6-1dc1-4d35-88a6-577db0b7bd2b-kube-api-access-5772h\") pod \"cert-manager-webhook-6888856db4-pbm2c\" (UID: \"145983f6-1dc1-4d35-88a6-577db0b7bd2b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.638173 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:20 crc kubenswrapper[4939]: I0318 15:56:20.948782 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pbm2c"] Mar 18 15:56:20 crc kubenswrapper[4939]: W0318 15:56:20.959346 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod145983f6_1dc1_4d35_88a6_577db0b7bd2b.slice/crio-0ab63a49fa687c5da6b6960bf68effdddba7debebe2976acb4748298af194d68 WatchSource:0}: Error finding container 0ab63a49fa687c5da6b6960bf68effdddba7debebe2976acb4748298af194d68: Status 404 returned error can't find the container with id 0ab63a49fa687c5da6b6960bf68effdddba7debebe2976acb4748298af194d68 Mar 18 15:56:21 crc kubenswrapper[4939]: I0318 15:56:21.461186 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" event={"ID":"145983f6-1dc1-4d35-88a6-577db0b7bd2b","Type":"ContainerStarted","Data":"0ab63a49fa687c5da6b6960bf68effdddba7debebe2976acb4748298af194d68"} Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.668591 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-jwggk"] Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.669294 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.673731 4939 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wn2tf" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.680626 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-jwggk"] Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.741104 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42t5q\" (UniqueName: \"kubernetes.io/projected/02decf6c-b69f-4c7e-92ac-bcbe9aaaba30-kube-api-access-42t5q\") pod \"cert-manager-cainjector-5545bd876-jwggk\" (UID: \"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.741386 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02decf6c-b69f-4c7e-92ac-bcbe9aaaba30-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-jwggk\" (UID: \"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.842977 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42t5q\" (UniqueName: \"kubernetes.io/projected/02decf6c-b69f-4c7e-92ac-bcbe9aaaba30-kube-api-access-42t5q\") pod \"cert-manager-cainjector-5545bd876-jwggk\" (UID: \"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.843068 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02decf6c-b69f-4c7e-92ac-bcbe9aaaba30-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-jwggk\" (UID: \"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30\") " 
pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.862035 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02decf6c-b69f-4c7e-92ac-bcbe9aaaba30-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-jwggk\" (UID: \"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.862920 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42t5q\" (UniqueName: \"kubernetes.io/projected/02decf6c-b69f-4c7e-92ac-bcbe9aaaba30-kube-api-access-42t5q\") pod \"cert-manager-cainjector-5545bd876-jwggk\" (UID: \"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:22 crc kubenswrapper[4939]: I0318 15:56:22.989753 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" Mar 18 15:56:23 crc kubenswrapper[4939]: I0318 15:56:23.428949 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-jwggk"] Mar 18 15:56:23 crc kubenswrapper[4939]: W0318 15:56:23.436496 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02decf6c_b69f_4c7e_92ac_bcbe9aaaba30.slice/crio-780ed5f8d886ccc2e4bf6010a9349d13ff1c14ac5e005dec004ac0e6bef3503f WatchSource:0}: Error finding container 780ed5f8d886ccc2e4bf6010a9349d13ff1c14ac5e005dec004ac0e6bef3503f: Status 404 returned error can't find the container with id 780ed5f8d886ccc2e4bf6010a9349d13ff1c14ac5e005dec004ac0e6bef3503f Mar 18 15:56:23 crc kubenswrapper[4939]: I0318 15:56:23.476030 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" event={"ID":"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30","Type":"ContainerStarted","Data":"780ed5f8d886ccc2e4bf6010a9349d13ff1c14ac5e005dec004ac0e6bef3503f"} Mar 18 15:56:26 crc kubenswrapper[4939]: I0318 15:56:26.496643 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" event={"ID":"02decf6c-b69f-4c7e-92ac-bcbe9aaaba30","Type":"ContainerStarted","Data":"9549a528e0a8a3e8a199658f64fc4116c78eba04dbb252a80f0616d5b991f01f"} Mar 18 15:56:26 crc kubenswrapper[4939]: I0318 15:56:26.498772 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" event={"ID":"145983f6-1dc1-4d35-88a6-577db0b7bd2b","Type":"ContainerStarted","Data":"65bf4944e6c24ddd8c4cc09ecdbc4c267723a54efc6ce9a8742e5260d6e6ecde"} Mar 18 15:56:26 crc kubenswrapper[4939]: I0318 15:56:26.498959 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:26 crc kubenswrapper[4939]: I0318 15:56:26.511737 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-jwggk" podStartSLOduration=1.914517929 podStartE2EDuration="4.5117162s" podCreationTimestamp="2026-03-18 15:56:22 +0000 UTC" firstStartedPulling="2026-03-18 15:56:23.43933921 +0000 UTC m=+1148.038526831" lastFinishedPulling="2026-03-18 15:56:26.036537481 +0000 UTC m=+1150.635725102" observedRunningTime="2026-03-18 15:56:26.50998806 +0000 UTC m=+1151.109175691" 
watchObservedRunningTime="2026-03-18 15:56:26.5117162 +0000 UTC m=+1151.110903821" Mar 18 15:56:26 crc kubenswrapper[4939]: I0318 15:56:26.526779 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" podStartSLOduration=1.447025121 podStartE2EDuration="6.526763308s" podCreationTimestamp="2026-03-18 15:56:20 +0000 UTC" firstStartedPulling="2026-03-18 15:56:20.961707204 +0000 UTC m=+1145.560894825" lastFinishedPulling="2026-03-18 15:56:26.041445391 +0000 UTC m=+1150.640633012" observedRunningTime="2026-03-18 15:56:26.524799432 +0000 UTC m=+1151.123987073" watchObservedRunningTime="2026-03-18 15:56:26.526763308 +0000 UTC m=+1151.125950929" Mar 18 15:56:35 crc kubenswrapper[4939]: I0318 15:56:35.642524 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-pbm2c" Mar 18 15:56:37 crc kubenswrapper[4939]: I0318 15:56:37.920147 4939 scope.go:117] "RemoveContainer" containerID="8e6393f06ecf9a139081fb39cd6def0d532087077dae2ddd46636cfd01d83183" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.500765 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-6h7w7"] Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.502247 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.504817 4939 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-lg52r" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.508188 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-6h7w7"] Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.561741 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mttnp\" (UniqueName: \"kubernetes.io/projected/37df594e-01f6-4d64-9b1c-1f1b1d36aecf-kube-api-access-mttnp\") pod \"cert-manager-545d4d4674-6h7w7\" (UID: \"37df594e-01f6-4d64-9b1c-1f1b1d36aecf\") " pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.562058 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37df594e-01f6-4d64-9b1c-1f1b1d36aecf-bound-sa-token\") pod \"cert-manager-545d4d4674-6h7w7\" (UID: \"37df594e-01f6-4d64-9b1c-1f1b1d36aecf\") " pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.663877 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mttnp\" (UniqueName: \"kubernetes.io/projected/37df594e-01f6-4d64-9b1c-1f1b1d36aecf-kube-api-access-mttnp\") pod \"cert-manager-545d4d4674-6h7w7\" (UID: \"37df594e-01f6-4d64-9b1c-1f1b1d36aecf\") " pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.663999 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37df594e-01f6-4d64-9b1c-1f1b1d36aecf-bound-sa-token\") pod \"cert-manager-545d4d4674-6h7w7\" (UID: \"37df594e-01f6-4d64-9b1c-1f1b1d36aecf\") " pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.684335 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37df594e-01f6-4d64-9b1c-1f1b1d36aecf-bound-sa-token\") pod \"cert-manager-545d4d4674-6h7w7\" (UID: \"37df594e-01f6-4d64-9b1c-1f1b1d36aecf\") " pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.686454 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mttnp\" (UniqueName: \"kubernetes.io/projected/37df594e-01f6-4d64-9b1c-1f1b1d36aecf-kube-api-access-mttnp\") pod \"cert-manager-545d4d4674-6h7w7\" (UID: \"37df594e-01f6-4d64-9b1c-1f1b1d36aecf\") " pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:39 crc kubenswrapper[4939]: I0318 15:56:39.829401 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-6h7w7" Mar 18 15:56:40 crc kubenswrapper[4939]: I0318 15:56:40.247343 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-6h7w7"] Mar 18 15:56:40 crc kubenswrapper[4939]: W0318 15:56:40.250752 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37df594e_01f6_4d64_9b1c_1f1b1d36aecf.slice/crio-f0f6bf73d3fef7435dc756cc4b3ff92f50601240f84d6091a294c6c27c881e03 WatchSource:0}: Error finding container f0f6bf73d3fef7435dc756cc4b3ff92f50601240f84d6091a294c6c27c881e03: Status 404 returned error can't find the container with id f0f6bf73d3fef7435dc756cc4b3ff92f50601240f84d6091a294c6c27c881e03 Mar 18 15:56:40 crc kubenswrapper[4939]: I0318 15:56:40.594260 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-6h7w7" event={"ID":"37df594e-01f6-4d64-9b1c-1f1b1d36aecf","Type":"ContainerStarted","Data":"f68d5be1d24449981d78b3eb8009b575d4068d01fb4fa7e7ecbc5d8a2c76e918"} Mar 18 15:56:40 crc kubenswrapper[4939]: I0318 15:56:40.594618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-6h7w7" event={"ID":"37df594e-01f6-4d64-9b1c-1f1b1d36aecf","Type":"ContainerStarted","Data":"f0f6bf73d3fef7435dc756cc4b3ff92f50601240f84d6091a294c6c27c881e03"} Mar 18 15:56:40 crc kubenswrapper[4939]: I0318 15:56:40.614612 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-6h7w7" podStartSLOduration=1.6145882299999998 podStartE2EDuration="1.61458823s" podCreationTimestamp="2026-03-18 15:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:40.609527586 +0000 UTC m=+1165.208715207" watchObservedRunningTime="2026-03-18 15:56:40.61458823 +0000 UTC m=+1165.213775851" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.076698 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-q8xzw"] Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.080436 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-q8xzw" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.083778 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wrq94" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.088146 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-q8xzw"] Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.092147 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.101890 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.194752 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8cq\" (UniqueName: \"kubernetes.io/projected/9edade3c-f761-4e5b-a7b2-518afded0093-kube-api-access-7j8cq\") pod \"openstack-operator-index-q8xzw\" (UID: \"9edade3c-f761-4e5b-a7b2-518afded0093\") " pod="openstack-operators/openstack-operator-index-q8xzw" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.296178 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8cq\" (UniqueName: \"kubernetes.io/projected/9edade3c-f761-4e5b-a7b2-518afded0093-kube-api-access-7j8cq\") pod \"openstack-operator-index-q8xzw\" (UID: \"9edade3c-f761-4e5b-a7b2-518afded0093\") " pod="openstack-operators/openstack-operator-index-q8xzw" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.314977 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8cq\" (UniqueName: \"kubernetes.io/projected/9edade3c-f761-4e5b-a7b2-518afded0093-kube-api-access-7j8cq\") pod \"openstack-operator-index-q8xzw\" (UID: \"9edade3c-f761-4e5b-a7b2-518afded0093\") " pod="openstack-operators/openstack-operator-index-q8xzw" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.406074 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-q8xzw" Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.681536 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-q8xzw"] Mar 18 15:56:49 crc kubenswrapper[4939]: I0318 15:56:49.684794 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q8xzw" event={"ID":"9edade3c-f761-4e5b-a7b2-518afded0093","Type":"ContainerStarted","Data":"2682b4cbaac4f44ac65f787a0b244d4f739cd4f62651f6e840ec59254e3b467e"} Mar 18 15:56:52 crc kubenswrapper[4939]: I0318 15:56:52.434085 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-q8xzw"] Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.045625 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ng2rp"] Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.047878 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.062464 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ng2rp"] Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.152791 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfjr\" (UniqueName: \"kubernetes.io/projected/99136bd4-35be-493d-9913-3c569a8861c0-kube-api-access-9jfjr\") pod \"openstack-operator-index-ng2rp\" (UID: \"99136bd4-35be-493d-9913-3c569a8861c0\") " pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.254955 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfjr\" (UniqueName: \"kubernetes.io/projected/99136bd4-35be-493d-9913-3c569a8861c0-kube-api-access-9jfjr\") pod \"openstack-operator-index-ng2rp\" (UID: \"99136bd4-35be-493d-9913-3c569a8861c0\") " pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.291823 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jfjr\" (UniqueName: \"kubernetes.io/projected/99136bd4-35be-493d-9913-3c569a8861c0-kube-api-access-9jfjr\") pod \"openstack-operator-index-ng2rp\" (UID: \"99136bd4-35be-493d-9913-3c569a8861c0\") " pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.377849 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.722271 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q8xzw" event={"ID":"9edade3c-f761-4e5b-a7b2-518afded0093","Type":"ContainerStarted","Data":"c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684"} Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.722378 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-q8xzw" podUID="9edade3c-f761-4e5b-a7b2-518afded0093" containerName="registry-server" containerID="cri-o://c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684" gracePeriod=2 Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.740283 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-q8xzw" podStartSLOduration=1.747361822 podStartE2EDuration="4.740265438s" podCreationTimestamp="2026-03-18 15:56:49 +0000 UTC" firstStartedPulling="2026-03-18 15:56:49.674972871 +0000 UTC m=+1174.274160502" lastFinishedPulling="2026-03-18 15:56:52.667876497 +0000 UTC m=+1177.267064118" observedRunningTime="2026-03-18 15:56:53.736697236 +0000 UTC m=+1178.335884897" watchObservedRunningTime="2026-03-18 15:56:53.740265438 +0000 UTC m=+1178.339453069" Mar 18 15:56:53 crc kubenswrapper[4939]: I0318 15:56:53.860034 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ng2rp"] Mar 18 15:56:53 crc kubenswrapper[4939]: W0318 15:56:53.865761 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99136bd4_35be_493d_9913_3c569a8861c0.slice/crio-04160566d250cac3c0a3f8f702e090b895d496beeecfd979ee16e0ab5fb34987 
WatchSource:0}: Error finding container 04160566d250cac3c0a3f8f702e090b895d496beeecfd979ee16e0ab5fb34987: Status 404 returned error can't find the container with id 04160566d250cac3c0a3f8f702e090b895d496beeecfd979ee16e0ab5fb34987 Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.198038 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-q8xzw" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.371046 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j8cq\" (UniqueName: \"kubernetes.io/projected/9edade3c-f761-4e5b-a7b2-518afded0093-kube-api-access-7j8cq\") pod \"9edade3c-f761-4e5b-a7b2-518afded0093\" (UID: \"9edade3c-f761-4e5b-a7b2-518afded0093\") " Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.378577 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edade3c-f761-4e5b-a7b2-518afded0093-kube-api-access-7j8cq" (OuterVolumeSpecName: "kube-api-access-7j8cq") pod "9edade3c-f761-4e5b-a7b2-518afded0093" (UID: "9edade3c-f761-4e5b-a7b2-518afded0093"). InnerVolumeSpecName "kube-api-access-7j8cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.473140 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j8cq\" (UniqueName: \"kubernetes.io/projected/9edade3c-f761-4e5b-a7b2-518afded0093-kube-api-access-7j8cq\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.733852 4939 generic.go:334] "Generic (PLEG): container finished" podID="9edade3c-f761-4e5b-a7b2-518afded0093" containerID="c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684" exitCode=0 Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.733965 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-q8xzw" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.734046 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q8xzw" event={"ID":"9edade3c-f761-4e5b-a7b2-518afded0093","Type":"ContainerDied","Data":"c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684"} Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.734137 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q8xzw" event={"ID":"9edade3c-f761-4e5b-a7b2-518afded0093","Type":"ContainerDied","Data":"2682b4cbaac4f44ac65f787a0b244d4f739cd4f62651f6e840ec59254e3b467e"} Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.734173 4939 scope.go:117] "RemoveContainer" containerID="c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.742368 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ng2rp" event={"ID":"99136bd4-35be-493d-9913-3c569a8861c0","Type":"ContainerStarted","Data":"74cd7ef80f5798d2d25f2352a4181bbc06288da5b5ae98415880f7391843489d"} Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.742464 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ng2rp" event={"ID":"99136bd4-35be-493d-9913-3c569a8861c0","Type":"ContainerStarted","Data":"04160566d250cac3c0a3f8f702e090b895d496beeecfd979ee16e0ab5fb34987"} Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.768239 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ng2rp" podStartSLOduration=1.7142043359999999 podStartE2EDuration="1.768212543s" podCreationTimestamp="2026-03-18 15:56:53 +0000 UTC" firstStartedPulling="2026-03-18 15:56:53.87099954 +0000 UTC m=+1178.470187191" lastFinishedPulling="2026-03-18 15:56:53.925007767 +0000 UTC m=+1178.524195398" observedRunningTime="2026-03-18 15:56:54.762983344 +0000 UTC m=+1179.362170985" watchObservedRunningTime="2026-03-18 15:56:54.768212543 +0000 UTC m=+1179.367400174" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.778234 4939 scope.go:117] "RemoveContainer" containerID="c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684" Mar 18 15:56:54 crc kubenswrapper[4939]: E0318 15:56:54.779219 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684\": container with ID starting with c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684 not found: ID does not exist" containerID="c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.779274 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684"} err="failed to get container status \"c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684\": rpc error: code = NotFound desc = could not find container \"c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684\": container with ID starting with c77e2d7b1e6b1e5c38b4df311183f44d95a7fc47554d5ad18d438df03f25f684 not found: ID does not exist" Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.789571 4939 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-operators/openstack-operator-index-q8xzw"] Mar 18 15:56:54 crc kubenswrapper[4939]: I0318 15:56:54.795426 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-q8xzw"] Mar 18 15:56:56 crc kubenswrapper[4939]: I0318 15:56:56.148075 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9edade3c-f761-4e5b-a7b2-518afded0093" path="/var/lib/kubelet/pods/9edade3c-f761-4e5b-a7b2-518afded0093/volumes" Mar 18 15:57:03 crc kubenswrapper[4939]: I0318 15:57:03.378746 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:57:03 crc kubenswrapper[4939]: I0318 15:57:03.379402 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:57:03 crc kubenswrapper[4939]: I0318 15:57:03.415083 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:57:03 crc kubenswrapper[4939]: I0318 15:57:03.854013 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ng2rp" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.487939 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj"] Mar 18 15:57:05 crc kubenswrapper[4939]: E0318 15:57:05.488182 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edade3c-f761-4e5b-a7b2-518afded0093" containerName="registry-server" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.488194 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edade3c-f761-4e5b-a7b2-518afded0093" containerName="registry-server" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.488307 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edade3c-f761-4e5b-a7b2-518afded0093" containerName="registry-server" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.489101 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.491855 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zsdr9" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.507339 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj"] Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.640858 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-util\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.640961 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c98g7\" (UniqueName: \"kubernetes.io/projected/5422988b-65bc-4039-a7d4-6877eb1f0e65-kube-api-access-c98g7\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.641000 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-bundle\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.742645 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-util\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.743097 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c98g7\" (UniqueName: \"kubernetes.io/projected/5422988b-65bc-4039-a7d4-6877eb1f0e65-kube-api-access-c98g7\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.743436 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-util\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.743886 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-bundle\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.744166 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-bundle\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.782427 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c98g7\" (UniqueName: \"kubernetes.io/projected/5422988b-65bc-4039-a7d4-6877eb1f0e65-kube-api-access-c98g7\") pod \"2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:05 crc kubenswrapper[4939]: I0318 15:57:05.817712 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:06 crc kubenswrapper[4939]: I0318 15:57:06.061588 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj"] Mar 18 15:57:06 crc kubenswrapper[4939]: I0318 15:57:06.847906 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" event={"ID":"5422988b-65bc-4039-a7d4-6877eb1f0e65","Type":"ContainerDied","Data":"1d84be6758f9c157328e8ca2c4c5f93c1559c16bac9d0a130e44c10ab0dbaf55"} Mar 18 15:57:06 crc kubenswrapper[4939]: I0318 15:57:06.847818 4939 generic.go:334] "Generic (PLEG): container finished" podID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerID="1d84be6758f9c157328e8ca2c4c5f93c1559c16bac9d0a130e44c10ab0dbaf55" exitCode=0 Mar 18 15:57:06 crc kubenswrapper[4939]: I0318 15:57:06.848083 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" event={"ID":"5422988b-65bc-4039-a7d4-6877eb1f0e65","Type":"ContainerStarted","Data":"44a2ed4930fb4b2d9aea0a26af120f4b7523795dc48165eac0acf0a0fb272f15"} Mar 18 15:57:06 crc kubenswrapper[4939]: I0318 15:57:06.852369 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:57:07 crc kubenswrapper[4939]: I0318 15:57:07.859473 4939 generic.go:334] "Generic (PLEG): container finished" podID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerID="4f85b8196bc3443f6731c2579de9b716f8108f01600a72038c918ea97a0a6e2a" exitCode=0 Mar 18 15:57:07 crc kubenswrapper[4939]: I0318 15:57:07.859542 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" event={"ID":"5422988b-65bc-4039-a7d4-6877eb1f0e65","Type":"ContainerDied","Data":"4f85b8196bc3443f6731c2579de9b716f8108f01600a72038c918ea97a0a6e2a"} Mar 18 15:57:08 crc kubenswrapper[4939]: I0318 15:57:08.869293 4939 generic.go:334] "Generic (PLEG): container finished" 
podID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerID="53f153df79e595f478e33ce45118fbb4b0007888c2214c82a261bbe450efc61a" exitCode=0 Mar 18 15:57:08 crc kubenswrapper[4939]: I0318 15:57:08.869338 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" event={"ID":"5422988b-65bc-4039-a7d4-6877eb1f0e65","Type":"ContainerDied","Data":"53f153df79e595f478e33ce45118fbb4b0007888c2214c82a261bbe450efc61a"} Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.229015 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.310072 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-util\") pod \"5422988b-65bc-4039-a7d4-6877eb1f0e65\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.310136 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-bundle\") pod \"5422988b-65bc-4039-a7d4-6877eb1f0e65\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.310175 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c98g7\" (UniqueName: \"kubernetes.io/projected/5422988b-65bc-4039-a7d4-6877eb1f0e65-kube-api-access-c98g7\") pod \"5422988b-65bc-4039-a7d4-6877eb1f0e65\" (UID: \"5422988b-65bc-4039-a7d4-6877eb1f0e65\") " Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.311157 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-bundle" (OuterVolumeSpecName: "bundle") pod "5422988b-65bc-4039-a7d4-6877eb1f0e65" (UID: "5422988b-65bc-4039-a7d4-6877eb1f0e65"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.315683 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5422988b-65bc-4039-a7d4-6877eb1f0e65-kube-api-access-c98g7" (OuterVolumeSpecName: "kube-api-access-c98g7") pod "5422988b-65bc-4039-a7d4-6877eb1f0e65" (UID: "5422988b-65bc-4039-a7d4-6877eb1f0e65"). InnerVolumeSpecName "kube-api-access-c98g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.326443 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-util" (OuterVolumeSpecName: "util") pod "5422988b-65bc-4039-a7d4-6877eb1f0e65" (UID: "5422988b-65bc-4039-a7d4-6877eb1f0e65"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.412329 4939 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.412394 4939 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5422988b-65bc-4039-a7d4-6877eb1f0e65-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.412412 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c98g7\" (UniqueName: \"kubernetes.io/projected/5422988b-65bc-4039-a7d4-6877eb1f0e65-kube-api-access-c98g7\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.887492 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" event={"ID":"5422988b-65bc-4039-a7d4-6877eb1f0e65","Type":"ContainerDied","Data":"44a2ed4930fb4b2d9aea0a26af120f4b7523795dc48165eac0acf0a0fb272f15"} Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.887570 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a2ed4930fb4b2d9aea0a26af120f4b7523795dc48165eac0acf0a0fb272f15" Mar 18 15:57:10 crc kubenswrapper[4939]: I0318 15:57:10.887584 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.197606 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn"] Mar 18 15:57:18 crc kubenswrapper[4939]: E0318 15:57:18.198721 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerName="util" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.198745 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerName="util" Mar 18 15:57:18 crc kubenswrapper[4939]: E0318 15:57:18.198758 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerName="extract" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.198769 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerName="extract" Mar 18 15:57:18 crc kubenswrapper[4939]: E0318 15:57:18.198802 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerName="pull" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.198814 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerName="pull" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.198978 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5422988b-65bc-4039-a7d4-6877eb1f0e65" containerName="extract" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.199670 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.202842 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-9z57g" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.217776 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn"] Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.331146 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmtkp\" (UniqueName: \"kubernetes.io/projected/84d8418e-14e0-40b8-aae5-8bf2dbdce02b-kube-api-access-mmtkp\") pod \"openstack-operator-controller-init-7bc867c5bc-kdvmn\" (UID: \"84d8418e-14e0-40b8-aae5-8bf2dbdce02b\") " pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.432117 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmtkp\" (UniqueName: \"kubernetes.io/projected/84d8418e-14e0-40b8-aae5-8bf2dbdce02b-kube-api-access-mmtkp\") pod \"openstack-operator-controller-init-7bc867c5bc-kdvmn\" (UID: \"84d8418e-14e0-40b8-aae5-8bf2dbdce02b\") " pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.453150 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmtkp\" (UniqueName: \"kubernetes.io/projected/84d8418e-14e0-40b8-aae5-8bf2dbdce02b-kube-api-access-mmtkp\") pod \"openstack-operator-controller-init-7bc867c5bc-kdvmn\" (UID: \"84d8418e-14e0-40b8-aae5-8bf2dbdce02b\") " pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.517168 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.749185 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn"] Mar 18 15:57:18 crc kubenswrapper[4939]: W0318 15:57:18.755012 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d8418e_14e0_40b8_aae5_8bf2dbdce02b.slice/crio-d05f2c4ffd096105534975ecffa73db60f1fcc684260e114e93883b2f65540eb WatchSource:0}: Error finding container d05f2c4ffd096105534975ecffa73db60f1fcc684260e114e93883b2f65540eb: Status 404 returned error can't find the container with id d05f2c4ffd096105534975ecffa73db60f1fcc684260e114e93883b2f65540eb Mar 18 15:57:18 crc kubenswrapper[4939]: I0318 15:57:18.951868 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" event={"ID":"84d8418e-14e0-40b8-aae5-8bf2dbdce02b","Type":"ContainerStarted","Data":"d05f2c4ffd096105534975ecffa73db60f1fcc684260e114e93883b2f65540eb"} Mar 18 15:57:22 crc kubenswrapper[4939]: I0318 15:57:22.985975 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" event={"ID":"84d8418e-14e0-40b8-aae5-8bf2dbdce02b","Type":"ContainerStarted","Data":"d1bf7ad83c6d048e5e4065f394cacaaa414b3986458ce8d633f9ad9309dd4d50"} Mar 18 15:57:22 crc kubenswrapper[4939]: I0318 15:57:22.986614 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" Mar 18 15:57:23 crc kubenswrapper[4939]: I0318 15:57:23.027986 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" podStartSLOduration=1.349941888 podStartE2EDuration="5.02797042s" podCreationTimestamp="2026-03-18 15:57:18 +0000 UTC" firstStartedPulling="2026-03-18 15:57:18.756869174 +0000 UTC m=+1203.356056795" lastFinishedPulling="2026-03-18 15:57:22.434897706 +0000 UTC m=+1207.034085327" observedRunningTime="2026-03-18 15:57:23.023467392 +0000 UTC m=+1207.622655023" watchObservedRunningTime="2026-03-18 15:57:23.02797042 +0000 UTC m=+1207.627158041" Mar 18 15:57:23 crc kubenswrapper[4939]: I0318 15:57:23.687163 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:57:23 crc kubenswrapper[4939]: I0318 15:57:23.687420 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:57:28 crc kubenswrapper[4939]: I0318 15:57:28.520590 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7bc867c5bc-kdvmn" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.141121 4939 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.142373 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.142487 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.154845 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-klwsh" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.174011 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.177587 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ql8t8" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.185452 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.209830 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.227312 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.228539 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.235142 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-wzvmd" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.248678 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.249672 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.253793 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mjj7t" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.256757 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.257736 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.260185 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gkxt9" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.279629 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.280442 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxhrz\" (UniqueName: \"kubernetes.io/projected/9dd46745-f668-4ec4-a61e-7ecf0f91bd8a-kube-api-access-mxhrz\") pod \"barbican-operator-controller-manager-5cfd84c587-9mhmj\" (UID: \"9dd46745-f668-4ec4-a61e-7ecf0f91bd8a\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.280631 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wp5m\" (UniqueName: \"kubernetes.io/projected/c2cd123f-732e-4e44-b5f0-f79d15ea87da-kube-api-access-4wp5m\") pod \"cinder-operator-controller-manager-6d77645966-wvfdv\" (UID: \"c2cd123f-732e-4e44-b5f0-f79d15ea87da\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.317061 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.319695 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.333749 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.334708 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.338090 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.338949 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.339685 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-pkp2t" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.342371 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.343183 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.343591 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wsv5b" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.347932 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.348660 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.353337 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-88n7g" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.362043 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.373709 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.384533 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wp5m\" (UniqueName: \"kubernetes.io/projected/c2cd123f-732e-4e44-b5f0-f79d15ea87da-kube-api-access-4wp5m\") pod \"cinder-operator-controller-manager-6d77645966-wvfdv\" (UID: \"c2cd123f-732e-4e44-b5f0-f79d15ea87da\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.384628 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqsd\" (UniqueName: \"kubernetes.io/projected/dae9e528-b32e-4cf9-a032-2a3bc38320f6-kube-api-access-knqsd\") pod \"designate-operator-controller-manager-6cc65c69fc-pfmz6\" (UID: \"dae9e528-b32e-4cf9-a032-2a3bc38320f6\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.384675 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdm6j\" (UniqueName: \"kubernetes.io/projected/ffe00762-ce07-4018-b8d6-290de61644d0-kube-api-access-hdm6j\") pod \"glance-operator-controller-manager-7d559dcdbd-xlmmb\" (UID: \"ffe00762-ce07-4018-b8d6-290de61644d0\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.384721 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxhrz\" (UniqueName: 
\"kubernetes.io/projected/9dd46745-f668-4ec4-a61e-7ecf0f91bd8a-kube-api-access-mxhrz\") pod \"barbican-operator-controller-manager-5cfd84c587-9mhmj\" (UID: \"9dd46745-f668-4ec4-a61e-7ecf0f91bd8a\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.384761 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctq9\" (UniqueName: \"kubernetes.io/projected/13e35677-42a2-48c1-86fc-f022708ac217-kube-api-access-lctq9\") pod \"heat-operator-controller-manager-66dd9d474d-2w7ms\" (UID: \"13e35677-42a2-48c1-86fc-f022708ac217\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.389096 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.390010 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.393943 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.394919 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.396881 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-8svps" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.397054 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-56wqf" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.410074 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.417664 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wp5m\" (UniqueName: \"kubernetes.io/projected/c2cd123f-732e-4e44-b5f0-f79d15ea87da-kube-api-access-4wp5m\") pod \"cinder-operator-controller-manager-6d77645966-wvfdv\" (UID: \"c2cd123f-732e-4e44-b5f0-f79d15ea87da\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.428704 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.429456 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.431128 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jg2nd" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.435935 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxhrz\" (UniqueName: \"kubernetes.io/projected/9dd46745-f668-4ec4-a61e-7ecf0f91bd8a-kube-api-access-mxhrz\") pod \"barbican-operator-controller-manager-5cfd84c587-9mhmj\" (UID: \"9dd46745-f668-4ec4-a61e-7ecf0f91bd8a\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.450236 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.461812 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.463011 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.469489 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-m548c" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.470978 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.473952 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.484628 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485736 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2g2b\" (UniqueName: \"kubernetes.io/projected/54f258e9-7e39-422a-b848-2cbfc2633529-kube-api-access-r2g2b\") pod \"keystone-operator-controller-manager-76b87776c9-992ww\" (UID: \"54f258e9-7e39-422a-b848-2cbfc2633529\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485773 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485796 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lctq9\" (UniqueName: \"kubernetes.io/projected/13e35677-42a2-48c1-86fc-f022708ac217-kube-api-access-lctq9\") pod \"heat-operator-controller-manager-66dd9d474d-2w7ms\" (UID: \"13e35677-42a2-48c1-86fc-f022708ac217\") " 
pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485819 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvtb8\" (UniqueName: \"kubernetes.io/projected/d981a7ad-8442-4742-a3be-ca3667fe0f9f-kube-api-access-gvtb8\") pod \"ironic-operator-controller-manager-6b77b7676d-78b9l\" (UID: \"d981a7ad-8442-4742-a3be-ca3667fe0f9f\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485843 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttm9w\" (UniqueName: \"kubernetes.io/projected/fe507f50-8673-4c7e-84e7-b649c6d61115-kube-api-access-ttm9w\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485885 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqsd\" (UniqueName: \"kubernetes.io/projected/dae9e528-b32e-4cf9-a032-2a3bc38320f6-kube-api-access-knqsd\") pod \"designate-operator-controller-manager-6cc65c69fc-pfmz6\" (UID: \"dae9e528-b32e-4cf9-a032-2a3bc38320f6\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485905 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd5nq\" (UniqueName: \"kubernetes.io/projected/e4983266-0ae5-493d-86a4-4f6b010e95d8-kube-api-access-fd5nq\") pod \"manila-operator-controller-manager-fbf7bbb96-bpt6p\" (UID: \"e4983266-0ae5-493d-86a4-4f6b010e95d8\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485932 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdm6j\" (UniqueName: \"kubernetes.io/projected/ffe00762-ce07-4018-b8d6-290de61644d0-kube-api-access-hdm6j\") pod \"glance-operator-controller-manager-7d559dcdbd-xlmmb\" (UID: \"ffe00762-ce07-4018-b8d6-290de61644d0\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.485952 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4thgx\" (UniqueName: \"kubernetes.io/projected/e7737b63-d9f2-45e5-a1f5-b1300aff04a8-kube-api-access-4thgx\") pod \"horizon-operator-controller-manager-64dc66d669-djxqp\" (UID: \"e7737b63-d9f2-45e5-a1f5-b1300aff04a8\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.486044 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.493427 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4cjwq" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.496212 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.499761 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.505887 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqsd\" (UniqueName: \"kubernetes.io/projected/dae9e528-b32e-4cf9-a032-2a3bc38320f6-kube-api-access-knqsd\") pod \"designate-operator-controller-manager-6cc65c69fc-pfmz6\" (UID: \"dae9e528-b32e-4cf9-a032-2a3bc38320f6\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.506349 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdm6j\" (UniqueName: \"kubernetes.io/projected/ffe00762-ce07-4018-b8d6-290de61644d0-kube-api-access-hdm6j\") pod \"glance-operator-controller-manager-7d559dcdbd-xlmmb\" (UID: \"ffe00762-ce07-4018-b8d6-290de61644d0\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.509116 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lctq9\" (UniqueName: \"kubernetes.io/projected/13e35677-42a2-48c1-86fc-f022708ac217-kube-api-access-lctq9\") pod \"heat-operator-controller-manager-66dd9d474d-2w7ms\" (UID: \"13e35677-42a2-48c1-86fc-f022708ac217\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.521632 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.522871 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.527051 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-pc4br" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.538144 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.550136 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.565463 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.567576 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.572196 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.573074 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.575994 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.576381 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-s5bz6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587182 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4thgx\" (UniqueName: \"kubernetes.io/projected/e7737b63-d9f2-45e5-a1f5-b1300aff04a8-kube-api-access-4thgx\") pod \"horizon-operator-controller-manager-64dc66d669-djxqp\" (UID: \"e7737b63-d9f2-45e5-a1f5-b1300aff04a8\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587279 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8j9\" (UniqueName: \"kubernetes.io/projected/23fb02fd-5d5a-488a-b3a7-82c69eb91e32-kube-api-access-7k8j9\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-xjg7j\" (UID: \"23fb02fd-5d5a-488a-b3a7-82c69eb91e32\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587333 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2g2b\" (UniqueName: \"kubernetes.io/projected/54f258e9-7e39-422a-b848-2cbfc2633529-kube-api-access-r2g2b\") pod \"keystone-operator-controller-manager-76b87776c9-992ww\" (UID: \"54f258e9-7e39-422a-b848-2cbfc2633529\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587357 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mzk\" (UniqueName: \"kubernetes.io/projected/3f4b5f2d-578c-42f1-a283-960b728a9ef7-kube-api-access-r6mzk\") pod \"nova-operator-controller-manager-bc5c78db9-wmt6q\" (UID: \"3f4b5f2d-578c-42f1-a283-960b728a9ef7\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587613 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587705 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvtb8\" (UniqueName: \"kubernetes.io/projected/d981a7ad-8442-4742-a3be-ca3667fe0f9f-kube-api-access-gvtb8\") pod 
\"ironic-operator-controller-manager-6b77b7676d-78b9l\" (UID: \"d981a7ad-8442-4742-a3be-ca3667fe0f9f\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587737 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttm9w\" (UniqueName: \"kubernetes.io/projected/fe507f50-8673-4c7e-84e7-b649c6d61115-kube-api-access-ttm9w\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.587803 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd5nq\" (UniqueName: \"kubernetes.io/projected/e4983266-0ae5-493d-86a4-4f6b010e95d8-kube-api-access-fd5nq\") pod \"manila-operator-controller-manager-fbf7bbb96-bpt6p\" (UID: \"e4983266-0ae5-493d-86a4-4f6b010e95d8\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.590444 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.599156 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9qs\" (UniqueName: \"kubernetes.io/projected/089530fd-7678-4f52-9e8c-8af0f54a9d15-kube-api-access-sc9qs\") pod \"neutron-operator-controller-manager-6744dd545c-6xvbd\" (UID: \"089530fd-7678-4f52-9e8c-8af0f54a9d15\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" Mar 18 15:57:48 crc kubenswrapper[4939]: E0318 15:57:48.599367 4939 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:48 crc kubenswrapper[4939]: E0318 15:57:48.599774 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert podName:fe507f50-8673-4c7e-84e7-b649c6d61115 nodeName:}" failed. No retries permitted until 2026-03-18 15:57:49.099757402 +0000 UTC m=+1233.698945023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert") pod "infra-operator-controller-manager-5595c7d6ff-hwlg6" (UID: "fe507f50-8673-4c7e-84e7-b649c6d61115") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.603429 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.605556 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2g2b\" (UniqueName: \"kubernetes.io/projected/54f258e9-7e39-422a-b848-2cbfc2633529-kube-api-access-r2g2b\") pod \"keystone-operator-controller-manager-76b87776c9-992ww\" (UID: \"54f258e9-7e39-422a-b848-2cbfc2633529\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.605654 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.611193 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4thgx\" (UniqueName: \"kubernetes.io/projected/e7737b63-d9f2-45e5-a1f5-b1300aff04a8-kube-api-access-4thgx\") pod \"horizon-operator-controller-manager-64dc66d669-djxqp\" (UID: \"e7737b63-d9f2-45e5-a1f5-b1300aff04a8\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.612450 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-xxt4m" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.654493 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.662231 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd5nq\" (UniqueName: \"kubernetes.io/projected/e4983266-0ae5-493d-86a4-4f6b010e95d8-kube-api-access-fd5nq\") pod \"manila-operator-controller-manager-fbf7bbb96-bpt6p\" (UID: \"e4983266-0ae5-493d-86a4-4f6b010e95d8\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.662300 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttm9w\" (UniqueName: \"kubernetes.io/projected/fe507f50-8673-4c7e-84e7-b649c6d61115-kube-api-access-ttm9w\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.662857 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.666282 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvtb8\" (UniqueName: \"kubernetes.io/projected/d981a7ad-8442-4742-a3be-ca3667fe0f9f-kube-api-access-gvtb8\") pod \"ironic-operator-controller-manager-6b77b7676d-78b9l\" (UID: \"d981a7ad-8442-4742-a3be-ca3667fe0f9f\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.670325 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.672027 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.678078 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-598hk" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.697834 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.699085 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.699962 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8j9\" (UniqueName: \"kubernetes.io/projected/23fb02fd-5d5a-488a-b3a7-82c69eb91e32-kube-api-access-7k8j9\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-xjg7j\" (UID: \"23fb02fd-5d5a-488a-b3a7-82c69eb91e32\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.699993 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnd5t\" (UniqueName: \"kubernetes.io/projected/5f476e97-8e28-48d0-9b5e-79c31bb8e9fd-kube-api-access-vnd5t\") pod \"octavia-operator-controller-manager-56f74467c6-hzsbq\" (UID: \"5f476e97-8e28-48d0-9b5e-79c31bb8e9fd\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.700022 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mzk\" (UniqueName: \"kubernetes.io/projected/3f4b5f2d-578c-42f1-a283-960b728a9ef7-kube-api-access-r6mzk\") pod \"nova-operator-controller-manager-bc5c78db9-wmt6q\" (UID: \"3f4b5f2d-578c-42f1-a283-960b728a9ef7\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.700045 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.700060 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhndm\" (UniqueName: \"kubernetes.io/projected/573a3014-37a6-461d-a211-2fba3e06a72d-kube-api-access-jhndm\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.700129 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsn9t\" (UniqueName: \"kubernetes.io/projected/0bb8519c-c720-4853-a031-caedb2059b3d-kube-api-access-zsn9t\") pod \"ovn-operator-controller-manager-846c4cdcb7-4xhw6\" (UID: \"0bb8519c-c720-4853-a031-caedb2059b3d\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.700153 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9qs\" (UniqueName: \"kubernetes.io/projected/089530fd-7678-4f52-9e8c-8af0f54a9d15-kube-api-access-sc9qs\") pod \"neutron-operator-controller-manager-6744dd545c-6xvbd\" (UID: \"089530fd-7678-4f52-9e8c-8af0f54a9d15\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.717073 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.720089 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.728148 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mzk\" (UniqueName: \"kubernetes.io/projected/3f4b5f2d-578c-42f1-a283-960b728a9ef7-kube-api-access-r6mzk\") pod \"nova-operator-controller-manager-bc5c78db9-wmt6q\" (UID: \"3f4b5f2d-578c-42f1-a283-960b728a9ef7\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.728626 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8j9\" (UniqueName: \"kubernetes.io/projected/23fb02fd-5d5a-488a-b3a7-82c69eb91e32-kube-api-access-7k8j9\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-xjg7j\" (UID: \"23fb02fd-5d5a-488a-b3a7-82c69eb91e32\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.735457 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9qs\" (UniqueName: \"kubernetes.io/projected/089530fd-7678-4f52-9e8c-8af0f54a9d15-kube-api-access-sc9qs\") pod \"neutron-operator-controller-manager-6744dd545c-6xvbd\" (UID: \"089530fd-7678-4f52-9e8c-8af0f54a9d15\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.740068 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.741677 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.746737 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-sgfq2" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.756577 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.774339 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.785249 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.801604 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnd5t\" (UniqueName: \"kubernetes.io/projected/5f476e97-8e28-48d0-9b5e-79c31bb8e9fd-kube-api-access-vnd5t\") pod \"octavia-operator-controller-manager-56f74467c6-hzsbq\" (UID: \"5f476e97-8e28-48d0-9b5e-79c31bb8e9fd\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.801654 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.801674 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhndm\" (UniqueName: \"kubernetes.io/projected/573a3014-37a6-461d-a211-2fba3e06a72d-kube-api-access-jhndm\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.801743 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjcx\" (UniqueName: \"kubernetes.io/projected/41644021-455b-493c-931f-9bc7fc0b8f8e-kube-api-access-8rjcx\") pod \"placement-operator-controller-manager-659fb58c6b-z2bvc\" (UID: \"41644021-455b-493c-931f-9bc7fc0b8f8e\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.801771 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsn9t\" (UniqueName: \"kubernetes.io/projected/0bb8519c-c720-4853-a031-caedb2059b3d-kube-api-access-zsn9t\") pod \"ovn-operator-controller-manager-846c4cdcb7-4xhw6\" (UID: \"0bb8519c-c720-4853-a031-caedb2059b3d\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" Mar 18 15:57:48 crc kubenswrapper[4939]: E0318 15:57:48.803737 4939 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:48 crc kubenswrapper[4939]: E0318 15:57:48.803821 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert podName:573a3014-37a6-461d-a211-2fba3e06a72d nodeName:}" failed. No retries permitted until 2026-03-18 15:57:49.3038031 +0000 UTC m=+1233.902990711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" (UID: "573a3014-37a6-461d-a211-2fba3e06a72d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.805611 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.810849 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.815120 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mv9m7" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.822762 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.827344 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnd5t\" (UniqueName: \"kubernetes.io/projected/5f476e97-8e28-48d0-9b5e-79c31bb8e9fd-kube-api-access-vnd5t\") pod \"octavia-operator-controller-manager-56f74467c6-hzsbq\" (UID: \"5f476e97-8e28-48d0-9b5e-79c31bb8e9fd\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.830328 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsn9t\" (UniqueName: \"kubernetes.io/projected/0bb8519c-c720-4853-a031-caedb2059b3d-kube-api-access-zsn9t\") pod \"ovn-operator-controller-manager-846c4cdcb7-4xhw6\" (UID: \"0bb8519c-c720-4853-a031-caedb2059b3d\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.832634 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhndm\" (UniqueName: \"kubernetes.io/projected/573a3014-37a6-461d-a211-2fba3e06a72d-kube-api-access-jhndm\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.864558 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.887633 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.895014 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.905159 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7f8zp" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.906594 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42x8f\" (UniqueName: \"kubernetes.io/projected/6cda5777-cac2-403f-bb77-44e76f6a0806-kube-api-access-42x8f\") pod \"telemetry-operator-controller-manager-6d84559f47-z9fgs\" (UID: \"6cda5777-cac2-403f-bb77-44e76f6a0806\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.906639 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjcx\" (UniqueName: \"kubernetes.io/projected/41644021-455b-493c-931f-9bc7fc0b8f8e-kube-api-access-8rjcx\") pod \"placement-operator-controller-manager-659fb58c6b-z2bvc\" (UID: \"41644021-455b-493c-931f-9bc7fc0b8f8e\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.906684 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2bq8\" (UniqueName: \"kubernetes.io/projected/62f8136e-6a5a-49ac-9c21-621384d4102f-kube-api-access-s2bq8\") pod \"swift-operator-controller-manager-867f54bc44-lmrlz\" (UID: \"62f8136e-6a5a-49ac-9c21-621384d4102f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.908640 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.929906 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.934423 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjcx\" (UniqueName: \"kubernetes.io/projected/41644021-455b-493c-931f-9bc7fc0b8f8e-kube-api-access-8rjcx\") pod \"placement-operator-controller-manager-659fb58c6b-z2bvc\" (UID: \"41644021-455b-493c-931f-9bc7fc0b8f8e\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.964343 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.965442 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.965748 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.968545 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5fjdb" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.974868 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.980143 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.988833 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt"] Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.989976 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.992747 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-g9gwk" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.992988 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.995295 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 15:57:48 crc kubenswrapper[4939]: I0318 15:57:48.996010 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.005919 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.007147 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.008067 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42x8f\" (UniqueName: \"kubernetes.io/projected/6cda5777-cac2-403f-bb77-44e76f6a0806-kube-api-access-42x8f\") pod \"telemetry-operator-controller-manager-6d84559f47-z9fgs\" (UID: \"6cda5777-cac2-403f-bb77-44e76f6a0806\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.008117 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2bq8\" (UniqueName: \"kubernetes.io/projected/62f8136e-6a5a-49ac-9c21-621384d4102f-kube-api-access-s2bq8\") pod \"swift-operator-controller-manager-867f54bc44-lmrlz\" (UID: \"62f8136e-6a5a-49ac-9c21-621384d4102f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.008176 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlh9\" (UniqueName: \"kubernetes.io/projected/c455d0aa-0b25-4684-997f-b7d2156c2ac8-kube-api-access-cvlh9\") pod \"test-operator-controller-manager-8467ccb4c8-5xlqd\" (UID: \"c455d0aa-0b25-4684-997f-b7d2156c2ac8\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.009452 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.010359 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-cqmxz" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.013904 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.023967 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42x8f\" (UniqueName: \"kubernetes.io/projected/6cda5777-cac2-403f-bb77-44e76f6a0806-kube-api-access-42x8f\") pod \"telemetry-operator-controller-manager-6d84559f47-z9fgs\" (UID: \"6cda5777-cac2-403f-bb77-44e76f6a0806\") " pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.030630 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.032963 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2bq8\" (UniqueName: \"kubernetes.io/projected/62f8136e-6a5a-49ac-9c21-621384d4102f-kube-api-access-s2bq8\") pod \"swift-operator-controller-manager-867f54bc44-lmrlz\" (UID: \"62f8136e-6a5a-49ac-9c21-621384d4102f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.034647 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj"] Mar 18 15:57:49 crc kubenswrapper[4939]: W0318 15:57:49.037245 4939 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd46745_f668_4ec4_a61e_7ecf0f91bd8a.slice/crio-c0ae693140301f8b12fc8395ad5534e830dbd9823f1956c84c8e83e94163f759 WatchSource:0}: Error finding container c0ae693140301f8b12fc8395ad5534e830dbd9823f1956c84c8e83e94163f759: Status 404 returned error can't find the container with id c0ae693140301f8b12fc8395ad5534e830dbd9823f1956c84c8e83e94163f759 Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.085692 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.109345 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28kw\" (UniqueName: \"kubernetes.io/projected/c5ffe79b-c31d-4a92-8006-aed78a9e1b16-kube-api-access-h28kw\") pod \"watcher-operator-controller-manager-74d6f7b5c-xd6pq\" (UID: \"c5ffe79b-c31d-4a92-8006-aed78a9e1b16\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.109409 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlh9\" (UniqueName: \"kubernetes.io/projected/c455d0aa-0b25-4684-997f-b7d2156c2ac8-kube-api-access-cvlh9\") pod \"test-operator-controller-manager-8467ccb4c8-5xlqd\" (UID: \"c455d0aa-0b25-4684-997f-b7d2156c2ac8\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.109432 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.109475 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.109515 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84skv\" (UniqueName: \"kubernetes.io/projected/5718381a-312a-4dab-b896-c865eaa2232b-kube-api-access-84skv\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.109536 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.109571 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-c9g79\" (UniqueName: \"kubernetes.io/projected/64de914e-d3d3-4b6e-b14e-6d02d2891539-kube-api-access-c9g79\") pod \"rabbitmq-cluster-operator-manager-668c99d594-52ml2\" (UID: \"64de914e-d3d3-4b6e-b14e-6d02d2891539\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.110424 4939 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.110531 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert podName:fe507f50-8673-4c7e-84e7-b649c6d61115 nodeName:}" failed. No retries permitted until 2026-03-18 15:57:50.110512908 +0000 UTC m=+1234.709700519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert") pod "infra-operator-controller-manager-5595c7d6ff-hwlg6" (UID: "fe507f50-8673-4c7e-84e7-b649c6d61115") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.127553 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvlh9\" (UniqueName: \"kubernetes.io/projected/c455d0aa-0b25-4684-997f-b7d2156c2ac8-kube-api-access-cvlh9\") pod \"test-operator-controller-manager-8467ccb4c8-5xlqd\" (UID: \"c455d0aa-0b25-4684-997f-b7d2156c2ac8\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.142556 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.200681 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" event={"ID":"c2cd123f-732e-4e44-b5f0-f79d15ea87da","Type":"ContainerStarted","Data":"e5509ed6156b33d41070c0acfb632d3ab3cbeb128a51a6c23d46c43fb91e0829"} Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.201936 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" event={"ID":"9dd46745-f668-4ec4-a61e-7ecf0f91bd8a","Type":"ContainerStarted","Data":"c0ae693140301f8b12fc8395ad5534e830dbd9823f1956c84c8e83e94163f759"} Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.211196 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84skv\" (UniqueName: \"kubernetes.io/projected/5718381a-312a-4dab-b896-c865eaa2232b-kube-api-access-84skv\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.211558 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.211653 4939 
secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.211692 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:49.711676247 +0000 UTC m=+1234.310863868 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "metrics-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.211878 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9g79\" (UniqueName: \"kubernetes.io/projected/64de914e-d3d3-4b6e-b14e-6d02d2891539-kube-api-access-c9g79\") pod \"rabbitmq-cluster-operator-manager-668c99d594-52ml2\" (UID: \"64de914e-d3d3-4b6e-b14e-6d02d2891539\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.212089 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h28kw\" (UniqueName: \"kubernetes.io/projected/c5ffe79b-c31d-4a92-8006-aed78a9e1b16-kube-api-access-h28kw\") pod \"watcher-operator-controller-manager-74d6f7b5c-xd6pq\" (UID: \"c5ffe79b-c31d-4a92-8006-aed78a9e1b16\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.212365 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.212973 4939 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.213013 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:49.713000985 +0000 UTC m=+1234.312188606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.229699 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84skv\" (UniqueName: \"kubernetes.io/projected/5718381a-312a-4dab-b896-c865eaa2232b-kube-api-access-84skv\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.237394 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.237874 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28kw\" (UniqueName: \"kubernetes.io/projected/c5ffe79b-c31d-4a92-8006-aed78a9e1b16-kube-api-access-h28kw\") pod \"watcher-operator-controller-manager-74d6f7b5c-xd6pq\" (UID: \"c5ffe79b-c31d-4a92-8006-aed78a9e1b16\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.250095 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.254549 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9g79\" (UniqueName: \"kubernetes.io/projected/64de914e-d3d3-4b6e-b14e-6d02d2891539-kube-api-access-c9g79\") pod \"rabbitmq-cluster-operator-manager-668c99d594-52ml2\" (UID: \"64de914e-d3d3-4b6e-b14e-6d02d2891539\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.259081 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.315092 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.315314 4939 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.315359 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert podName:573a3014-37a6-461d-a211-2fba3e06a72d nodeName:}" failed. No retries permitted until 2026-03-18 15:57:50.315344017 +0000 UTC m=+1234.914531638 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" (UID: "573a3014-37a6-461d-a211-2fba3e06a72d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.316934 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.350890 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.397973 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.402469 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.408239 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms"] Mar 18 15:57:49 crc kubenswrapper[4939]: W0318 15:57:49.417409 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd981a7ad_8442_4742_a3be_ca3667fe0f9f.slice/crio-3da983d63b38b4f74e49bc33d8ea5983e67cb7d88c78f4881f17247922b05c75 WatchSource:0}: Error finding container 3da983d63b38b4f74e49bc33d8ea5983e67cb7d88c78f4881f17247922b05c75: Status 404 returned error can't find the container with id 3da983d63b38b4f74e49bc33d8ea5983e67cb7d88c78f4881f17247922b05c75 Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.593051 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.663465 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.682902 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.704031 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.717066 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd"] Mar 18 15:57:49 crc kubenswrapper[4939]: W0318 15:57:49.719660 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4b5f2d_578c_42f1_a283_960b728a9ef7.slice/crio-0338d38cac314c862fd5abd0c8375f1827b469a2146ec509b939b7d61e3f43f4 WatchSource:0}: Error finding container 0338d38cac314c862fd5abd0c8375f1827b469a2146ec509b939b7d61e3f43f4: Status 404 returned error can't find the container with id 0338d38cac314c862fd5abd0c8375f1827b469a2146ec509b939b7d61e3f43f4 Mar 18 15:57:49 crc kubenswrapper[4939]: W0318 15:57:49.719917 4939 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod089530fd_7678_4f52_9e8c_8af0f54a9d15.slice/crio-3ae89da85f25ed54d2005331bde5f57366e876f58cbb690003aae9e105fce34f WatchSource:0}: Error finding container 3ae89da85f25ed54d2005331bde5f57366e876f58cbb690003aae9e105fce34f: Status 404 returned error can't find the container with id 3ae89da85f25ed54d2005331bde5f57366e876f58cbb690003aae9e105fce34f Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.720874 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.720912 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.721033 4939 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.721041 4939 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.721094 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:50.721075685 +0000 UTC m=+1235.320263306 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "webhook-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.721111 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:50.721103446 +0000 UTC m=+1235.320291067 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "metrics-server-cert" not found Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.724340 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq"] Mar 18 15:57:49 crc kubenswrapper[4939]: W0318 15:57:49.815843 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bb8519c_c720_4853_a031_caedb2059b3d.slice/crio-4ffb6d1b2ba77da238fef345d02c6e5f8de1453b4c3915c6965a0988f4e36f5a WatchSource:0}: Error finding container 4ffb6d1b2ba77da238fef345d02c6e5f8de1453b4c3915c6965a0988f4e36f5a: Status 404 returned error can't find the container with id 4ffb6d1b2ba77da238fef345d02c6e5f8de1453b4c3915c6965a0988f4e36f5a Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.816388 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6"] Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.819469 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zsn9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-846c4cdcb7-4xhw6_openstack-operators(0bb8519c-c720-4853-a031-caedb2059b3d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.822027 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" podUID="0bb8519c-c720-4853-a031-caedb2059b3d" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.825050 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc"] Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.831447 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8rjcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-659fb58c6b-z2bvc_openstack-operators(41644021-455b-493c-931f-9bc7fc0b8f8e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.831848 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz"] Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.832764 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" podUID="41644021-455b-493c-931f-9bc7fc0b8f8e" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.847870 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s2bq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-867f54bc44-lmrlz_openstack-operators(62f8136e-6a5a-49ac-9c21-621384d4102f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.849104 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" podUID="62f8136e-6a5a-49ac-9c21-621384d4102f" Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.952245 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.960687 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2"] Mar 18 15:57:49 crc kubenswrapper[4939]: I0318 15:57:49.966577 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs"] Mar 18 15:57:49 crc kubenswrapper[4939]: W0318 15:57:49.967716 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64de914e_d3d3_4b6e_b14e_6d02d2891539.slice/crio-5fb6a0ed27aa24ae04b7f309173c20904d086e8dc0367c7c31d3e519bcd09d5d WatchSource:0}: Error finding container 5fb6a0ed27aa24ae04b7f309173c20904d086e8dc0367c7c31d3e519bcd09d5d: Status 404 returned error can't find the container with id 5fb6a0ed27aa24ae04b7f309173c20904d086e8dc0367c7c31d3e519bcd09d5d Mar 18 15:57:49 crc kubenswrapper[4939]: W0318 15:57:49.969608 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc455d0aa_0b25_4684_997f_b7d2156c2ac8.slice/crio-b8e16fc7c0db8ce03fd009eefb129ee65023803542948733109c043eb43cfe01 WatchSource:0}: Error finding container b8e16fc7c0db8ce03fd009eefb129ee65023803542948733109c043eb43cfe01: Status 404 returned error can't find the container with id b8e16fc7c0db8ce03fd009eefb129ee65023803542948733109c043eb43cfe01 Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.973837 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cvlh9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-5xlqd_openstack-operators(c455d0aa-0b25-4684-997f-b7d2156c2ac8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.974736 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-42x8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d84559f47-z9fgs_openstack-operators(6cda5777-cac2-403f-bb77-44e76f6a0806): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.975329 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" podUID="c455d0aa-0b25-4684-997f-b7d2156c2ac8" Mar 18 15:57:49 crc kubenswrapper[4939]: E0318 15:57:49.975969 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" podUID="6cda5777-cac2-403f-bb77-44e76f6a0806" Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.069208 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq"] Mar 18 15:57:50 crc kubenswrapper[4939]: W0318 15:57:50.076627 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5ffe79b_c31d_4a92_8006_aed78a9e1b16.slice/crio-90f7c8a6983c9c47411d84204263af5040fa6e7a9280f4c98cd7a2a3d6c36ecf WatchSource:0}: Error finding container 90f7c8a6983c9c47411d84204263af5040fa6e7a9280f4c98cd7a2a3d6c36ecf: Status 404 returned error can't find the container with id 90f7c8a6983c9c47411d84204263af5040fa6e7a9280f4c98cd7a2a3d6c36ecf Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.133703 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.133854 4939 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.133916 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert 
podName:fe507f50-8673-4c7e-84e7-b649c6d61115 nodeName:}" failed. No retries permitted until 2026-03-18 15:57:52.133897494 +0000 UTC m=+1236.733085115 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert") pod "infra-operator-controller-manager-5595c7d6ff-hwlg6" (UID: "fe507f50-8673-4c7e-84e7-b649c6d61115") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.219712 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" event={"ID":"6cda5777-cac2-403f-bb77-44e76f6a0806","Type":"ContainerStarted","Data":"3a25cf43106bd335af136abe642216cddeebf724f46e5b33c8d9119e6e4a9b38"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.221012 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" event={"ID":"d981a7ad-8442-4742-a3be-ca3667fe0f9f","Type":"ContainerStarted","Data":"3da983d63b38b4f74e49bc33d8ea5983e67cb7d88c78f4881f17247922b05c75"} Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.221359 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" podUID="6cda5777-cac2-403f-bb77-44e76f6a0806" Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.225242 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" event={"ID":"41644021-455b-493c-931f-9bc7fc0b8f8e","Type":"ContainerStarted","Data":"06ee51c09ef36ec3694b8295a5bc09ba5604b8f148c91f598e2039ccc4738ecc"} Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.229105 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" podUID="41644021-455b-493c-931f-9bc7fc0b8f8e" Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.231323 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" event={"ID":"5f476e97-8e28-48d0-9b5e-79c31bb8e9fd","Type":"ContainerStarted","Data":"bf2a148c6a113526acfa037ab0c6138faaaf4918b58f74cf4a23ec51bde547d8"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.237634 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" event={"ID":"64de914e-d3d3-4b6e-b14e-6d02d2891539","Type":"ContainerStarted","Data":"5fb6a0ed27aa24ae04b7f309173c20904d086e8dc0367c7c31d3e519bcd09d5d"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.241797 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" event={"ID":"23fb02fd-5d5a-488a-b3a7-82c69eb91e32","Type":"ContainerStarted","Data":"94a4f8c1b39246808b702adcf4c38f410563884b68d19f930b05346862d680e9"} Mar 18 15:57:50 crc kubenswrapper[4939]: 
I0318 15:57:50.243679 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" event={"ID":"e7737b63-d9f2-45e5-a1f5-b1300aff04a8","Type":"ContainerStarted","Data":"dc30ecfd9d2303772f0ce81eeadb135fdc22ba75491729f1bbb51ee512eabca8"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.250198 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" event={"ID":"089530fd-7678-4f52-9e8c-8af0f54a9d15","Type":"ContainerStarted","Data":"3ae89da85f25ed54d2005331bde5f57366e876f58cbb690003aae9e105fce34f"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.253533 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" event={"ID":"dae9e528-b32e-4cf9-a032-2a3bc38320f6","Type":"ContainerStarted","Data":"6bc9d7183bbb803f2e381fc5a9230ce78966887a4c140dd5e1e55fd66cb89765"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.255745 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" event={"ID":"54f258e9-7e39-422a-b848-2cbfc2633529","Type":"ContainerStarted","Data":"523d70e96905611f20e25180bd6234a2034080694d29e3d27c7aea614b459ccd"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.257181 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" event={"ID":"c5ffe79b-c31d-4a92-8006-aed78a9e1b16","Type":"ContainerStarted","Data":"90f7c8a6983c9c47411d84204263af5040fa6e7a9280f4c98cd7a2a3d6c36ecf"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.258697 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" event={"ID":"e4983266-0ae5-493d-86a4-4f6b010e95d8","Type":"ContainerStarted","Data":"21714a99cb57527d58f8db7c528a7ab57f37e89b1ca43e0882f8a30a22b8dfee"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.261197 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" event={"ID":"3f4b5f2d-578c-42f1-a283-960b728a9ef7","Type":"ContainerStarted","Data":"0338d38cac314c862fd5abd0c8375f1827b469a2146ec509b939b7d61e3f43f4"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.264368 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" event={"ID":"c455d0aa-0b25-4684-997f-b7d2156c2ac8","Type":"ContainerStarted","Data":"b8e16fc7c0db8ce03fd009eefb129ee65023803542948733109c043eb43cfe01"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.268304 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" event={"ID":"0bb8519c-c720-4853-a031-caedb2059b3d","Type":"ContainerStarted","Data":"4ffb6d1b2ba77da238fef345d02c6e5f8de1453b4c3915c6965a0988f4e36f5a"} Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.268861 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" 
podUID="c455d0aa-0b25-4684-997f-b7d2156c2ac8" Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.269296 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" podUID="0bb8519c-c720-4853-a031-caedb2059b3d" Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.269703 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" event={"ID":"ffe00762-ce07-4018-b8d6-290de61644d0","Type":"ContainerStarted","Data":"02798d5c2e3e1b21e145807a6c0ca3490118059a61034cf003fc46cf96438ad6"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.271357 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" event={"ID":"62f8136e-6a5a-49ac-9c21-621384d4102f","Type":"ContainerStarted","Data":"9b7548b49a8b701b979be95187136940a006eff9bbeae810988a9bad79ee3f56"} Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.272687 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" event={"ID":"13e35677-42a2-48c1-86fc-f022708ac217","Type":"ContainerStarted","Data":"67a5ef29c0d0a8a054e87ce3701a40da12cfe9e7cc2accbc012d78eb30e8099b"} Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.273807 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" podUID="62f8136e-6a5a-49ac-9c21-621384d4102f" Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.336275 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.336428 4939 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.336472 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert podName:573a3014-37a6-461d-a211-2fba3e06a72d nodeName:}" failed. No retries permitted until 2026-03-18 15:57:52.336458379 +0000 UTC m=+1236.935646000 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" (UID: "573a3014-37a6-461d-a211-2fba3e06a72d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.743341 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:50 crc kubenswrapper[4939]: I0318 15:57:50.743392 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.743521 4939 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.743566 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:52.743551276 +0000 UTC m=+1237.342738887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "metrics-server-cert" not found Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.743859 4939 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:57:50 crc kubenswrapper[4939]: E0318 15:57:50.743884 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:52.743876235 +0000 UTC m=+1237.343063856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "webhook-server-cert" not found Mar 18 15:57:51 crc kubenswrapper[4939]: E0318 15:57:51.292170 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" podUID="62f8136e-6a5a-49ac-9c21-621384d4102f" Mar 18 15:57:51 crc kubenswrapper[4939]: E0318 15:57:51.292992 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:88f2db101f619563231cfa13f4488596637731f0ebe33c661d4a5e48a86dd3e8\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" podUID="6cda5777-cac2-403f-bb77-44e76f6a0806" Mar 18 15:57:51 crc kubenswrapper[4939]: E0318 15:57:51.294107 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" podUID="41644021-455b-493c-931f-9bc7fc0b8f8e" Mar 18 15:57:51 crc kubenswrapper[4939]: E0318 15:57:51.294174 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" podUID="0bb8519c-c720-4853-a031-caedb2059b3d" Mar 18 15:57:51 crc kubenswrapper[4939]: E0318 15:57:51.294456 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" podUID="c455d0aa-0b25-4684-997f-b7d2156c2ac8" Mar 18 15:57:52 crc kubenswrapper[4939]: I0318 15:57:52.174976 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.175192 4939 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.175355 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert podName:fe507f50-8673-4c7e-84e7-b649c6d61115 nodeName:}" failed. 
No retries permitted until 2026-03-18 15:57:56.175339516 +0000 UTC m=+1240.774527137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert") pod "infra-operator-controller-manager-5595c7d6ff-hwlg6" (UID: "fe507f50-8673-4c7e-84e7-b649c6d61115") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:52 crc kubenswrapper[4939]: I0318 15:57:52.377605 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.377871 4939 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.377944 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert podName:573a3014-37a6-461d-a211-2fba3e06a72d nodeName:}" failed. No retries permitted until 2026-03-18 15:57:56.377924072 +0000 UTC m=+1240.977111693 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" (UID: "573a3014-37a6-461d-a211-2fba3e06a72d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:52 crc kubenswrapper[4939]: I0318 15:57:52.782875 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:52 crc kubenswrapper[4939]: I0318 15:57:52.782946 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.783051 4939 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.783128 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:56.783108624 +0000 UTC m=+1241.382296245 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "webhook-server-cert" not found Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.783159 4939 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:57:52 crc kubenswrapper[4939]: E0318 15:57:52.783233 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:57:56.783216257 +0000 UTC m=+1241.382403878 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "metrics-server-cert" not found Mar 18 15:57:53 crc kubenswrapper[4939]: I0318 15:57:53.687439 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:57:53 crc kubenswrapper[4939]: I0318 15:57:53.687763 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:57:56 crc kubenswrapper[4939]: I0318 15:57:56.243097 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.243534 4939 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.244326 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert podName:fe507f50-8673-4c7e-84e7-b649c6d61115 nodeName:}" failed. No retries permitted until 2026-03-18 15:58:04.244301644 +0000 UTC m=+1248.843489265 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert") pod "infra-operator-controller-manager-5595c7d6ff-hwlg6" (UID: "fe507f50-8673-4c7e-84e7-b649c6d61115") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:57:56 crc kubenswrapper[4939]: I0318 15:57:56.447209 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.447665 4939 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.447778 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert podName:573a3014-37a6-461d-a211-2fba3e06a72d nodeName:}" failed. No retries permitted until 2026-03-18 15:58:04.447749484 +0000 UTC m=+1249.046937175 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" (UID: "573a3014-37a6-461d-a211-2fba3e06a72d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:57:56 crc kubenswrapper[4939]: I0318 15:57:56.853165 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:56 crc kubenswrapper[4939]: I0318 15:57:56.853231 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.853370 4939 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.853446 4939 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.853476 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:58:04.853445871 +0000 UTC m=+1249.452633502 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "webhook-server-cert" not found Mar 18 15:57:56 crc kubenswrapper[4939]: E0318 15:57:56.853526 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:58:04.853488972 +0000 UTC m=+1249.452676693 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "metrics-server-cert" not found Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.148570 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564158-gb9ms"] Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.149796 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-gb9ms"] Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.149947 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-gb9ms" Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.153396 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.153967 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.157842 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.209878 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6wxv\" (UniqueName: \"kubernetes.io/projected/a2982d26-e360-4ad8-a88f-46fd4f87b1eb-kube-api-access-n6wxv\") pod \"auto-csr-approver-29564158-gb9ms\" (UID: \"a2982d26-e360-4ad8-a88f-46fd4f87b1eb\") " pod="openshift-infra/auto-csr-approver-29564158-gb9ms" Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.311038 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6wxv\" (UniqueName: \"kubernetes.io/projected/a2982d26-e360-4ad8-a88f-46fd4f87b1eb-kube-api-access-n6wxv\") pod \"auto-csr-approver-29564158-gb9ms\" (UID: \"a2982d26-e360-4ad8-a88f-46fd4f87b1eb\") " pod="openshift-infra/auto-csr-approver-29564158-gb9ms" Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.345660 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6wxv\" (UniqueName: \"kubernetes.io/projected/a2982d26-e360-4ad8-a88f-46fd4f87b1eb-kube-api-access-n6wxv\") pod \"auto-csr-approver-29564158-gb9ms\" (UID: \"a2982d26-e360-4ad8-a88f-46fd4f87b1eb\") " pod="openshift-infra/auto-csr-approver-29564158-gb9ms" Mar 18 15:58:00 crc kubenswrapper[4939]: I0318 15:58:00.479956 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-gb9ms" Mar 18 15:58:03 crc kubenswrapper[4939]: E0318 15:58:03.005034 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552" Mar 18 15:58:03 crc kubenswrapper[4939]: E0318 15:58:03.006053 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r6mzk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-bc5c78db9-wmt6q_openstack-operators(3f4b5f2d-578c-42f1-a283-960b728a9ef7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:58:03 crc kubenswrapper[4939]: E0318 15:58:03.007735 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" podUID="3f4b5f2d-578c-42f1-a283-960b728a9ef7" Mar 18 15:58:03 crc kubenswrapper[4939]: E0318 15:58:03.443444 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" podUID="3f4b5f2d-578c-42f1-a283-960b728a9ef7" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.253760 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.277228 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe507f50-8673-4c7e-84e7-b649c6d61115-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-hwlg6\" (UID: \"fe507f50-8673-4c7e-84e7-b649c6d61115\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.282463 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.456455 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.464543 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/573a3014-37a6-461d-a211-2fba3e06a72d-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-scblh\" (UID: \"573a3014-37a6-461d-a211-2fba3e06a72d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.554996 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:58:04 crc kubenswrapper[4939]: E0318 15:58:04.815555 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6" Mar 18 15:58:04 crc kubenswrapper[4939]: E0318 15:58:04.815827 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sc9qs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6744dd545c-6xvbd_openstack-operators(089530fd-7678-4f52-9e8c-8af0f54a9d15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:58:04 crc kubenswrapper[4939]: E0318 15:58:04.817144 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" podUID="089530fd-7678-4f52-9e8c-8af0f54a9d15" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.862993 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.863288 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:58:04 crc kubenswrapper[4939]: E0318 15:58:04.863448 4939 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:58:04 crc kubenswrapper[4939]: E0318 15:58:04.863550 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs podName:5718381a-312a-4dab-b896-c865eaa2232b nodeName:}" failed. No retries permitted until 2026-03-18 15:58:20.863522876 +0000 UTC m=+1265.462710507 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs") pod "openstack-operator-controller-manager-65fbdb4fdd-xrcnt" (UID: "5718381a-312a-4dab-b896-c865eaa2232b") : secret "webhook-server-cert" not found Mar 18 15:58:04 crc kubenswrapper[4939]: I0318 15:58:04.868043 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-metrics-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:58:05 crc kubenswrapper[4939]: E0318 15:58:05.459087 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" podUID="089530fd-7678-4f52-9e8c-8af0f54a9d15" Mar 18 15:58:05 crc kubenswrapper[4939]: E0318 15:58:05.681900 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258" Mar 18 15:58:05 crc kubenswrapper[4939]: E0318 15:58:05.682147 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r2g2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-76b87776c9-992ww_openstack-operators(54f258e9-7e39-422a-b848-2cbfc2633529): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:58:05 crc kubenswrapper[4939]: E0318 15:58:05.683321 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" podUID="54f258e9-7e39-422a-b848-2cbfc2633529" Mar 18 15:58:06 crc kubenswrapper[4939]: E0318 15:58:06.190478 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 18 15:58:06 crc kubenswrapper[4939]: E0318 15:58:06.190740 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c9g79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-52ml2_openstack-operators(64de914e-d3d3-4b6e-b14e-6d02d2891539): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:58:06 crc kubenswrapper[4939]: E0318 15:58:06.191986 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" podUID="64de914e-d3d3-4b6e-b14e-6d02d2891539" Mar 18 15:58:06 crc kubenswrapper[4939]: E0318 15:58:06.464071 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" podUID="54f258e9-7e39-422a-b848-2cbfc2633529" Mar 18 15:58:06 crc kubenswrapper[4939]: E0318 15:58:06.464446 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" podUID="64de914e-d3d3-4b6e-b14e-6d02d2891539" Mar 18 15:58:07 crc kubenswrapper[4939]: I0318 15:58:07.868141 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6"] Mar 18 15:58:08 crc kubenswrapper[4939]: I0318 15:58:08.118515 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-gb9ms"] Mar 18 15:58:08 crc kubenswrapper[4939]: W0318 15:58:08.269189 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe507f50_8673_4c7e_84e7_b649c6d61115.slice/crio-4bf0b779c7ea5ba66af4e74ca6ca712ab6884133433672d41ecc72c8ef2bb497 WatchSource:0}: Error finding container 4bf0b779c7ea5ba66af4e74ca6ca712ab6884133433672d41ecc72c8ef2bb497: Status 404 returned error can't find the container with id 4bf0b779c7ea5ba66af4e74ca6ca712ab6884133433672d41ecc72c8ef2bb497 Mar 18 15:58:08 crc kubenswrapper[4939]: W0318 15:58:08.270764 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2982d26_e360_4ad8_a88f_46fd4f87b1eb.slice/crio-638c359f7d9a2d13f3dba767e685ca2178330fae8dc86b881ae905c628a190fb WatchSource:0}: Error finding container 638c359f7d9a2d13f3dba767e685ca2178330fae8dc86b881ae905c628a190fb: Status 404 returned error can't find the container with id 638c359f7d9a2d13f3dba767e685ca2178330fae8dc86b881ae905c628a190fb Mar 18 15:58:08 crc kubenswrapper[4939]: I0318 15:58:08.479392 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" event={"ID":"fe507f50-8673-4c7e-84e7-b649c6d61115","Type":"ContainerStarted","Data":"4bf0b779c7ea5ba66af4e74ca6ca712ab6884133433672d41ecc72c8ef2bb497"} Mar 18 15:58:08 crc kubenswrapper[4939]: I0318 15:58:08.481308 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-gb9ms" event={"ID":"a2982d26-e360-4ad8-a88f-46fd4f87b1eb","Type":"ContainerStarted","Data":"638c359f7d9a2d13f3dba767e685ca2178330fae8dc86b881ae905c628a190fb"} Mar 18 15:58:08 crc kubenswrapper[4939]: I0318 15:58:08.686421 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh"] Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.538017 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" event={"ID":"c5ffe79b-c31d-4a92-8006-aed78a9e1b16","Type":"ContainerStarted","Data":"16082c60ac1272000a88c70f453b985b54e143ceb1196f69a8aecef5546fd0cd"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.538949 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.541936 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" event={"ID":"c2cd123f-732e-4e44-b5f0-f79d15ea87da","Type":"ContainerStarted","Data":"0eac4bf6d8b5f127b17d9d5d797974c50d4e579c825511cb2201ba6cc87f30fa"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.542005 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.545240 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" 
event={"ID":"6cda5777-cac2-403f-bb77-44e76f6a0806","Type":"ContainerStarted","Data":"321efa0d03abbfcf8872bc61a4a06aabc83c616ddf9c8d99a118722c2ea540e1"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.545826 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.548143 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" event={"ID":"c455d0aa-0b25-4684-997f-b7d2156c2ac8","Type":"ContainerStarted","Data":"8abffb8901aa32a0bd95758a2d01a83623c52dc211ad4738865d4889cec5b128"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.548666 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.553840 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" event={"ID":"573a3014-37a6-461d-a211-2fba3e06a72d","Type":"ContainerStarted","Data":"cd0ef78299d8dadeed0a4fc02b24f566ba11eb656ea72b785479d6d94fd222ae"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.558708 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" event={"ID":"d981a7ad-8442-4742-a3be-ca3667fe0f9f","Type":"ContainerStarted","Data":"bfeef7d57692ca95381271f87d18fbbd0e94d976ae136a0350e5c7acf51e3ca2"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.558850 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.561928 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" event={"ID":"23fb02fd-5d5a-488a-b3a7-82c69eb91e32","Type":"ContainerStarted","Data":"8c777e1f727889af2da8f342443322ee3c05535eb0755b203812a9271b71f338"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.562660 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.573841 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq" podStartSLOduration=11.539833722000001 podStartE2EDuration="27.573820342s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:50.078427266 +0000 UTC m=+1234.677614897" lastFinishedPulling="2026-03-18 15:58:06.112413896 +0000 UTC m=+1250.711601517" observedRunningTime="2026-03-18 15:58:15.554377573 +0000 UTC m=+1260.153565194" watchObservedRunningTime="2026-03-18 15:58:15.573820342 +0000 UTC m=+1260.173007963" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.588919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" event={"ID":"13e35677-42a2-48c1-86fc-f022708ac217","Type":"ContainerStarted","Data":"f5649d70ee38c9502892e35b503396ae186a3e7489376607ae07b4df8ab4036e"} Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.589667 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.612970 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" podStartSLOduration=10.553186945 podStartE2EDuration="27.612954699s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.057474379 +0000 UTC m=+1233.656662000" lastFinishedPulling="2026-03-18 15:58:06.117242133 +0000 UTC m=+1250.716429754" observedRunningTime="2026-03-18 15:58:15.584840784 +0000 UTC m=+1260.184028405" watchObservedRunningTime="2026-03-18 15:58:15.612954699 +0000 UTC m=+1260.212142320" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.621957 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd" podStartSLOduration=2.492237812 podStartE2EDuration="27.621939012s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.973727767 +0000 UTC m=+1234.572915388" lastFinishedPulling="2026-03-18 15:58:15.103428977 +0000 UTC m=+1259.702616588" observedRunningTime="2026-03-18 15:58:15.610657454 +0000 UTC m=+1260.209845075" watchObservedRunningTime="2026-03-18 15:58:15.621939012 +0000 UTC m=+1260.221126633" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.652794 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs" podStartSLOduration=2.584784299 podStartE2EDuration="27.652774834s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.974640583 +0000 UTC m=+1234.573828204" lastFinishedPulling="2026-03-18 15:58:15.042631108 +0000 UTC m=+1259.641818739" observedRunningTime="2026-03-18 15:58:15.651600701 +0000 UTC m=+1260.250788342" watchObservedRunningTime="2026-03-18 15:58:15.652774834 +0000 UTC m=+1260.251962455" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.744464 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms" podStartSLOduration=11.046300842 podStartE2EDuration="27.744444385s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.413680697 +0000 UTC m=+1234.012868328" lastFinishedPulling="2026-03-18 15:58:06.11182425 +0000 UTC m=+1250.711011871" observedRunningTime="2026-03-18 15:58:15.684723347 +0000 UTC m=+1260.283910968" watchObservedRunningTime="2026-03-18 15:58:15.744444385 +0000 UTC m=+1260.343632006" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.748038 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j" podStartSLOduration=11.314200034 podStartE2EDuration="27.748024296s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.684337027 +0000 UTC m=+1234.283524648" lastFinishedPulling="2026-03-18 15:58:06.118161289 +0000 UTC m=+1250.717348910" observedRunningTime="2026-03-18 15:58:15.716723292 +0000 UTC m=+1260.315910913" watchObservedRunningTime="2026-03-18 15:58:15.748024296 +0000 UTC m=+1260.347211917" Mar 18 15:58:15 crc kubenswrapper[4939]: I0318 15:58:15.760746 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l" podStartSLOduration=11.093396193 podStartE2EDuration="27.760728485s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.443616263 +0000 UTC m=+1234.042803884" lastFinishedPulling="2026-03-18 15:58:06.110948555 +0000 UTC m=+1250.710136176" observedRunningTime="2026-03-18 15:58:15.759986184 +0000 UTC m=+1260.359173815" watchObservedRunningTime="2026-03-18 15:58:15.760728485 +0000 UTC m=+1260.359916106" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.616738 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" event={"ID":"5f476e97-8e28-48d0-9b5e-79c31bb8e9fd","Type":"ContainerStarted","Data":"5e3610cb817a70fcabb291f4feb0ceac438e800d116f374bc789b78066838087"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.618195 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.643809 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" event={"ID":"0bb8519c-c720-4853-a031-caedb2059b3d","Type":"ContainerStarted","Data":"ae9c4609e320f280a95443b5395a3944829b33ed114e4e2aefe362bad6176be9"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.644440 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.663765 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" event={"ID":"ffe00762-ce07-4018-b8d6-290de61644d0","Type":"ContainerStarted","Data":"cac757357dae72bfdfa9ab2545d9741460c36a19481ba49bb5ef494c36752e2c"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.664401 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.664456 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq" podStartSLOduration=12.278063658 podStartE2EDuration="28.664439259s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.732246691 +0000 UTC m=+1234.331434312" lastFinishedPulling="2026-03-18 15:58:06.118622292 +0000 UTC m=+1250.717809913" observedRunningTime="2026-03-18 15:58:16.660140708 +0000 UTC m=+1261.259328349" watchObservedRunningTime="2026-03-18 15:58:16.664439259 +0000 UTC m=+1261.263626880" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.674696 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" event={"ID":"9dd46745-f668-4ec4-a61e-7ecf0f91bd8a","Type":"ContainerStarted","Data":"04943225750ed06bc737f09dc6b31b1c7d0237220db88112646e6b4e989abddf"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.675392 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.684723 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" event={"ID":"e7737b63-d9f2-45e5-a1f5-b1300aff04a8","Type":"ContainerStarted","Data":"c112f032d669ee27c89972fba13f85724d25232960092e7df2fd180f6ce80ae4"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.685467 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.695603 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" event={"ID":"e4983266-0ae5-493d-86a4-4f6b010e95d8","Type":"ContainerStarted","Data":"de9508d0fb363f3a579653eb67ba49831688043f943c9ecde6edd243e36f2e24"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.695868 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.704350 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6" podStartSLOduration=3.481133915 podStartE2EDuration="28.704332707s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.819297792 +0000 UTC m=+1234.418485413" lastFinishedPulling="2026-03-18 15:58:15.042496574 +0000 UTC m=+1259.641684205" observedRunningTime="2026-03-18 15:58:16.702818464 +0000 UTC m=+1261.302006085" watchObservedRunningTime="2026-03-18 15:58:16.704332707 +0000 UTC m=+1261.303520328" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.719764 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" event={"ID":"62f8136e-6a5a-49ac-9c21-621384d4102f","Type":"ContainerStarted","Data":"f3569efedfbd3046dfc5840b4c3743ff1b2585d92fb73abf84c20f5630129e77"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.720383 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.735803 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" event={"ID":"41644021-455b-493c-931f-9bc7fc0b8f8e","Type":"ContainerStarted","Data":"ede5028e033bdff362525f869d37d7219a3c1e2bfb568a05d0dbbdd808c3dcd5"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.736427 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.741023 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" event={"ID":"dae9e528-b32e-4cf9-a032-2a3bc38320f6","Type":"ContainerStarted","Data":"48239b9794e13e3bd9fb3756da21c03bd049057b7957216ecb9de463164a552a"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.741715 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.755156 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-gb9ms" 
event={"ID":"a2982d26-e360-4ad8-a88f-46fd4f87b1eb","Type":"ContainerStarted","Data":"10155fbad3a8bff2710db0c1b47a04c483e40e091c2f57c64500ef9c874cb014"} Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.757199 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p" podStartSLOduration=12.375129242 podStartE2EDuration="28.75717892s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.728343741 +0000 UTC m=+1234.327531362" lastFinishedPulling="2026-03-18 15:58:06.110393419 +0000 UTC m=+1250.709581040" observedRunningTime="2026-03-18 15:58:16.736742733 +0000 UTC m=+1261.335930354" watchObservedRunningTime="2026-03-18 15:58:16.75717892 +0000 UTC m=+1261.356366551" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.758578 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp" podStartSLOduration=12.070337618 podStartE2EDuration="28.75856923s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.423580547 +0000 UTC m=+1234.022768178" lastFinishedPulling="2026-03-18 15:58:06.111812169 +0000 UTC m=+1250.710999790" observedRunningTime="2026-03-18 15:58:16.750142361 +0000 UTC m=+1261.349329982" watchObservedRunningTime="2026-03-18 15:58:16.75856923 +0000 UTC m=+1261.357756851" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.777406 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" podStartSLOduration=11.72265178 podStartE2EDuration="28.777382561s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.057082468 +0000 UTC m=+1233.656270089" lastFinishedPulling="2026-03-18 15:58:06.111813249 +0000 UTC m=+1250.711000870" observedRunningTime="2026-03-18 15:58:16.765252689 +0000 UTC m=+1261.364440310" watchObservedRunningTime="2026-03-18 15:58:16.777382561 +0000 UTC m=+1261.376570182" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.785042 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb" podStartSLOduration=11.957383514 podStartE2EDuration="28.785026527s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.282663134 +0000 UTC m=+1233.881850745" lastFinishedPulling="2026-03-18 15:58:06.110306137 +0000 UTC m=+1250.709493758" observedRunningTime="2026-03-18 15:58:16.78371329 +0000 UTC m=+1261.382900921" watchObservedRunningTime="2026-03-18 15:58:16.785026527 +0000 UTC m=+1261.384214148" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.828791 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" podStartSLOduration=12.008759177 podStartE2EDuration="28.828772294s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.292294286 +0000 UTC m=+1233.891481907" lastFinishedPulling="2026-03-18 15:58:06.112307403 +0000 UTC m=+1250.711495024" observedRunningTime="2026-03-18 15:58:16.816552329 +0000 UTC m=+1261.415739950" watchObservedRunningTime="2026-03-18 15:58:16.828772294 +0000 UTC m=+1261.427959915" Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.840686 4939 
Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.840686 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564158-gb9ms" podStartSLOduration=14.187813197 podStartE2EDuration="16.84066796s" podCreationTimestamp="2026-03-18 15:58:00 +0000 UTC" firstStartedPulling="2026-03-18 15:58:13.22984988 +0000 UTC m=+1257.829037511" lastFinishedPulling="2026-03-18 15:58:15.882704653 +0000 UTC m=+1260.481892274" observedRunningTime="2026-03-18 15:58:16.840170916 +0000 UTC m=+1261.439358537" watchObservedRunningTime="2026-03-18 15:58:16.84066796 +0000 UTC m=+1261.439855571"
Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.859303 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz" podStartSLOduration=3.664362303 podStartE2EDuration="28.859287466s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.847570091 +0000 UTC m=+1234.446757712" lastFinishedPulling="2026-03-18 15:58:15.042495254 +0000 UTC m=+1259.641682875" observedRunningTime="2026-03-18 15:58:16.858585117 +0000 UTC m=+1261.457772748" watchObservedRunningTime="2026-03-18 15:58:16.859287466 +0000 UTC m=+1261.458475087"
Mar 18 15:58:16 crc kubenswrapper[4939]: I0318 15:58:16.878518 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc" podStartSLOduration=3.631588596 podStartE2EDuration="28.878488039s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.831310401 +0000 UTC m=+1234.430498032" lastFinishedPulling="2026-03-18 15:58:15.078209854 +0000 UTC m=+1259.677397475" observedRunningTime="2026-03-18 15:58:16.87463229 +0000 UTC m=+1261.473819911" watchObservedRunningTime="2026-03-18 15:58:16.878488039 +0000 UTC m=+1261.477675660"
Mar 18 15:58:17 crc kubenswrapper[4939]: I0318 15:58:17.760906 4939 generic.go:334] "Generic (PLEG): container finished" podID="a2982d26-e360-4ad8-a88f-46fd4f87b1eb" containerID="10155fbad3a8bff2710db0c1b47a04c483e40e091c2f57c64500ef9c874cb014" exitCode=0
Mar 18 15:58:17 crc kubenswrapper[4939]: I0318 15:58:17.760999 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-gb9ms" event={"ID":"a2982d26-e360-4ad8-a88f-46fd4f87b1eb","Type":"ContainerDied","Data":"10155fbad3a8bff2710db0c1b47a04c483e40e091c2f57c64500ef9c874cb014"}
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.053568 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-gb9ms"
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.154493 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6wxv\" (UniqueName: \"kubernetes.io/projected/a2982d26-e360-4ad8-a88f-46fd4f87b1eb-kube-api-access-n6wxv\") pod \"a2982d26-e360-4ad8-a88f-46fd4f87b1eb\" (UID: \"a2982d26-e360-4ad8-a88f-46fd4f87b1eb\") "
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.163467 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2982d26-e360-4ad8-a88f-46fd4f87b1eb-kube-api-access-n6wxv" (OuterVolumeSpecName: "kube-api-access-n6wxv") pod "a2982d26-e360-4ad8-a88f-46fd4f87b1eb" (UID: "a2982d26-e360-4ad8-a88f-46fd4f87b1eb"). InnerVolumeSpecName "kube-api-access-n6wxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.256883 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6wxv\" (UniqueName: \"kubernetes.io/projected/a2982d26-e360-4ad8-a88f-46fd4f87b1eb-kube-api-access-n6wxv\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.264066 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-7cfwq"]
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.272663 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-7cfwq"]
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.780563 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-gb9ms" event={"ID":"a2982d26-e360-4ad8-a88f-46fd4f87b1eb","Type":"ContainerDied","Data":"638c359f7d9a2d13f3dba767e685ca2178330fae8dc86b881ae905c628a190fb"}
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.780898 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="638c359f7d9a2d13f3dba767e685ca2178330fae8dc86b881ae905c628a190fb"
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.780823 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-gb9ms"
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.782445 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" event={"ID":"573a3014-37a6-461d-a211-2fba3e06a72d","Type":"ContainerStarted","Data":"05ab01805ad2373ea8a9bdbf0bd9bf7a0c5ff42de1911191ea0220ee698307f9"}
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.782825 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh"
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.784098 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" event={"ID":"54f258e9-7e39-422a-b848-2cbfc2633529","Type":"ContainerStarted","Data":"d94ff316faae0a0b50d3b61879931602812c566d865988939f3bc6dc687845e8"}
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.784458 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww"
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.789788 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" event={"ID":"3f4b5f2d-578c-42f1-a283-960b728a9ef7","Type":"ContainerStarted","Data":"c86b41e6fb1aab5f764ffa623c270e19ce704eaacc0e20e86a56139b95f7d24c"}
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.790097 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q"
Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.792277 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" event={"ID":"fe507f50-8673-4c7e-84e7-b649c6d61115","Type":"ContainerStarted","Data":"290058f1f3cda7a30894378970544980ebf07b142325476ab1c4ee6e686105ca"}
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.822547 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" podStartSLOduration=28.065654824 podStartE2EDuration="31.822529802s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:58:14.9988324 +0000 UTC m=+1259.598020021" lastFinishedPulling="2026-03-18 15:58:18.755707388 +0000 UTC m=+1263.354894999" observedRunningTime="2026-03-18 15:58:19.820349981 +0000 UTC m=+1264.419537602" watchObservedRunningTime="2026-03-18 15:58:19.822529802 +0000 UTC m=+1264.421717423" Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.842025 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" podStartSLOduration=21.35897981 podStartE2EDuration="31.842009753s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:58:08.272667075 +0000 UTC m=+1252.871854696" lastFinishedPulling="2026-03-18 15:58:18.755697028 +0000 UTC m=+1263.354884639" observedRunningTime="2026-03-18 15:58:19.837099244 +0000 UTC m=+1264.436286885" watchObservedRunningTime="2026-03-18 15:58:19.842009753 +0000 UTC m=+1264.441197374" Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.855899 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww" podStartSLOduration=2.703573267 podStartE2EDuration="31.855879235s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.604359467 +0000 UTC m=+1234.203547078" lastFinishedPulling="2026-03-18 15:58:18.756665425 +0000 UTC m=+1263.355853046" observedRunningTime="2026-03-18 15:58:19.850681388 +0000 UTC m=+1264.449869009" watchObservedRunningTime="2026-03-18 15:58:19.855879235 +0000 UTC m=+1264.455066856" Mar 18 15:58:19 crc kubenswrapper[4939]: I0318 15:58:19.865329 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q" podStartSLOduration=2.823992779 podStartE2EDuration="31.865310961s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.717424752 +0000 UTC m=+1234.316612373" lastFinishedPulling="2026-03-18 15:58:18.758742934 +0000 UTC m=+1263.357930555" observedRunningTime="2026-03-18 15:58:19.861687999 +0000 UTC m=+1264.460875620" watchObservedRunningTime="2026-03-18 15:58:19.865310961 +0000 UTC m=+1264.464498582" Mar 18 15:58:20 crc kubenswrapper[4939]: I0318 15:58:20.146385 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd11a960-b0cf-4dee-b4bd-46ef351172d1" path="/var/lib/kubelet/pods/bd11a960-b0cf-4dee-b4bd-46ef351172d1/volumes" Mar 18 15:58:20 crc kubenswrapper[4939]: I0318 15:58:20.801381 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" event={"ID":"089530fd-7678-4f52-9e8c-8af0f54a9d15","Type":"ContainerStarted","Data":"1eea877d6318f7b59622861d594760e5c14cb486b97b6897173d17e4e9f5e775"} Mar 18 15:58:20 crc kubenswrapper[4939]: I0318 15:58:20.801563 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" Mar 18 15:58:20 crc kubenswrapper[4939]: I0318 15:58:20.802971 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" event={"ID":"64de914e-d3d3-4b6e-b14e-6d02d2891539","Type":"ContainerStarted","Data":"ef3419f7942316f97d870337b77d7d02f94aa059d6d20960a43d4ea1f263507f"} Mar 18 15:58:20 crc kubenswrapper[4939]: I0318 15:58:20.823065 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd" podStartSLOduration=2.890500229 podStartE2EDuration="32.823049312s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.723649838 +0000 UTC m=+1234.322837459" lastFinishedPulling="2026-03-18 15:58:19.656198911 +0000 UTC m=+1264.255386542" observedRunningTime="2026-03-18 15:58:20.81979238 +0000 UTC m=+1265.418980021" watchObservedRunningTime="2026-03-18 15:58:20.823049312 +0000 UTC m=+1265.422236933" Mar 18 15:58:20 crc kubenswrapper[4939]: I0318 15:58:20.890936 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:58:20 crc kubenswrapper[4939]: I0318 15:58:20.900208 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5718381a-312a-4dab-b896-c865eaa2232b-webhook-certs\") pod \"openstack-operator-controller-manager-65fbdb4fdd-xrcnt\" (UID: \"5718381a-312a-4dab-b896-c865eaa2232b\") " pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.136092 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-g9gwk" Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.145324 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.603235 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-52ml2" podStartSLOduration=3.9156903549999997 podStartE2EDuration="33.603208772s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="2026-03-18 15:57:49.969790835 +0000 UTC m=+1234.568978456" lastFinishedPulling="2026-03-18 15:58:19.657309242 +0000 UTC m=+1264.256496873" observedRunningTime="2026-03-18 15:58:20.848148221 +0000 UTC m=+1265.447335862" watchObservedRunningTime="2026-03-18 15:58:21.603208772 +0000 UTC m=+1266.202396413" Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.606874 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt"] Mar 18 15:58:21 crc kubenswrapper[4939]: W0318 15:58:21.610958 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5718381a_312a_4dab_b896_c865eaa2232b.slice/crio-a5d5e96e1d03729a872962e2463bd31227393395e50111cca7d49953a0b15606 WatchSource:0}: Error finding container a5d5e96e1d03729a872962e2463bd31227393395e50111cca7d49953a0b15606: Status 404 returned error can't find the container with id a5d5e96e1d03729a872962e2463bd31227393395e50111cca7d49953a0b15606 Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.812106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" event={"ID":"5718381a-312a-4dab-b896-c865eaa2232b","Type":"ContainerStarted","Data":"f8f1539a4c5e88fa09c045a76fb635de9aa7d2e946727fa514d1eecd43d0bb0e"} Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.812162 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" event={"ID":"5718381a-312a-4dab-b896-c865eaa2232b","Type":"ContainerStarted","Data":"a5d5e96e1d03729a872962e2463bd31227393395e50111cca7d49953a0b15606"} Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.812818 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" Mar 18 15:58:21 crc kubenswrapper[4939]: I0318 15:58:21.838848 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt" podStartSLOduration=33.838832492 podStartE2EDuration="33.838832492s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:21.833569893 +0000 UTC m=+1266.432757524" watchObservedRunningTime="2026-03-18 15:58:21.838832492 +0000 UTC m=+1266.438020113" Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.687192 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.687629 4939 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.687802 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.688838 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b2c0c563781371d0ba976cd9bb94c7c42e1c32c9519396b89c4802c0c2c6efa"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.688914 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://9b2c0c563781371d0ba976cd9bb94c7c42e1c32c9519396b89c4802c0c2c6efa" gracePeriod=600 Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.839863 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="9b2c0c563781371d0ba976cd9bb94c7c42e1c32c9519396b89c4802c0c2c6efa" exitCode=0 Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.839978 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"9b2c0c563781371d0ba976cd9bb94c7c42e1c32c9519396b89c4802c0c2c6efa"} Mar 18 15:58:23 crc kubenswrapper[4939]: I0318 15:58:23.840335 4939 scope.go:117] "RemoveContainer" containerID="3e0c9746e62b2cdabcce5b37ee8d4d9dea82e474357d37ce8b290ec31b2fa0e2" Mar 18 15:58:24 crc kubenswrapper[4939]: I0318 15:58:24.288617 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-hwlg6" Mar 18 15:58:24 crc kubenswrapper[4939]: I0318 15:58:24.562086 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-scblh" Mar 18 15:58:24 crc kubenswrapper[4939]: I0318 15:58:24.848406 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"4fff46d36c1bddb51dfc726a35d19e477c47083cb08f0289fbf97c6d0f0baa61"} Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.475058 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-9mhmj" Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.510279 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-wvfdv" Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.559789 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-pfmz6" Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.585302 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-xlmmb"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.600871 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-2w7ms"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.665557 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-djxqp"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.700972 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-78b9l"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.724187 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-992ww"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.777881 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-bpt6p"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.792213 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-xjg7j"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.867268 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-6xvbd"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.912002 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wmt6q"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.969240 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-hzsbq"
Mar 18 15:58:28 crc kubenswrapper[4939]: I0318 15:58:28.984164 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-4xhw6"
Mar 18 15:58:29 crc kubenswrapper[4939]: I0318 15:58:29.013167 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-z2bvc"
Mar 18 15:58:29 crc kubenswrapper[4939]: I0318 15:58:29.089659 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-lmrlz"
Mar 18 15:58:29 crc kubenswrapper[4939]: I0318 15:58:29.145905 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d84559f47-z9fgs"
Mar 18 15:58:29 crc kubenswrapper[4939]: I0318 15:58:29.240274 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-5xlqd"
Mar 18 15:58:29 crc kubenswrapper[4939]: I0318 15:58:29.320325 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-xd6pq"
Mar 18 15:58:31 crc kubenswrapper[4939]: I0318 15:58:31.155424 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-65fbdb4fdd-xrcnt"
Mar 18 15:58:38 crc kubenswrapper[4939]: I0318 15:58:38.008177 4939 scope.go:117] "RemoveContainer" containerID="dfcc60cf5eb04f85f77cdd1a79a89dc2f8ceac0b67bb2ded8f01075774c5ff17"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.983162 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tg244"]
Mar 18 15:58:45 crc kubenswrapper[4939]: E0318 15:58:45.984322 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2982d26-e360-4ad8-a88f-46fd4f87b1eb" containerName="oc"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.984339 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2982d26-e360-4ad8-a88f-46fd4f87b1eb" containerName="oc"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.984507 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2982d26-e360-4ad8-a88f-46fd4f87b1eb" containerName="oc"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.985388 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.987819 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.987873 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.987995 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7hg7p"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.988094 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 18 15:58:45 crc kubenswrapper[4939]: I0318 15:58:45.994414 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tg244"]
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.017657 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8341b6-bee9-498f-a891-edbde6ebce5b-config\") pod \"dnsmasq-dns-675f4bcbfc-tg244\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.017723 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwnh\" (UniqueName: \"kubernetes.io/projected/bf8341b6-bee9-498f-a891-edbde6ebce5b-kube-api-access-5wwnh\") pod \"dnsmasq-dns-675f4bcbfc-tg244\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.054740 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kbfrn"]
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.055955 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.058173 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.066067 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kbfrn"]
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.119784 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4ll\" (UniqueName: \"kubernetes.io/projected/474abbfb-42dd-4854-a29a-253ce5e1f0f6-kube-api-access-rr4ll\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.119869 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-config\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.119924 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.119964 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8341b6-bee9-498f-a891-edbde6ebce5b-config\") pod \"dnsmasq-dns-675f4bcbfc-tg244\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.119996 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwnh\" (UniqueName: \"kubernetes.io/projected/bf8341b6-bee9-498f-a891-edbde6ebce5b-kube-api-access-5wwnh\") pod \"dnsmasq-dns-675f4bcbfc-tg244\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.121150 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8341b6-bee9-498f-a891-edbde6ebce5b-config\") pod \"dnsmasq-dns-675f4bcbfc-tg244\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.147911 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wwnh\" (UniqueName: \"kubernetes.io/projected/bf8341b6-bee9-498f-a891-edbde6ebce5b-kube-api-access-5wwnh\") pod \"dnsmasq-dns-675f4bcbfc-tg244\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.221464 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4ll\" (UniqueName: \"kubernetes.io/projected/474abbfb-42dd-4854-a29a-253ce5e1f0f6-kube-api-access-rr4ll\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.221590 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-config\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.221649 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.222770 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-config\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.222897 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.241578 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4ll\" (UniqueName: \"kubernetes.io/projected/474abbfb-42dd-4854-a29a-253ce5e1f0f6-kube-api-access-rr4ll\") pod \"dnsmasq-dns-78dd6ddcc-kbfrn\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn"
Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.306451 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tg244"
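Each dnsmasq volume above walks the same reconciler sequence: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded. When auditing a long boot log it can help to pair starts with successes per volume; a small hypothetical Go filter over journal text on stdin (the regex and program are mine, not part of the kubelet):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the volume name in kubelet reconciler/operation_generator lines,
// e.g. ... "operationExecutor.MountVolume started for volume \"config\" ...
var volRe = regexp.MustCompile(`(MountVolume started|MountVolume\.SetUp succeeded) for volume \\"([^"\\]+)\\"`)

func main() {
	pending := map[string]int{} // volume name -> starts not yet matched by a success
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := volRe.FindStringSubmatch(sc.Text()); m != nil {
			if m[1] == "MountVolume started" {
				pending[m[2]]++
			} else {
				pending[m[2]]--
			}
		}
	}
	for vol, n := range pending {
		if n != 0 {
			fmt.Printf("volume %q: %d start(s) without a matching SetUp success\n", vol, n)
		}
	}
}
```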
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn" Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.728689 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tg244"] Mar 18 15:58:46 crc kubenswrapper[4939]: I0318 15:58:46.813465 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kbfrn"] Mar 18 15:58:46 crc kubenswrapper[4939]: W0318 15:58:46.816709 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474abbfb_42dd_4854_a29a_253ce5e1f0f6.slice/crio-90eaf19c65093f892c82a19b6e8296fc32900664aa551cbde5e43627dd6f2634 WatchSource:0}: Error finding container 90eaf19c65093f892c82a19b6e8296fc32900664aa551cbde5e43627dd6f2634: Status 404 returned error can't find the container with id 90eaf19c65093f892c82a19b6e8296fc32900664aa551cbde5e43627dd6f2634 Mar 18 15:58:47 crc kubenswrapper[4939]: I0318 15:58:47.035394 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tg244" event={"ID":"bf8341b6-bee9-498f-a891-edbde6ebce5b","Type":"ContainerStarted","Data":"1656ae6651f2f013081d14746474d8261b8640048f1fccefbd93eebfe87d01c3"} Mar 18 15:58:47 crc kubenswrapper[4939]: I0318 15:58:47.036722 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn" event={"ID":"474abbfb-42dd-4854-a29a-253ce5e1f0f6","Type":"ContainerStarted","Data":"90eaf19c65093f892c82a19b6e8296fc32900664aa551cbde5e43627dd6f2634"} Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.048981 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tg244"] Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.085294 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vtfqn"] Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.086484 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.097256 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vtfqn"] Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.164096 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-config\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.164156 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2gzw\" (UniqueName: \"kubernetes.io/projected/c4268347-7fb1-4cd7-9354-4d7babd5ce94-kube-api-access-r2gzw\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.164189 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.265092 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-config\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.265152 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2gzw\" (UniqueName: \"kubernetes.io/projected/c4268347-7fb1-4cd7-9354-4d7babd5ce94-kube-api-access-r2gzw\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.265189 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.266480 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-config\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.266489 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.291624 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2gzw\" (UniqueName: 
\"kubernetes.io/projected/c4268347-7fb1-4cd7-9354-4d7babd5ce94-kube-api-access-r2gzw\") pod \"dnsmasq-dns-5ccc8479f9-vtfqn\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.352572 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kbfrn"] Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.411070 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wfpp"] Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.412459 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.413423 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.429420 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wfpp"] Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.478298 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8msn\" (UniqueName: \"kubernetes.io/projected/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-kube-api-access-p8msn\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.478400 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-config\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.478460 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.579276 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-config\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.579363 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.579424 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8msn\" (UniqueName: \"kubernetes.io/projected/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-kube-api-access-p8msn\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.580604 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-config\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.581691 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.624716 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8msn\" (UniqueName: \"kubernetes.io/projected/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-kube-api-access-p8msn\") pod \"dnsmasq-dns-57d769cc4f-8wfpp\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:49 crc kubenswrapper[4939]: I0318 15:58:49.740899 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.238758 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.240103 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.245306 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.251734 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.253598 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.253839 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.253988 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.254125 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.254234 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.254428 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qsv45" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392366 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392422 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392446 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392463 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392561 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392582 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392600 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392629 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d850ac81-a29e-4e93-9fab-72b6325de52e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392655 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392671 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn9t5\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-kube-api-access-fn9t5\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.392698 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d850ac81-a29e-4e93-9fab-72b6325de52e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494313 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494369 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494395 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494422 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d850ac81-a29e-4e93-9fab-72b6325de52e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494461 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494483 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn9t5\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-kube-api-access-fn9t5\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494538 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d850ac81-a29e-4e93-9fab-72b6325de52e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494601 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494637 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494667 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.494688 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.495281 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.496002 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.496120 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.496895 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.497810 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.497934 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.499202 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d850ac81-a29e-4e93-9fab-72b6325de52e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.500085 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.500625 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d850ac81-a29e-4e93-9fab-72b6325de52e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.501974 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.518014 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn9t5\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-kube-api-access-fn9t5\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.524075 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.555930 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.557416 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.560017 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.562341 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.562583 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.563554 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.563983 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.565657 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.565831 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kpp89" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.577600 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.592443 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701064 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701105 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjst\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-kube-api-access-7xjst\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701140 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701157 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701177 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701197 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701258 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701360 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701488 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc 
kubenswrapper[4939]: I0318 15:58:50.701603 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.701705 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803573 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803635 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803675 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803769 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803791 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjst\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-kube-api-access-7xjst\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803827 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803844 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803866 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803888 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803934 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.803954 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.804100 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.804686 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.805449 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.806118 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.806806 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.807112 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.808164 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.808382 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.810932 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.822601 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.822962 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.831403 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjst\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-kube-api-access-7xjst\") pod \"rabbitmq-server-0\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " pod="openstack/rabbitmq-server-0" Mar 18 15:58:50 crc kubenswrapper[4939]: I0318 15:58:50.890402 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.503095 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.506462 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.510170 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.512100 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-6942g" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.515531 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.515601 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.519206 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.533493 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620245 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620306 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-operator-scripts\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620328 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620598 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-kolla-config\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620726 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-config-data-default\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620854 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22b02f38-8ae3-4a43-8df6-370521328921-config-data-generated\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620918 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxfw\" (UniqueName: \"kubernetes.io/projected/22b02f38-8ae3-4a43-8df6-370521328921-kube-api-access-vcxfw\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.620998 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722591 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22b02f38-8ae3-4a43-8df6-370521328921-config-data-generated\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722652 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxfw\" (UniqueName: \"kubernetes.io/projected/22b02f38-8ae3-4a43-8df6-370521328921-kube-api-access-vcxfw\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722687 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722738 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722768 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-operator-scripts\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722785 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722847 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-kolla-config\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.722882 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.723129 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22b02f38-8ae3-4a43-8df6-370521328921-config-data-generated\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.723954 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-kolla-config\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.723969 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-config-data-default\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.724139 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.726114 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-operator-scripts\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.729631 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.740372 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.749765 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.756865 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxfw\" (UniqueName: \"kubernetes.io/projected/22b02f38-8ae3-4a43-8df6-370521328921-kube-api-access-vcxfw\") pod \"openstack-galera-0\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " pod="openstack/openstack-galera-0" Mar 18 15:58:51 crc kubenswrapper[4939]: I0318 15:58:51.841608 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.838494 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.841302 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.846057 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.846925 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.847073 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5pfng" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.847182 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.852565 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.940823 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.940901 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.941142 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqtfg\" (UniqueName: \"kubernetes.io/projected/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kube-api-access-nqtfg\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.941210 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.941238 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.941262 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.941288 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:52 crc kubenswrapper[4939]: I0318 15:58:52.941317 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.042730 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.042964 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.043005 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.043168 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.043514 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.043620 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqtfg\" (UniqueName: \"kubernetes.io/projected/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kube-api-access-nqtfg\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.043702 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.043712 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.043953 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.044036 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.044578 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.044727 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.048048 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.049299 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.049804 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.064184 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqtfg\" (UniqueName: \"kubernetes.io/projected/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kube-api-access-nqtfg\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " 
pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.079943 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.107897 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.108992 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.112440 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.112901 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-75fkl" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.113359 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.124074 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.174928 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.253591 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kolla-config\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.253658 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.253697 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-config-data\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.253790 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.253815 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28v9\" (UniqueName: \"kubernetes.io/projected/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kube-api-access-h28v9\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.355060 4939 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-config-data\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.355190 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.355226 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h28v9\" (UniqueName: \"kubernetes.io/projected/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kube-api-access-h28v9\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.355292 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kolla-config\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.355323 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.356037 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kolla-config\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.356489 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-config-data\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.358542 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.365357 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.377349 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28v9\" (UniqueName: \"kubernetes.io/projected/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kube-api-access-h28v9\") pod \"memcached-0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " pod="openstack/memcached-0" Mar 18 15:58:53 crc kubenswrapper[4939]: I0318 15:58:53.429010 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.131827 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.132884 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.135942 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bdcbt" Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.145004 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.288921 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2m88\" (UniqueName: \"kubernetes.io/projected/36eb8417-2815-4eea-987e-d05be6eb90e9-kube-api-access-s2m88\") pod \"kube-state-metrics-0\" (UID: \"36eb8417-2815-4eea-987e-d05be6eb90e9\") " pod="openstack/kube-state-metrics-0" Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.390444 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2m88\" (UniqueName: \"kubernetes.io/projected/36eb8417-2815-4eea-987e-d05be6eb90e9-kube-api-access-s2m88\") pod \"kube-state-metrics-0\" (UID: \"36eb8417-2815-4eea-987e-d05be6eb90e9\") " pod="openstack/kube-state-metrics-0" Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.413014 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2m88\" (UniqueName: \"kubernetes.io/projected/36eb8417-2815-4eea-987e-d05be6eb90e9-kube-api-access-s2m88\") pod \"kube-state-metrics-0\" (UID: \"36eb8417-2815-4eea-987e-d05be6eb90e9\") " pod="openstack/kube-state-metrics-0" Mar 18 15:58:55 crc kubenswrapper[4939]: I0318 15:58:55.454573 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.155111 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mp2sj"] Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.157746 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.162270 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.163807 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.163953 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hgnpg" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.197651 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mp2sj"] Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.204797 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-56pdq"] Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.206602 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.242947 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-56pdq"] Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.244807 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnw5\" (UniqueName: \"kubernetes.io/projected/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-kube-api-access-sdnw5\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.244876 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-combined-ca-bundle\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.244980 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-ovn-controller-tls-certs\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.245027 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run-ovn\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.245049 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-log-ovn\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.245148 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-scripts\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.245217 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.346747 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdnw5\" (UniqueName: \"kubernetes.io/projected/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-kube-api-access-sdnw5\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.346797 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-combined-ca-bundle\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.346832 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-run\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.346854 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f68996-05bc-4432-ac98-c730b09c6288-scripts\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.346880 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm5vn\" (UniqueName: \"kubernetes.io/projected/52f68996-05bc-4432-ac98-c730b09c6288-kube-api-access-nm5vn\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.346906 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-ovn-controller-tls-certs\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347018 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run-ovn\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347052 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-log-ovn\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347088 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-etc-ovs\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347172 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-scripts\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347212 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-log\") pod \"ovn-controller-ovs-56pdq\" (UID: 
\"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347265 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347315 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-lib\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347622 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run-ovn\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.347755 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.348694 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-log-ovn\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.351399 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-scripts\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.359170 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-combined-ca-bundle\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.359211 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-ovn-controller-tls-certs\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.361640 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdnw5\" (UniqueName: \"kubernetes.io/projected/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-kube-api-access-sdnw5\") pod \"ovn-controller-mp2sj\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449097 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-lib\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449185 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-run\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449217 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f68996-05bc-4432-ac98-c730b09c6288-scripts\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449264 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm5vn\" (UniqueName: \"kubernetes.io/projected/52f68996-05bc-4432-ac98-c730b09c6288-kube-api-access-nm5vn\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449348 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-etc-ovs\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449382 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-run\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449415 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-lib\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449700 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-etc-ovs\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.450016 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-log\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.449855 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-log\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 
15:58:58.451242 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f68996-05bc-4432-ac98-c730b09c6288-scripts\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.475271 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm5vn\" (UniqueName: \"kubernetes.io/projected/52f68996-05bc-4432-ac98-c730b09c6288-kube-api-access-nm5vn\") pod \"ovn-controller-ovs-56pdq\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.486011 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj" Mar 18 15:58:58 crc kubenswrapper[4939]: I0318 15:58:58.527072 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.871630 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.872971 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.874926 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.874984 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.876039 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.881883 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.881884 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-w5jhq" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.888270 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972206 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972254 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9wk\" (UniqueName: \"kubernetes.io/projected/0aa357a6-3028-4413-b384-0cbf6488f7ef-kube-api-access-pr9wk\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972296 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 
15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972321 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-config\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972349 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972377 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972394 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:58:59 crc kubenswrapper[4939]: I0318 15:58:59.972435 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073743 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073815 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073845 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9wk\" (UniqueName: \"kubernetes.io/projected/0aa357a6-3028-4413-b384-0cbf6488f7ef-kube-api-access-pr9wk\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073890 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073911 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-config\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073933 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073961 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.073975 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.074570 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.074808 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.075091 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-config\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.075311 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.079758 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.083127 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.086046 4939 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.091988 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9wk\" (UniqueName: \"kubernetes.io/projected/0aa357a6-3028-4413-b384-0cbf6488f7ef-kube-api-access-pr9wk\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.098825 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: I0318 15:59:00.191378 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:00 crc kubenswrapper[4939]: E0318 15:59:00.932109 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:59:00 crc kubenswrapper[4939]: E0318 15:59:00.932458 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5wwnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-tg244_openstack(bf8341b6-bee9-498f-a891-edbde6ebce5b): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:59:00 crc kubenswrapper[4939]: E0318 15:59:00.934106 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-tg244" podUID="bf8341b6-bee9-498f-a891-edbde6ebce5b" Mar 18 15:59:00 crc kubenswrapper[4939]: E0318 15:59:00.998092 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:59:00 crc kubenswrapper[4939]: E0318 15:59:00.998247 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rr4ll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-kbfrn_openstack(474abbfb-42dd-4854-a29a-253ce5e1f0f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:59:01 crc kubenswrapper[4939]: E0318 15:59:00.999633 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn" podUID="474abbfb-42dd-4854-a29a-253ce5e1f0f6" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 
15:59:01.556137 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wfpp"] Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.591604 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vtfqn"] Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.789057 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tg244" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.796888 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.921313 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8341b6-bee9-498f-a891-edbde6ebce5b-config\") pod \"bf8341b6-bee9-498f-a891-edbde6ebce5b\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.921358 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr4ll\" (UniqueName: \"kubernetes.io/projected/474abbfb-42dd-4854-a29a-253ce5e1f0f6-kube-api-access-rr4ll\") pod \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.921402 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-dns-svc\") pod \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.921425 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wwnh\" (UniqueName: \"kubernetes.io/projected/bf8341b6-bee9-498f-a891-edbde6ebce5b-kube-api-access-5wwnh\") pod \"bf8341b6-bee9-498f-a891-edbde6ebce5b\" (UID: \"bf8341b6-bee9-498f-a891-edbde6ebce5b\") " Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.921457 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-config\") pod \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\" (UID: \"474abbfb-42dd-4854-a29a-253ce5e1f0f6\") " Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.922492 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-config" (OuterVolumeSpecName: "config") pod "474abbfb-42dd-4854-a29a-253ce5e1f0f6" (UID: "474abbfb-42dd-4854-a29a-253ce5e1f0f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.922885 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8341b6-bee9-498f-a891-edbde6ebce5b-config" (OuterVolumeSpecName: "config") pod "bf8341b6-bee9-498f-a891-edbde6ebce5b" (UID: "bf8341b6-bee9-498f-a891-edbde6ebce5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.926365 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "474abbfb-42dd-4854-a29a-253ce5e1f0f6" (UID: "474abbfb-42dd-4854-a29a-253ce5e1f0f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.930487 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474abbfb-42dd-4854-a29a-253ce5e1f0f6-kube-api-access-rr4ll" (OuterVolumeSpecName: "kube-api-access-rr4ll") pod "474abbfb-42dd-4854-a29a-253ce5e1f0f6" (UID: "474abbfb-42dd-4854-a29a-253ce5e1f0f6"). InnerVolumeSpecName "kube-api-access-rr4ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.930827 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8341b6-bee9-498f-a891-edbde6ebce5b-kube-api-access-5wwnh" (OuterVolumeSpecName: "kube-api-access-5wwnh") pod "bf8341b6-bee9-498f-a891-edbde6ebce5b" (UID: "bf8341b6-bee9-498f-a891-edbde6ebce5b"). InnerVolumeSpecName "kube-api-access-5wwnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.946369 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.966371 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.985995 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 15:59:01 crc kubenswrapper[4939]: I0318 15:59:01.993044 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mp2sj"] Mar 18 15:59:02 crc kubenswrapper[4939]: W0318 15:59:02.000653 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36eb8417_2815_4eea_987e_d05be6eb90e9.slice/crio-72eae1310188b05753171d0524380b41e47b879d10033b781ab8ad8a8f966dbe WatchSource:0}: Error finding container 72eae1310188b05753171d0524380b41e47b879d10033b781ab8ad8a8f966dbe: Status 404 returned error can't find the container with id 72eae1310188b05753171d0524380b41e47b879d10033b781ab8ad8a8f966dbe Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.002570 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.009461 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:59:02 crc kubenswrapper[4939]: W0318 15:59:02.011485 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d3941f5_14fb_4ed6_a715_d4b99cb0961c.slice/crio-dd0d06407d5abc3d9dcf06ed9d0fdbf472198b87c616523ad72d863e2e02762b WatchSource:0}: Error finding container dd0d06407d5abc3d9dcf06ed9d0fdbf472198b87c616523ad72d863e2e02762b: Status 404 returned error can't find the container with id dd0d06407d5abc3d9dcf06ed9d0fdbf472198b87c616523ad72d863e2e02762b Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.014500 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.023862 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8341b6-bee9-498f-a891-edbde6ebce5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.023884 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr4ll\" (UniqueName: \"kubernetes.io/projected/474abbfb-42dd-4854-a29a-253ce5e1f0f6-kube-api-access-rr4ll\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.023919 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.023928 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wwnh\" (UniqueName: \"kubernetes.io/projected/bf8341b6-bee9-498f-a891-edbde6ebce5b-kube-api-access-5wwnh\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.023936 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474abbfb-42dd-4854-a29a-253ce5e1f0f6-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.081763 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.244192 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26f60b5c-7d32-4fea-b3ca-a8132f3ed026","Type":"ContainerStarted","Data":"74b89c6b5b5ee7263a5ce777c34ae6ab461fca0d1cbf245329845c5513760fd1"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.259848 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22b02f38-8ae3-4a43-8df6-370521328921","Type":"ContainerStarted","Data":"17fe9c4329dc0500a1476d7e6d0a87e8b93a83625230903d3cac31c71742ebfa"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.285468 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"36eb8417-2815-4eea-987e-d05be6eb90e9","Type":"ContainerStarted","Data":"72eae1310188b05753171d0524380b41e47b879d10033b781ab8ad8a8f966dbe"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.311749 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" event={"ID":"c4268347-7fb1-4cd7-9354-4d7babd5ce94","Type":"ContainerStarted","Data":"78efc61b71dabc2f4db549d019a7a66a1635c1f54517c7dc601ea920dfe321bc"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.317130 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.318466 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.324576 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn" event={"ID":"474abbfb-42dd-4854-a29a-253ce5e1f0f6","Type":"ContainerDied","Data":"90eaf19c65093f892c82a19b6e8296fc32900664aa551cbde5e43627dd6f2634"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.324714 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kbfrn" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.329933 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.330582 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2rsmj" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.342355 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.342557 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.343040 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.357743 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj" event={"ID":"1d3941f5-14fb-4ed6-a715-d4b99cb0961c","Type":"ContainerStarted","Data":"dd0d06407d5abc3d9dcf06ed9d0fdbf472198b87c616523ad72d863e2e02762b"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.385328 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"658fa1ab-1e7e-42d2-947e-6c74215e15f0","Type":"ContainerStarted","Data":"94be137c43eea6a3392fa2ceb82cc16b626706fa6e74299e1194f94d3c664388"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.392161 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d850ac81-a29e-4e93-9fab-72b6325de52e","Type":"ContainerStarted","Data":"758c253bcb3f90ef736746699cb1d3004b2714782bae77fd585422668a088008"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.397038 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" event={"ID":"73e35ccc-8cdd-45f3-ac02-aa04dd395d50","Type":"ContainerStarted","Data":"e50e97b08ab67450ffb2721880461bf78aa6e47f187d5af44265867638778d24"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.408216 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2e6bea5c-5909-40a3-8b9d-d3072855f3da","Type":"ContainerStarted","Data":"74e9d84fe008f61823cf0755b4ac84a2475bf7351e7b4c5c02867f22a7b133d4"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.414637 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tg244" event={"ID":"bf8341b6-bee9-498f-a891-edbde6ebce5b","Type":"ContainerDied","Data":"1656ae6651f2f013081d14746474d8261b8640048f1fccefbd93eebfe87d01c3"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.414774 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tg244" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.418778 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0aa357a6-3028-4413-b384-0cbf6488f7ef","Type":"ContainerStarted","Data":"4aab8b140c6c62d51bda2069f7bca7427b5d1cd43b25057ff4df8994f7ac04fd"} Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.428636 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kbfrn"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.432895 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.433138 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.433246 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.433399 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zrl\" (UniqueName: \"kubernetes.io/projected/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-kube-api-access-n5zrl\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.433486 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-config\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.433607 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.433710 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.433796 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.437147 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kbfrn"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.473256 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tg244"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.480218 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tg244"] Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535489 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zrl\" (UniqueName: \"kubernetes.io/projected/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-kube-api-access-n5zrl\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535549 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-config\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535577 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535600 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535620 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535666 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535686 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.535702 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.536967 4939 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.537582 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.537803 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-config\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.538142 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.540695 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.541793 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.542015 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.554397 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.554926 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zrl\" (UniqueName: \"kubernetes.io/projected/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-kube-api-access-n5zrl\") pod \"ovsdbserver-sb-0\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.686565 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:02 crc kubenswrapper[4939]: I0318 15:59:02.751005 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-56pdq"] Mar 18 15:59:02 crc kubenswrapper[4939]: W0318 15:59:02.793417 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f68996_05bc_4432_ac98_c730b09c6288.slice/crio-488381bd6e03e3e2e4c448211b41f2111b278f38e3ca8a61d769d6a6cf95a0f6 WatchSource:0}: Error finding container 488381bd6e03e3e2e4c448211b41f2111b278f38e3ca8a61d769d6a6cf95a0f6: Status 404 returned error can't find the container with id 488381bd6e03e3e2e4c448211b41f2111b278f38e3ca8a61d769d6a6cf95a0f6 Mar 18 15:59:03 crc kubenswrapper[4939]: I0318 15:59:03.414180 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:59:03 crc kubenswrapper[4939]: I0318 15:59:03.427061 4939 generic.go:334] "Generic (PLEG): container finished" podID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerID="3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a" exitCode=0 Mar 18 15:59:03 crc kubenswrapper[4939]: I0318 15:59:03.427119 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" event={"ID":"c4268347-7fb1-4cd7-9354-4d7babd5ce94","Type":"ContainerDied","Data":"3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a"} Mar 18 15:59:03 crc kubenswrapper[4939]: I0318 15:59:03.431708 4939 generic.go:334] "Generic (PLEG): container finished" podID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerID="a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85" exitCode=0 Mar 18 15:59:03 crc kubenswrapper[4939]: I0318 15:59:03.431755 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" event={"ID":"73e35ccc-8cdd-45f3-ac02-aa04dd395d50","Type":"ContainerDied","Data":"a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85"} Mar 18 15:59:03 crc kubenswrapper[4939]: I0318 15:59:03.434842 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-56pdq" event={"ID":"52f68996-05bc-4432-ac98-c730b09c6288","Type":"ContainerStarted","Data":"488381bd6e03e3e2e4c448211b41f2111b278f38e3ca8a61d769d6a6cf95a0f6"} Mar 18 15:59:03 crc kubenswrapper[4939]: W0318 15:59:03.876265 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e964ed3_1c22_4d0b_b6eb_45df177b2f33.slice/crio-32bf57067fcb26420b62eb765a24b8522a5894c707ee084079247d1fa428e5ab WatchSource:0}: Error finding container 32bf57067fcb26420b62eb765a24b8522a5894c707ee084079247d1fa428e5ab: Status 404 returned error can't find the container with id 32bf57067fcb26420b62eb765a24b8522a5894c707ee084079247d1fa428e5ab Mar 18 15:59:04 crc kubenswrapper[4939]: I0318 15:59:04.145707 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474abbfb-42dd-4854-a29a-253ce5e1f0f6" path="/var/lib/kubelet/pods/474abbfb-42dd-4854-a29a-253ce5e1f0f6/volumes" Mar 18 15:59:04 crc kubenswrapper[4939]: I0318 15:59:04.146384 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf8341b6-bee9-498f-a891-edbde6ebce5b" path="/var/lib/kubelet/pods/bf8341b6-bee9-498f-a891-edbde6ebce5b/volumes" Mar 18 15:59:04 crc kubenswrapper[4939]: I0318 15:59:04.443729 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"8e964ed3-1c22-4d0b-b6eb-45df177b2f33","Type":"ContainerStarted","Data":"32bf57067fcb26420b62eb765a24b8522a5894c707ee084079247d1fa428e5ab"} Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.811205 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-w8b45"] Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.812633 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.816050 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.846116 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w8b45"] Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.895475 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66d88bf-9a85-4958-a731-258e55b7ae99-config\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.895554 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4zdt\" (UniqueName: \"kubernetes.io/projected/d66d88bf-9a85-4958-a731-258e55b7ae99-kube-api-access-g4zdt\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.895644 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-combined-ca-bundle\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.895667 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovs-rundir\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.895708 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.895822 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovn-rundir\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.993427 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vtfqn"] Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.997055 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovn-rundir\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.997141 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66d88bf-9a85-4958-a731-258e55b7ae99-config\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.997173 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4zdt\" (UniqueName: \"kubernetes.io/projected/d66d88bf-9a85-4958-a731-258e55b7ae99-kube-api-access-g4zdt\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.997220 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-combined-ca-bundle\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.997246 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovs-rundir\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.997270 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.998394 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovn-rundir\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.999035 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66d88bf-9a85-4958-a731-258e55b7ae99-config\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:09 crc kubenswrapper[4939]: I0318 15:59:09.999093 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovs-rundir\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.003191 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.008196 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-combined-ca-bundle\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.020029 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4zdt\" (UniqueName: \"kubernetes.io/projected/d66d88bf-9a85-4958-a731-258e55b7ae99-kube-api-access-g4zdt\") pod \"ovn-controller-metrics-w8b45\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.023861 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qw9r2"] Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.025178 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.029845 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.039041 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qw9r2"] Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.098754 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-config\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.098814 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxwm\" (UniqueName: \"kubernetes.io/projected/c5af3edf-25e9-43a8-9898-383b6704cb85-kube-api-access-dgxwm\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.098848 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.098955 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.163513 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-w8b45" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.192768 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wfpp"] Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.200062 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-config\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.200366 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxwm\" (UniqueName: \"kubernetes.io/projected/c5af3edf-25e9-43a8-9898-383b6704cb85-kube-api-access-dgxwm\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.200485 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.200630 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.201053 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-config\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.201731 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.202200 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.222142 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxwm\" (UniqueName: \"kubernetes.io/projected/c5af3edf-25e9-43a8-9898-383b6704cb85-kube-api-access-dgxwm\") pod \"dnsmasq-dns-7fd796d7df-qw9r2\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.222775 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-tqr9f"] Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.224317 4939 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.226212 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.237969 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-tqr9f"] Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.302205 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.302264 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-config\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.302312 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.302423 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.302451 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d69f\" (UniqueName: \"kubernetes.io/projected/cb736cd5-af39-49e8-8439-5402913687b1-kube-api-access-6d69f\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.399179 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.404377 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.406480 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.406562 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d69f\" (UniqueName: \"kubernetes.io/projected/cb736cd5-af39-49e8-8439-5402913687b1-kube-api-access-6d69f\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.406766 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.406793 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-config\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.406818 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.407961 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-config\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.408367 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.409229 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 
15:59:10.447537 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d69f\" (UniqueName: \"kubernetes.io/projected/cb736cd5-af39-49e8-8439-5402913687b1-kube-api-access-6d69f\") pod \"dnsmasq-dns-86db49b7ff-tqr9f\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:10 crc kubenswrapper[4939]: I0318 15:59:10.572720 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:11 crc kubenswrapper[4939]: I0318 15:59:11.505610 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" event={"ID":"73e35ccc-8cdd-45f3-ac02-aa04dd395d50","Type":"ContainerStarted","Data":"a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a"} Mar 18 15:59:11 crc kubenswrapper[4939]: I0318 15:59:11.506032 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:59:11 crc kubenswrapper[4939]: I0318 15:59:11.505760 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" podUID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerName="dnsmasq-dns" containerID="cri-o://a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a" gracePeriod=10 Mar 18 15:59:11 crc kubenswrapper[4939]: I0318 15:59:11.532219 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" podStartSLOduration=22.005513076 podStartE2EDuration="22.532201553s" podCreationTimestamp="2026-03-18 15:58:49 +0000 UTC" firstStartedPulling="2026-03-18 15:59:01.569295442 +0000 UTC m=+1306.168483063" lastFinishedPulling="2026-03-18 15:59:02.095983919 +0000 UTC m=+1306.695171540" observedRunningTime="2026-03-18 15:59:11.52854205 +0000 UTC m=+1316.127729671" watchObservedRunningTime="2026-03-18 15:59:11.532201553 +0000 UTC m=+1316.131389174" Mar 18 15:59:11 crc kubenswrapper[4939]: I0318 15:59:11.559726 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w8b45"] Mar 18 15:59:11 crc kubenswrapper[4939]: I0318 15:59:11.644362 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qw9r2"] Mar 18 15:59:11 crc kubenswrapper[4939]: I0318 15:59:11.713125 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-tqr9f"] Mar 18 15:59:11 crc kubenswrapper[4939]: W0318 15:59:11.953204 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5af3edf_25e9_43a8_9898_383b6704cb85.slice/crio-518a223778c2995bb27e2aa72b79a378a1d727aa83cba954141c089251e2b13d WatchSource:0}: Error finding container 518a223778c2995bb27e2aa72b79a378a1d727aa83cba954141c089251e2b13d: Status 404 returned error can't find the container with id 518a223778c2995bb27e2aa72b79a378a1d727aa83cba954141c089251e2b13d Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.385685 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.441654 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-config\") pod \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.441960 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-dns-svc\") pod \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.442027 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8msn\" (UniqueName: \"kubernetes.io/projected/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-kube-api-access-p8msn\") pod \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\" (UID: \"73e35ccc-8cdd-45f3-ac02-aa04dd395d50\") " Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.455863 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-kube-api-access-p8msn" (OuterVolumeSpecName: "kube-api-access-p8msn") pod "73e35ccc-8cdd-45f3-ac02-aa04dd395d50" (UID: "73e35ccc-8cdd-45f3-ac02-aa04dd395d50"). InnerVolumeSpecName "kube-api-access-p8msn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.514601 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" event={"ID":"cb736cd5-af39-49e8-8439-5402913687b1","Type":"ContainerStarted","Data":"6f17af1424f14b0e72b51e8dd2282dc6917c2178b6ebbb42cdb417e4db006f3d"} Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.517701 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" event={"ID":"c4268347-7fb1-4cd7-9354-4d7babd5ce94","Type":"ContainerStarted","Data":"5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf"} Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.517847 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" podUID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerName="dnsmasq-dns" containerID="cri-o://5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf" gracePeriod=10 Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.518038 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.520035 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" event={"ID":"c5af3edf-25e9-43a8-9898-383b6704cb85","Type":"ContainerStarted","Data":"518a223778c2995bb27e2aa72b79a378a1d727aa83cba954141c089251e2b13d"} Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.524458 4939 generic.go:334] "Generic (PLEG): container finished" podID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerID="a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a" exitCode=0 Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.524557 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" 
event={"ID":"73e35ccc-8cdd-45f3-ac02-aa04dd395d50","Type":"ContainerDied","Data":"a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a"} Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.524601 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.524639 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wfpp" event={"ID":"73e35ccc-8cdd-45f3-ac02-aa04dd395d50","Type":"ContainerDied","Data":"e50e97b08ab67450ffb2721880461bf78aa6e47f187d5af44265867638778d24"} Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.524668 4939 scope.go:117] "RemoveContainer" containerID="a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.526195 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w8b45" event={"ID":"d66d88bf-9a85-4958-a731-258e55b7ae99","Type":"ContainerStarted","Data":"d2ce4dbece6686c3452b2c0b1ad45fe5a02108b975ec72d873f64d8337a53996"} Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.539916 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" podStartSLOduration=23.049402702 podStartE2EDuration="23.539895366s" podCreationTimestamp="2026-03-18 15:58:49 +0000 UTC" firstStartedPulling="2026-03-18 15:59:01.611233887 +0000 UTC m=+1306.210421508" lastFinishedPulling="2026-03-18 15:59:02.101726551 +0000 UTC m=+1306.700914172" observedRunningTime="2026-03-18 15:59:12.537818867 +0000 UTC m=+1317.137006498" watchObservedRunningTime="2026-03-18 15:59:12.539895366 +0000 UTC m=+1317.139083007" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.545034 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8msn\" (UniqueName: \"kubernetes.io/projected/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-kube-api-access-p8msn\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.769195 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-config" (OuterVolumeSpecName: "config") pod "73e35ccc-8cdd-45f3-ac02-aa04dd395d50" (UID: "73e35ccc-8cdd-45f3-ac02-aa04dd395d50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.787234 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73e35ccc-8cdd-45f3-ac02-aa04dd395d50" (UID: "73e35ccc-8cdd-45f3-ac02-aa04dd395d50"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.849680 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.849708 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73e35ccc-8cdd-45f3-ac02-aa04dd395d50-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.903876 4939 scope.go:117] "RemoveContainer" containerID="a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85" Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.957952 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wfpp"] Mar 18 15:59:12 crc kubenswrapper[4939]: I0318 15:59:12.968444 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wfpp"] Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.001819 4939 scope.go:117] "RemoveContainer" containerID="a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a" Mar 18 15:59:13 crc kubenswrapper[4939]: E0318 15:59:13.002181 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a\": container with ID starting with a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a not found: ID does not exist" containerID="a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.002233 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a"} err="failed to get container status \"a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a\": rpc error: code = NotFound desc = could not find container \"a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a\": container with ID starting with a1875897c6541779bd8ac0dc854a4b6b0d235d854b7e21ae13bd117643dcb07a not found: ID does not exist" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.002261 4939 scope.go:117] "RemoveContainer" containerID="a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85" Mar 18 15:59:13 crc kubenswrapper[4939]: E0318 15:59:13.004524 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85\": container with ID starting with a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85 not found: ID does not exist" containerID="a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.004550 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85"} err="failed to get container status \"a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85\": rpc error: code = NotFound desc = could not find container \"a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85\": container with ID starting with a3d722df81e16e730fd75a21aaf72794e8800ada6ca759d1fde3319eb0f74f85 not found: ID does not exist" Mar 18 
15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.163821 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.258099 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2gzw\" (UniqueName: \"kubernetes.io/projected/c4268347-7fb1-4cd7-9354-4d7babd5ce94-kube-api-access-r2gzw\") pod \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.258426 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-config\") pod \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.258685 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-dns-svc\") pod \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\" (UID: \"c4268347-7fb1-4cd7-9354-4d7babd5ce94\") " Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.263587 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4268347-7fb1-4cd7-9354-4d7babd5ce94-kube-api-access-r2gzw" (OuterVolumeSpecName: "kube-api-access-r2gzw") pod "c4268347-7fb1-4cd7-9354-4d7babd5ce94" (UID: "c4268347-7fb1-4cd7-9354-4d7babd5ce94"). InnerVolumeSpecName "kube-api-access-r2gzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.302393 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-config" (OuterVolumeSpecName: "config") pod "c4268347-7fb1-4cd7-9354-4d7babd5ce94" (UID: "c4268347-7fb1-4cd7-9354-4d7babd5ce94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.304917 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4268347-7fb1-4cd7-9354-4d7babd5ce94" (UID: "c4268347-7fb1-4cd7-9354-4d7babd5ce94"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4939]: E0318 15:59:13.357084 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5af3edf_25e9_43a8_9898_383b6704cb85.slice/crio-conmon-1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f68996_05bc_4432_ac98_c730b09c6288.slice/crio-conmon-d8c4a29ac08df81c707c9e1ea61ddab446599268aa2b11f172231c3581a36abf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f68996_05bc_4432_ac98_c730b09c6288.slice/crio-d8c4a29ac08df81c707c9e1ea61ddab446599268aa2b11f172231c3581a36abf.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.362081 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.362120 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2gzw\" (UniqueName: \"kubernetes.io/projected/c4268347-7fb1-4cd7-9354-4d7babd5ce94-kube-api-access-r2gzw\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.362133 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4268347-7fb1-4cd7-9354-4d7babd5ce94-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.535538 4939 generic.go:334] "Generic (PLEG): container finished" podID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerID="5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf" exitCode=0 Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.535579 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.535606 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" event={"ID":"c4268347-7fb1-4cd7-9354-4d7babd5ce94","Type":"ContainerDied","Data":"5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.535631 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-vtfqn" event={"ID":"c4268347-7fb1-4cd7-9354-4d7babd5ce94","Type":"ContainerDied","Data":"78efc61b71dabc2f4db549d019a7a66a1635c1f54517c7dc601ea920dfe321bc"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.535647 4939 scope.go:117] "RemoveContainer" containerID="5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.538112 4939 generic.go:334] "Generic (PLEG): container finished" podID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerID="1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909" exitCode=0 Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.538180 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" event={"ID":"c5af3edf-25e9-43a8-9898-383b6704cb85","Type":"ContainerDied","Data":"1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.545167 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"658fa1ab-1e7e-42d2-947e-6c74215e15f0","Type":"ContainerStarted","Data":"7ef584cd49ae7f589ecbcd4e4179803996e249f944e8b538430057de2525fe26"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.545289 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.549375 4939 generic.go:334] "Generic (PLEG): container finished" podID="52f68996-05bc-4432-ac98-c730b09c6288" containerID="d8c4a29ac08df81c707c9e1ea61ddab446599268aa2b11f172231c3581a36abf" exitCode=0 Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.549428 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-56pdq" event={"ID":"52f68996-05bc-4432-ac98-c730b09c6288","Type":"ContainerDied","Data":"d8c4a29ac08df81c707c9e1ea61ddab446599268aa2b11f172231c3581a36abf"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.552231 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d850ac81-a29e-4e93-9fab-72b6325de52e","Type":"ContainerStarted","Data":"55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.570146 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22b02f38-8ae3-4a43-8df6-370521328921","Type":"ContainerStarted","Data":"4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.573877 4939 scope.go:117] "RemoveContainer" containerID="3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.577749 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"8e964ed3-1c22-4d0b-b6eb-45df177b2f33","Type":"ContainerStarted","Data":"42c99df8c4f51d15393e20d51c9feb5cd6360994ff5aa297b56c8b97ec9c449e"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.611584 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"36eb8417-2815-4eea-987e-d05be6eb90e9","Type":"ContainerStarted","Data":"e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.612394 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.614953 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0aa357a6-3028-4413-b384-0cbf6488f7ef","Type":"ContainerStarted","Data":"9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.617297 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj" event={"ID":"1d3941f5-14fb-4ed6-a715-d4b99cb0961c","Type":"ContainerStarted","Data":"cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.618062 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mp2sj" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.620639 4939 generic.go:334] "Generic (PLEG): container finished" podID="cb736cd5-af39-49e8-8439-5402913687b1" containerID="58362e26f9aa16b37dc396a86b360760d736a0ae7110dc366d3fffde8af0c835" exitCode=0 Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.620712 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" event={"ID":"cb736cd5-af39-49e8-8439-5402913687b1","Type":"ContainerDied","Data":"58362e26f9aa16b37dc396a86b360760d736a0ae7110dc366d3fffde8af0c835"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.624765 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.067072046 podStartE2EDuration="20.624748839s" podCreationTimestamp="2026-03-18 15:58:53 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.013917699 +0000 UTC m=+1306.613105320" lastFinishedPulling="2026-03-18 15:59:10.571594502 +0000 UTC m=+1315.170782113" observedRunningTime="2026-03-18 15:59:13.609013575 +0000 UTC m=+1318.208201206" watchObservedRunningTime="2026-03-18 15:59:13.624748839 +0000 UTC m=+1318.223936460" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.626429 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2e6bea5c-5909-40a3-8b9d-d3072855f3da","Type":"ContainerStarted","Data":"1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa"} Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.652791 4939 scope.go:117] "RemoveContainer" containerID="5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf" Mar 18 15:59:13 crc kubenswrapper[4939]: E0318 15:59:13.658056 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf\": container with ID starting with 5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf not found: ID does not exist" containerID="5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf" Mar 18 
15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.658119 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf"} err="failed to get container status \"5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf\": rpc error: code = NotFound desc = could not find container \"5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf\": container with ID starting with 5c57bb02502ea7ea38224674b58c7d4fea19592ccfc9d76747b71f7b3d14f0cf not found: ID does not exist" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.658152 4939 scope.go:117] "RemoveContainer" containerID="3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a" Mar 18 15:59:13 crc kubenswrapper[4939]: E0318 15:59:13.665454 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a\": container with ID starting with 3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a not found: ID does not exist" containerID="3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.665517 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a"} err="failed to get container status \"3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a\": rpc error: code = NotFound desc = could not find container \"3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a\": container with ID starting with 3d7fca19eb7fd597f6367fe16852b28c5b1109346466bb22e6a901f60509505a not found: ID does not exist" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.698357 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mp2sj" podStartSLOduration=6.609922275 podStartE2EDuration="15.69833827s" podCreationTimestamp="2026-03-18 15:58:58 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.020216827 +0000 UTC m=+1306.619404448" lastFinishedPulling="2026-03-18 15:59:11.108632822 +0000 UTC m=+1315.707820443" observedRunningTime="2026-03-18 15:59:13.698053302 +0000 UTC m=+1318.297240923" watchObservedRunningTime="2026-03-18 15:59:13.69833827 +0000 UTC m=+1318.297525891" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.730205 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.518432365 podStartE2EDuration="18.73018633s" podCreationTimestamp="2026-03-18 15:58:55 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.00617013 +0000 UTC m=+1306.605357741" lastFinishedPulling="2026-03-18 15:59:12.217924085 +0000 UTC m=+1316.817111706" observedRunningTime="2026-03-18 15:59:13.729941503 +0000 UTC m=+1318.329129144" watchObservedRunningTime="2026-03-18 15:59:13.73018633 +0000 UTC m=+1318.329373951" Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.751614 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vtfqn"] Mar 18 15:59:13 crc kubenswrapper[4939]: I0318 15:59:13.762178 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vtfqn"] Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.144971 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" path="/var/lib/kubelet/pods/73e35ccc-8cdd-45f3-ac02-aa04dd395d50/volumes" Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.145955 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" path="/var/lib/kubelet/pods/c4268347-7fb1-4cd7-9354-4d7babd5ce94/volumes" Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.634127 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26f60b5c-7d32-4fea-b3ca-a8132f3ed026","Type":"ContainerStarted","Data":"7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52"} Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.638812 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" event={"ID":"c5af3edf-25e9-43a8-9898-383b6704cb85","Type":"ContainerStarted","Data":"00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375"} Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.639071 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.640606 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" event={"ID":"cb736cd5-af39-49e8-8439-5402913687b1","Type":"ContainerStarted","Data":"01ef4d987b8440069b0987a2fdbd2ef4152e322b8eb7d478c1906bb4341623bb"} Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.641387 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.643558 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-56pdq" event={"ID":"52f68996-05bc-4432-ac98-c730b09c6288","Type":"ContainerStarted","Data":"de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a"} Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.675365 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" podStartSLOduration=4.675345705 podStartE2EDuration="4.675345705s" podCreationTimestamp="2026-03-18 15:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:14.672364461 +0000 UTC m=+1319.271552102" watchObservedRunningTime="2026-03-18 15:59:14.675345705 +0000 UTC m=+1319.274533326" Mar 18 15:59:14 crc kubenswrapper[4939]: I0318 15:59:14.693929 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" podStartSLOduration=5.69391173 podStartE2EDuration="5.69391173s" podCreationTimestamp="2026-03-18 15:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:14.685235654 +0000 UTC m=+1319.284423275" watchObservedRunningTime="2026-03-18 15:59:14.69391173 +0000 UTC m=+1319.293099351" Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.660965 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w8b45" event={"ID":"d66d88bf-9a85-4958-a731-258e55b7ae99","Type":"ContainerStarted","Data":"8c56802c6a7daa3601b078a23b4cc0855237da751647d5e5c060ea419f09f0f9"} Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.664877 4939 generic.go:334] "Generic (PLEG): container finished" 
podID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerID="1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa" exitCode=0 Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.664917 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2e6bea5c-5909-40a3-8b9d-d3072855f3da","Type":"ContainerDied","Data":"1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa"} Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.670071 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-56pdq" event={"ID":"52f68996-05bc-4432-ac98-c730b09c6288","Type":"ContainerStarted","Data":"8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c"} Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.670243 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.670268 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-56pdq" Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.673150 4939 generic.go:334] "Generic (PLEG): container finished" podID="22b02f38-8ae3-4a43-8df6-370521328921" containerID="4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7" exitCode=0 Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.673249 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22b02f38-8ae3-4a43-8df6-370521328921","Type":"ContainerDied","Data":"4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7"} Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.676060 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e964ed3-1c22-4d0b-b6eb-45df177b2f33","Type":"ContainerStarted","Data":"4e455ae5d5a3238bfc635b57f221285060c55f8f3b0f69f228d7592bc6b0442d"} Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.679670 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0aa357a6-3028-4413-b384-0cbf6488f7ef","Type":"ContainerStarted","Data":"16ff00962bb2fdbc01658731042bb863bfb945a5da3765b64fe956c1721303df"} Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.696942 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-w8b45" podStartSLOduration=3.928576552 podStartE2EDuration="7.696907444s" podCreationTimestamp="2026-03-18 15:59:09 +0000 UTC" firstStartedPulling="2026-03-18 15:59:11.98779164 +0000 UTC m=+1316.586979261" lastFinishedPulling="2026-03-18 15:59:15.756122532 +0000 UTC m=+1320.355310153" observedRunningTime="2026-03-18 15:59:16.68615294 +0000 UTC m=+1321.285340601" watchObservedRunningTime="2026-03-18 15:59:16.696907444 +0000 UTC m=+1321.296095095" Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.734884 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.079041926 podStartE2EDuration="18.734861817s" podCreationTimestamp="2026-03-18 15:58:58 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.099850528 +0000 UTC m=+1306.699038149" lastFinishedPulling="2026-03-18 15:59:15.755670419 +0000 UTC m=+1320.354858040" observedRunningTime="2026-03-18 15:59:16.729861135 +0000 UTC m=+1321.329048776" watchObservedRunningTime="2026-03-18 15:59:16.734861817 +0000 UTC m=+1321.334049438" Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 
15:59:16.869915 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-56pdq" podStartSLOduration=11.039634 podStartE2EDuration="18.869897773s" podCreationTimestamp="2026-03-18 15:58:58 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.807414077 +0000 UTC m=+1307.406601688" lastFinishedPulling="2026-03-18 15:59:10.63767784 +0000 UTC m=+1315.236865461" observedRunningTime="2026-03-18 15:59:16.831103757 +0000 UTC m=+1321.430291378" watchObservedRunningTime="2026-03-18 15:59:16.869897773 +0000 UTC m=+1321.469085394" Mar 18 15:59:16 crc kubenswrapper[4939]: I0318 15:59:16.883637 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.992292354 podStartE2EDuration="15.883617111s" podCreationTimestamp="2026-03-18 15:59:01 +0000 UTC" firstStartedPulling="2026-03-18 15:59:03.884041119 +0000 UTC m=+1308.483228740" lastFinishedPulling="2026-03-18 15:59:15.775365876 +0000 UTC m=+1320.374553497" observedRunningTime="2026-03-18 15:59:16.861954599 +0000 UTC m=+1321.461142220" watchObservedRunningTime="2026-03-18 15:59:16.883617111 +0000 UTC m=+1321.482804732" Mar 18 15:59:17 crc kubenswrapper[4939]: I0318 15:59:17.687276 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:17 crc kubenswrapper[4939]: I0318 15:59:17.687684 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:17 crc kubenswrapper[4939]: I0318 15:59:17.689326 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2e6bea5c-5909-40a3-8b9d-d3072855f3da","Type":"ContainerStarted","Data":"58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29"} Mar 18 15:59:17 crc kubenswrapper[4939]: I0318 15:59:17.691537 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22b02f38-8ae3-4a43-8df6-370521328921","Type":"ContainerStarted","Data":"4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb"} Mar 18 15:59:17 crc kubenswrapper[4939]: I0318 15:59:17.716752 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.09377248 podStartE2EDuration="26.716736019s" podCreationTimestamp="2026-03-18 15:58:51 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.014736542 +0000 UTC m=+1306.613924163" lastFinishedPulling="2026-03-18 15:59:10.637700081 +0000 UTC m=+1315.236887702" observedRunningTime="2026-03-18 15:59:17.711262555 +0000 UTC m=+1322.310450186" watchObservedRunningTime="2026-03-18 15:59:17.716736019 +0000 UTC m=+1322.315923640" Mar 18 15:59:17 crc kubenswrapper[4939]: I0318 15:59:17.735288 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.62462077 podStartE2EDuration="27.735274283s" podCreationTimestamp="2026-03-18 15:58:50 +0000 UTC" firstStartedPulling="2026-03-18 15:59:01.998347419 +0000 UTC m=+1306.597535040" lastFinishedPulling="2026-03-18 15:59:11.109000942 +0000 UTC m=+1315.708188553" observedRunningTime="2026-03-18 15:59:17.731809376 +0000 UTC m=+1322.330996997" watchObservedRunningTime="2026-03-18 15:59:17.735274283 +0000 UTC m=+1322.334461904" Mar 18 15:59:17 crc kubenswrapper[4939]: I0318 15:59:17.755976 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 
18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.192105 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.237051 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.430534 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.697831 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.736670 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.752324 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.982294 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 15:59:18 crc kubenswrapper[4939]: E0318 15:59:18.982627 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerName="init" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.982642 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerName="init" Mar 18 15:59:18 crc kubenswrapper[4939]: E0318 15:59:18.982665 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerName="init" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.982672 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerName="init" Mar 18 15:59:18 crc kubenswrapper[4939]: E0318 15:59:18.982689 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerName="dnsmasq-dns" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.982696 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerName="dnsmasq-dns" Mar 18 15:59:18 crc kubenswrapper[4939]: E0318 15:59:18.982716 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerName="dnsmasq-dns" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.982721 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerName="dnsmasq-dns" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.982866 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4268347-7fb1-4cd7-9354-4d7babd5ce94" containerName="dnsmasq-dns" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.982877 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e35ccc-8cdd-45f3-ac02-aa04dd395d50" containerName="dnsmasq-dns" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.989969 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.993555 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.993843 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.994051 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 15:59:18 crc kubenswrapper[4939]: I0318 15:59:18.994431 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ftt6q" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.009094 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.171245 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-config\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.171616 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.171658 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-scripts\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.171714 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.171742 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.171783 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62f5k\" (UniqueName: \"kubernetes.io/projected/18740c60-7bc8-4daa-a426-1aa624b7ac8a-kube-api-access-62f5k\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.171809 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: 
I0318 15:59:19.273014 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.273473 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.273576 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.274296 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62f5k\" (UniqueName: \"kubernetes.io/projected/18740c60-7bc8-4daa-a426-1aa624b7ac8a-kube-api-access-62f5k\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.274325 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.274407 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-config\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.274431 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.274469 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-scripts\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.275177 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-scripts\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.275340 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-config\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.278985 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.279158 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.292122 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62f5k\" (UniqueName: \"kubernetes.io/projected/18740c60-7bc8-4daa-a426-1aa624b7ac8a-kube-api-access-62f5k\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.295267 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.307961 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 15:59:19 crc kubenswrapper[4939]: I0318 15:59:19.777043 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 15:59:19 crc kubenswrapper[4939]: W0318 15:59:19.780562 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18740c60_7bc8_4daa_a426_1aa624b7ac8a.slice/crio-cd17bd8a7f0793d03168519bbe0564c8c7b33342cf0477f35594b7efc8057f3b WatchSource:0}: Error finding container cd17bd8a7f0793d03168519bbe0564c8c7b33342cf0477f35594b7efc8057f3b: Status 404 returned error can't find the container with id cd17bd8a7f0793d03168519bbe0564c8c7b33342cf0477f35594b7efc8057f3b Mar 18 15:59:20 crc kubenswrapper[4939]: I0318 15:59:20.400702 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:20 crc kubenswrapper[4939]: I0318 15:59:20.574718 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" Mar 18 15:59:20 crc kubenswrapper[4939]: I0318 15:59:20.638834 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qw9r2"] Mar 18 15:59:20 crc kubenswrapper[4939]: I0318 15:59:20.712192 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18740c60-7bc8-4daa-a426-1aa624b7ac8a","Type":"ContainerStarted","Data":"cd17bd8a7f0793d03168519bbe0564c8c7b33342cf0477f35594b7efc8057f3b"} Mar 18 15:59:20 crc kubenswrapper[4939]: I0318 15:59:20.712384 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" podUID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerName="dnsmasq-dns" containerID="cri-o://00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375" gracePeriod=10 Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.324789 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.415927 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxwm\" (UniqueName: \"kubernetes.io/projected/c5af3edf-25e9-43a8-9898-383b6704cb85-kube-api-access-dgxwm\") pod \"c5af3edf-25e9-43a8-9898-383b6704cb85\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.416696 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-dns-svc\") pod \"c5af3edf-25e9-43a8-9898-383b6704cb85\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.417194 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-ovsdbserver-nb\") pod \"c5af3edf-25e9-43a8-9898-383b6704cb85\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.417269 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-config\") pod \"c5af3edf-25e9-43a8-9898-383b6704cb85\" (UID: \"c5af3edf-25e9-43a8-9898-383b6704cb85\") " Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.420078 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5af3edf-25e9-43a8-9898-383b6704cb85-kube-api-access-dgxwm" (OuterVolumeSpecName: "kube-api-access-dgxwm") pod "c5af3edf-25e9-43a8-9898-383b6704cb85" (UID: "c5af3edf-25e9-43a8-9898-383b6704cb85"). InnerVolumeSpecName "kube-api-access-dgxwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.468012 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5af3edf-25e9-43a8-9898-383b6704cb85" (UID: "c5af3edf-25e9-43a8-9898-383b6704cb85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.469260 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-config" (OuterVolumeSpecName: "config") pod "c5af3edf-25e9-43a8-9898-383b6704cb85" (UID: "c5af3edf-25e9-43a8-9898-383b6704cb85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.469796 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5af3edf-25e9-43a8-9898-383b6704cb85" (UID: "c5af3edf-25e9-43a8-9898-383b6704cb85"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.518730 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgxwm\" (UniqueName: \"kubernetes.io/projected/c5af3edf-25e9-43a8-9898-383b6704cb85-kube-api-access-dgxwm\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.519166 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.519182 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.519193 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5af3edf-25e9-43a8-9898-383b6704cb85-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.719847 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18740c60-7bc8-4daa-a426-1aa624b7ac8a","Type":"ContainerStarted","Data":"c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e"} Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.719912 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18740c60-7bc8-4daa-a426-1aa624b7ac8a","Type":"ContainerStarted","Data":"0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad"} Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.720025 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.721793 4939 generic.go:334] "Generic (PLEG): container finished" podID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerID="00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375" exitCode=0 Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.721832 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" event={"ID":"c5af3edf-25e9-43a8-9898-383b6704cb85","Type":"ContainerDied","Data":"00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375"} Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.721866 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" event={"ID":"c5af3edf-25e9-43a8-9898-383b6704cb85","Type":"ContainerDied","Data":"518a223778c2995bb27e2aa72b79a378a1d727aa83cba954141c089251e2b13d"} Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.721887 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-qw9r2" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.721888 4939 scope.go:117] "RemoveContainer" containerID="00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.739275 4939 scope.go:117] "RemoveContainer" containerID="1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.746703 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.420616065 podStartE2EDuration="3.746678156s" podCreationTimestamp="2026-03-18 15:59:18 +0000 UTC" firstStartedPulling="2026-03-18 15:59:19.783462325 +0000 UTC m=+1324.382649946" lastFinishedPulling="2026-03-18 15:59:21.109524416 +0000 UTC m=+1325.708712037" observedRunningTime="2026-03-18 15:59:21.739627946 +0000 UTC m=+1326.338815567" watchObservedRunningTime="2026-03-18 15:59:21.746678156 +0000 UTC m=+1326.345865777" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.758551 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qw9r2"] Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.769318 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qw9r2"] Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.772812 4939 scope.go:117] "RemoveContainer" containerID="00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375" Mar 18 15:59:21 crc kubenswrapper[4939]: E0318 15:59:21.773315 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375\": container with ID starting with 00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375 not found: ID does not exist" containerID="00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.773356 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375"} err="failed to get container status \"00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375\": rpc error: code = NotFound desc = could not find container \"00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375\": container with ID starting with 00510284b9c5149077cb90ffb64c1505258789591941105815313e3752652375 not found: ID does not exist" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.773382 4939 scope.go:117] "RemoveContainer" containerID="1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909" Mar 18 15:59:21 crc kubenswrapper[4939]: E0318 15:59:21.773809 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909\": container with ID starting with 1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909 not found: ID does not exist" containerID="1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.773833 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909"} err="failed to get container status 
\"1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909\": rpc error: code = NotFound desc = could not find container \"1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909\": container with ID starting with 1b911959c68a1f62a94badb4853aa1ae11e5b4629cc0631537d38752a1cf4909 not found: ID does not exist" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.842716 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.843151 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 15:59:21 crc kubenswrapper[4939]: I0318 15:59:21.922735 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 15:59:22 crc kubenswrapper[4939]: I0318 15:59:22.142173 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5af3edf-25e9-43a8-9898-383b6704cb85" path="/var/lib/kubelet/pods/c5af3edf-25e9-43a8-9898-383b6704cb85/volumes" Mar 18 15:59:22 crc kubenswrapper[4939]: I0318 15:59:22.834642 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 15:59:23 crc kubenswrapper[4939]: I0318 15:59:23.176198 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 15:59:23 crc kubenswrapper[4939]: I0318 15:59:23.176253 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 15:59:23 crc kubenswrapper[4939]: I0318 15:59:23.260825 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 15:59:23 crc kubenswrapper[4939]: I0318 15:59:23.835274 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.289296 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fdf2-account-create-update-5wwtx"] Mar 18 15:59:24 crc kubenswrapper[4939]: E0318 15:59:24.290091 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerName="init" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.290117 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerName="init" Mar 18 15:59:24 crc kubenswrapper[4939]: E0318 15:59:24.290187 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerName="dnsmasq-dns" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.290199 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerName="dnsmasq-dns" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.290430 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5af3edf-25e9-43a8-9898-383b6704cb85" containerName="dnsmasq-dns" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.291101 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.296017 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.296610 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fdf2-account-create-update-5wwtx"] Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.346896 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-f7sh4"] Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.347883 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.362612 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3247c802-2337-43e1-b292-56c7f5c520c2-operator-scripts\") pod \"keystone-fdf2-account-create-update-5wwtx\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.362667 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9kjh\" (UniqueName: \"kubernetes.io/projected/3247c802-2337-43e1-b292-56c7f5c520c2-kube-api-access-v9kjh\") pod \"keystone-fdf2-account-create-update-5wwtx\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.369395 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f7sh4"] Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.464413 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/961b9311-60b6-40e9-839a-a40bb6859bb3-operator-scripts\") pod \"keystone-db-create-f7sh4\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.464471 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkhhp\" (UniqueName: \"kubernetes.io/projected/961b9311-60b6-40e9-839a-a40bb6859bb3-kube-api-access-qkhhp\") pod \"keystone-db-create-f7sh4\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.464622 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3247c802-2337-43e1-b292-56c7f5c520c2-operator-scripts\") pod \"keystone-fdf2-account-create-update-5wwtx\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.464665 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9kjh\" (UniqueName: \"kubernetes.io/projected/3247c802-2337-43e1-b292-56c7f5c520c2-kube-api-access-v9kjh\") pod \"keystone-fdf2-account-create-update-5wwtx\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.465482 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3247c802-2337-43e1-b292-56c7f5c520c2-operator-scripts\") pod \"keystone-fdf2-account-create-update-5wwtx\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.486647 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9kjh\" (UniqueName: \"kubernetes.io/projected/3247c802-2337-43e1-b292-56c7f5c520c2-kube-api-access-v9kjh\") pod \"keystone-fdf2-account-create-update-5wwtx\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.526955 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hlnxc"] Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.527880 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.538197 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-aa27-account-create-update-zrtw8"] Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.539175 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.541176 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.547135 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hlnxc"] Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.564732 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-aa27-account-create-update-zrtw8"] Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.565602 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zpqk\" (UniqueName: \"kubernetes.io/projected/71d349c7-2307-47dd-a696-adfdfba42e1e-kube-api-access-2zpqk\") pod \"placement-db-create-hlnxc\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.565751 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71d349c7-2307-47dd-a696-adfdfba42e1e-operator-scripts\") pod \"placement-db-create-hlnxc\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.565773 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/961b9311-60b6-40e9-839a-a40bb6859bb3-operator-scripts\") pod \"keystone-db-create-f7sh4\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.565789 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkhhp\" (UniqueName: \"kubernetes.io/projected/961b9311-60b6-40e9-839a-a40bb6859bb3-kube-api-access-qkhhp\") pod \"keystone-db-create-f7sh4\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " 
pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.566581 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/961b9311-60b6-40e9-839a-a40bb6859bb3-operator-scripts\") pod \"keystone-db-create-f7sh4\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.582726 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkhhp\" (UniqueName: \"kubernetes.io/projected/961b9311-60b6-40e9-839a-a40bb6859bb3-kube-api-access-qkhhp\") pod \"keystone-db-create-f7sh4\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.608870 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.667571 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zpqk\" (UniqueName: \"kubernetes.io/projected/71d349c7-2307-47dd-a696-adfdfba42e1e-kube-api-access-2zpqk\") pod \"placement-db-create-hlnxc\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.667717 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl995\" (UniqueName: \"kubernetes.io/projected/706a1ac1-f38c-4710-a889-b2799e52a652-kube-api-access-rl995\") pod \"placement-aa27-account-create-update-zrtw8\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.667779 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706a1ac1-f38c-4710-a889-b2799e52a652-operator-scripts\") pod \"placement-aa27-account-create-update-zrtw8\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.667816 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71d349c7-2307-47dd-a696-adfdfba42e1e-operator-scripts\") pod \"placement-db-create-hlnxc\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.669121 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71d349c7-2307-47dd-a696-adfdfba42e1e-operator-scripts\") pod \"placement-db-create-hlnxc\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.671846 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.690012 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zpqk\" (UniqueName: \"kubernetes.io/projected/71d349c7-2307-47dd-a696-adfdfba42e1e-kube-api-access-2zpqk\") pod \"placement-db-create-hlnxc\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.769568 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl995\" (UniqueName: \"kubernetes.io/projected/706a1ac1-f38c-4710-a889-b2799e52a652-kube-api-access-rl995\") pod \"placement-aa27-account-create-update-zrtw8\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.769954 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706a1ac1-f38c-4710-a889-b2799e52a652-operator-scripts\") pod \"placement-aa27-account-create-update-zrtw8\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.770772 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706a1ac1-f38c-4710-a889-b2799e52a652-operator-scripts\") pod \"placement-aa27-account-create-update-zrtw8\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.794788 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl995\" (UniqueName: \"kubernetes.io/projected/706a1ac1-f38c-4710-a889-b2799e52a652-kube-api-access-rl995\") pod \"placement-aa27-account-create-update-zrtw8\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.851650 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:24 crc kubenswrapper[4939]: I0318 15:59:24.868995 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.176775 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fdf2-account-create-update-5wwtx"] Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.352391 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f7sh4"] Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.412797 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-d2bg8"] Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.414145 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.424221 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-d2bg8"] Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.468042 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.492732 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-aa27-account-create-update-zrtw8"] Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.494312 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jkg\" (UniqueName: \"kubernetes.io/projected/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-kube-api-access-p5jkg\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.494405 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-dns-svc\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.494478 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.495030 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-config\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.495136 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.556208 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hlnxc"] Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.595843 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jkg\" (UniqueName: \"kubernetes.io/projected/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-kube-api-access-p5jkg\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.595919 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-dns-svc\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 
18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.595956 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.595993 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-config\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.596041 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.596896 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.597741 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-dns-svc\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.598268 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.598993 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-config\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.629224 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jkg\" (UniqueName: \"kubernetes.io/projected/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-kube-api-access-p5jkg\") pod \"dnsmasq-dns-698758b865-d2bg8\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.733555 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.757446 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fdf2-account-create-update-5wwtx" event={"ID":"3247c802-2337-43e1-b292-56c7f5c520c2","Type":"ContainerStarted","Data":"dfaa1d5274ae3a52d2daa5b36bb108871f5aa90d9caef7604fc994fe36eb6d68"} Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.781089 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hlnxc" event={"ID":"71d349c7-2307-47dd-a696-adfdfba42e1e","Type":"ContainerStarted","Data":"9dd6e93e98a1691589611c4eec360191726b80dd894edfbcb6016cb73d2e1377"} Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.787316 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-aa27-account-create-update-zrtw8" event={"ID":"706a1ac1-f38c-4710-a889-b2799e52a652","Type":"ContainerStarted","Data":"102879e7e6c1fad41be5a3fc6e50e765ed8d2b7ee0b202e934da14df34235fee"} Mar 18 15:59:25 crc kubenswrapper[4939]: I0318 15:59:25.788951 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f7sh4" event={"ID":"961b9311-60b6-40e9-839a-a40bb6859bb3","Type":"ContainerStarted","Data":"c8d7ff19a8cfe70f801770258c277066110e6ef16807e812cbfe557ae03421a8"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.245102 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-d2bg8"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.610523 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.616913 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.618855 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.624042 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rbwfp" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.624332 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.624436 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.632109 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.796201 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-d2bg8" event={"ID":"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5","Type":"ContainerStarted","Data":"6a4490c2948a81ac56194edd5e9044723e0bcbac8c143dadd334ed8d8772fb8b"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.818685 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0f94f0-d475-4921-9d83-357a8e436f33-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.818726 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-cache\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.818757 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.818821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.819121 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-lock\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.819301 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ksvb\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-kube-api-access-6ksvb\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.920766 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.920821 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.920925 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-lock\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.920990 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ksvb\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-kube-api-access-6ksvb\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.921014 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0f94f0-d475-4921-9d83-357a8e436f33-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.921032 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-cache\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:26.921098 4939 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:26.921133 4939 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.921151 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:26.921193 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift podName:ee0f94f0-d475-4921-9d83-357a8e436f33 nodeName:}" failed. No retries permitted until 2026-03-18 15:59:27.421171432 +0000 UTC m=+1332.020359063 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift") pod "swift-storage-0" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33") : configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.921668 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-cache\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.921839 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-lock\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.934150 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0f94f0-d475-4921-9d83-357a8e436f33-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.946390 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ksvb\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-kube-api-access-6ksvb\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:26.947920 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.134374 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-25g5r"] Mar 18 15:59:30 
crc kubenswrapper[4939]: I0318 15:59:27.135912 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.139583 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.139695 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.139829 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.146625 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-25g5r"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.327490 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-dispersionconf\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.327558 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-ring-data-devices\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.327598 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49r96\" (UniqueName: \"kubernetes.io/projected/eac75776-4245-474d-89c7-7002645a64c5-kube-api-access-49r96\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.327628 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-swiftconf\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.327801 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eac75776-4245-474d-89c7-7002645a64c5-etc-swift\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.327968 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-scripts\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.328151 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-combined-ca-bundle\") pod \"swift-ring-rebalance-25g5r\" 
(UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.429918 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-combined-ca-bundle\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.429981 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430068 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-dispersionconf\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430098 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-ring-data-devices\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430125 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49r96\" (UniqueName: \"kubernetes.io/projected/eac75776-4245-474d-89c7-7002645a64c5-kube-api-access-49r96\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430171 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-swiftconf\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:27.430186 4939 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:27.430206 4939 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430222 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eac75776-4245-474d-89c7-7002645a64c5-etc-swift\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:27.430254 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift podName:ee0f94f0-d475-4921-9d83-357a8e436f33 nodeName:}" failed. No retries permitted until 2026-03-18 15:59:28.430236001 +0000 UTC m=+1333.029423622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift") pod "swift-storage-0" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33") : configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430270 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-scripts\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430963 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eac75776-4245-474d-89c7-7002645a64c5-etc-swift\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.430996 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-ring-data-devices\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.431244 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-scripts\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.433194 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-swiftconf\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.433922 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-dispersionconf\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.434379 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-combined-ca-bundle\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.463208 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49r96\" (UniqueName: \"kubernetes.io/projected/eac75776-4245-474d-89c7-7002645a64c5-kube-api-access-49r96\") pod \"swift-ring-rebalance-25g5r\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") " pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:27.467123 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-25g5r" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.449878 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:28.450107 4939 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:28.450136 4939 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:28.450206 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift podName:ee0f94f0-d475-4921-9d83-357a8e436f33 nodeName:}" failed. No retries permitted until 2026-03-18 15:59:30.45018423 +0000 UTC m=+1335.049371851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift") pod "swift-storage-0" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33") : configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.551316 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-fb9n8"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.552271 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.565893 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fb9n8"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.653646 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2512655d-ad91-4302-9136-15c7ff20e928-operator-scripts\") pod \"glance-db-create-fb9n8\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.653705 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbfn\" (UniqueName: \"kubernetes.io/projected/2512655d-ad91-4302-9136-15c7ff20e928-kube-api-access-8vbfn\") pod \"glance-db-create-fb9n8\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.663782 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-29ff-account-create-update-vj982"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.665245 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.667964 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.672812 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-29ff-account-create-update-vj982"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.773901 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96xrm\" (UniqueName: \"kubernetes.io/projected/3d904184-0bae-4dee-b7ce-b5e315763287-kube-api-access-96xrm\") pod \"glance-29ff-account-create-update-vj982\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.774561 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2512655d-ad91-4302-9136-15c7ff20e928-operator-scripts\") pod \"glance-db-create-fb9n8\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.774605 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d904184-0bae-4dee-b7ce-b5e315763287-operator-scripts\") pod \"glance-29ff-account-create-update-vj982\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.774647 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbfn\" (UniqueName: \"kubernetes.io/projected/2512655d-ad91-4302-9136-15c7ff20e928-kube-api-access-8vbfn\") pod \"glance-db-create-fb9n8\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.775397 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2512655d-ad91-4302-9136-15c7ff20e928-operator-scripts\") pod \"glance-db-create-fb9n8\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.811632 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbfn\" (UniqueName: \"kubernetes.io/projected/2512655d-ad91-4302-9136-15c7ff20e928-kube-api-access-8vbfn\") pod \"glance-db-create-fb9n8\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.818390 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fdf2-account-create-update-5wwtx" event={"ID":"3247c802-2337-43e1-b292-56c7f5c520c2","Type":"ContainerStarted","Data":"e0e971a7aa95c06696e32b2e72dbdcc749d30e39e71cbf719fef8099c56f0502"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.820023 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hlnxc" event={"ID":"71d349c7-2307-47dd-a696-adfdfba42e1e","Type":"ContainerStarted","Data":"4f93c5de887966d79cfc98a6c6b42da2198b27db1996aa7a8e956fa8ad27c205"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.821460 4939 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-aa27-account-create-update-zrtw8" event={"ID":"706a1ac1-f38c-4710-a889-b2799e52a652","Type":"ContainerStarted","Data":"e42031844990c3cfe18c36afa3752df80cd853511882f8b3d46df4d8caa9ab69"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.822758 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f7sh4" event={"ID":"961b9311-60b6-40e9-839a-a40bb6859bb3","Type":"ContainerStarted","Data":"678d60bf8a444beedc86a2d299dd661f7fae4101a43640068f26d3d469a7b2aa"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.848386 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-fdf2-account-create-update-5wwtx" podStartSLOduration=4.8483676639999995 podStartE2EDuration="4.848367664s" podCreationTimestamp="2026-03-18 15:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:28.843437375 +0000 UTC m=+1333.442625006" watchObservedRunningTime="2026-03-18 15:59:28.848367664 +0000 UTC m=+1333.447555295" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.862821 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-f7sh4" podStartSLOduration=4.862800932 podStartE2EDuration="4.862800932s" podCreationTimestamp="2026-03-18 15:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:28.85952996 +0000 UTC m=+1333.458717591" watchObservedRunningTime="2026-03-18 15:59:28.862800932 +0000 UTC m=+1333.461988553" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.868769 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.875711 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96xrm\" (UniqueName: \"kubernetes.io/projected/3d904184-0bae-4dee-b7ce-b5e315763287-kube-api-access-96xrm\") pod \"glance-29ff-account-create-update-vj982\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.875835 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d904184-0bae-4dee-b7ce-b5e315763287-operator-scripts\") pod \"glance-29ff-account-create-update-vj982\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.876599 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d904184-0bae-4dee-b7ce-b5e315763287-operator-scripts\") pod \"glance-29ff-account-create-update-vj982\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.892660 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96xrm\" (UniqueName: \"kubernetes.io/projected/3d904184-0bae-4dee-b7ce-b5e315763287-kube-api-access-96xrm\") pod \"glance-29ff-account-create-update-vj982\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:28.982329 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:29.836412 4939 generic.go:334] "Generic (PLEG): container finished" podID="961b9311-60b6-40e9-839a-a40bb6859bb3" containerID="678d60bf8a444beedc86a2d299dd661f7fae4101a43640068f26d3d469a7b2aa" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:29.836556 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f7sh4" event={"ID":"961b9311-60b6-40e9-839a-a40bb6859bb3","Type":"ContainerDied","Data":"678d60bf8a444beedc86a2d299dd661f7fae4101a43640068f26d3d469a7b2aa"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:29.843040 4939 generic.go:334] "Generic (PLEG): container finished" podID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerID="e05d4d95b37f1971f8f50e3c2124bf55b0dd6aa437c58136721c1a9e30107d40" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:29.843121 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-d2bg8" event={"ID":"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5","Type":"ContainerDied","Data":"e05d4d95b37f1971f8f50e3c2124bf55b0dd6aa437c58136721c1a9e30107d40"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:29.854748 4939 generic.go:334] "Generic (PLEG): container finished" podID="71d349c7-2307-47dd-a696-adfdfba42e1e" containerID="4f93c5de887966d79cfc98a6c6b42da2198b27db1996aa7a8e956fa8ad27c205" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:29.855496 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hlnxc" event={"ID":"71d349c7-2307-47dd-a696-adfdfba42e1e","Type":"ContainerDied","Data":"4f93c5de887966d79cfc98a6c6b42da2198b27db1996aa7a8e956fa8ad27c205"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:29.923744 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-aa27-account-create-update-zrtw8" podStartSLOduration=5.923721128 podStartE2EDuration="5.923721128s" podCreationTimestamp="2026-03-18 15:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:29.91633617 +0000 UTC m=+1334.515523791" watchObservedRunningTime="2026-03-18 15:59:29.923721128 +0000 UTC m=+1334.522908750" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.429674 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-29ff-account-create-update-vj982"] Mar 18 15:59:30 crc kubenswrapper[4939]: W0318 15:59:30.430674 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d904184_0bae_4dee_b7ce_b5e315763287.slice/crio-542cf118c3d26716b163e86947222b20284033ee196b3d1b947e4b9e92531af1 WatchSource:0}: Error finding container 542cf118c3d26716b163e86947222b20284033ee196b3d1b947e4b9e92531af1: Status 404 returned error can't find the container with id 542cf118c3d26716b163e86947222b20284033ee196b3d1b947e4b9e92531af1 Mar 18 15:59:30 crc kubenswrapper[4939]: W0318 15:59:30.431762 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2512655d_ad91_4302_9136_15c7ff20e928.slice/crio-5535c249e0a32cd231b23b30b966a89c373f78e5d8946a5f8a84a7b743696731 WatchSource:0}: Error finding container 5535c249e0a32cd231b23b30b966a89c373f78e5d8946a5f8a84a7b743696731: Status 404 returned error can't find the container with id 
5535c249e0a32cd231b23b30b966a89c373f78e5d8946a5f8a84a7b743696731 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.438526 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fb9n8"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.456811 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cpqt5"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.457974 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.461872 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.470111 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cpqt5"] Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.496025 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-25g5r"] Mar 18 15:59:30 crc kubenswrapper[4939]: W0318 15:59:30.500964 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeac75776_4245_474d_89c7_7002645a64c5.slice/crio-6cef057769c826daefed8587f8c73384b8b034797873ba90239fd3e72175bec0 WatchSource:0}: Error finding container 6cef057769c826daefed8587f8c73384b8b034797873ba90239fd3e72175bec0: Status 404 returned error can't find the container with id 6cef057769c826daefed8587f8c73384b8b034797873ba90239fd3e72175bec0 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.511783 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:30.511978 4939 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:30.512015 4939 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: E0318 15:59:30.512088 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift podName:ee0f94f0-d475-4921-9d83-357a8e436f33 nodeName:}" failed. No retries permitted until 2026-03-18 15:59:34.512061408 +0000 UTC m=+1339.111249029 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift") pod "swift-storage-0" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33") : configmap "swift-ring-files" not found Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.613292 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j5pk\" (UniqueName: \"kubernetes.io/projected/3abfa68d-17ec-467b-848d-6186a43de0b9-kube-api-access-2j5pk\") pod \"root-account-create-update-cpqt5\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.613453 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3abfa68d-17ec-467b-848d-6186a43de0b9-operator-scripts\") pod \"root-account-create-update-cpqt5\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.714686 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3abfa68d-17ec-467b-848d-6186a43de0b9-operator-scripts\") pod \"root-account-create-update-cpqt5\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.715924 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j5pk\" (UniqueName: \"kubernetes.io/projected/3abfa68d-17ec-467b-848d-6186a43de0b9-kube-api-access-2j5pk\") pod \"root-account-create-update-cpqt5\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.715798 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3abfa68d-17ec-467b-848d-6186a43de0b9-operator-scripts\") pod \"root-account-create-update-cpqt5\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.740370 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j5pk\" (UniqueName: \"kubernetes.io/projected/3abfa68d-17ec-467b-848d-6186a43de0b9-kube-api-access-2j5pk\") pod \"root-account-create-update-cpqt5\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.865724 4939 generic.go:334] "Generic (PLEG): container finished" podID="2512655d-ad91-4302-9136-15c7ff20e928" containerID="79931f9e3df7c056e097bbb2e2a50a13aac8f8b97f4469add924017b9d7aecda" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.865772 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fb9n8" event={"ID":"2512655d-ad91-4302-9136-15c7ff20e928","Type":"ContainerDied","Data":"79931f9e3df7c056e097bbb2e2a50a13aac8f8b97f4469add924017b9d7aecda"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.865817 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fb9n8" 
event={"ID":"2512655d-ad91-4302-9136-15c7ff20e928","Type":"ContainerStarted","Data":"5535c249e0a32cd231b23b30b966a89c373f78e5d8946a5f8a84a7b743696731"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.873450 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-d2bg8" event={"ID":"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5","Type":"ContainerStarted","Data":"00cd63fb3658088fecc683853dba0e69afd4b7e3fdc1e25dabea499aedeb4916"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.874564 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.885805 4939 generic.go:334] "Generic (PLEG): container finished" podID="3247c802-2337-43e1-b292-56c7f5c520c2" containerID="e0e971a7aa95c06696e32b2e72dbdcc749d30e39e71cbf719fef8099c56f0502" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.885897 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fdf2-account-create-update-5wwtx" event={"ID":"3247c802-2337-43e1-b292-56c7f5c520c2","Type":"ContainerDied","Data":"e0e971a7aa95c06696e32b2e72dbdcc749d30e39e71cbf719fef8099c56f0502"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.887633 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25g5r" event={"ID":"eac75776-4245-474d-89c7-7002645a64c5","Type":"ContainerStarted","Data":"6cef057769c826daefed8587f8c73384b8b034797873ba90239fd3e72175bec0"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.889131 4939 generic.go:334] "Generic (PLEG): container finished" podID="706a1ac1-f38c-4710-a889-b2799e52a652" containerID="e42031844990c3cfe18c36afa3752df80cd853511882f8b3d46df4d8caa9ab69" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.889201 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-aa27-account-create-update-zrtw8" event={"ID":"706a1ac1-f38c-4710-a889-b2799e52a652","Type":"ContainerDied","Data":"e42031844990c3cfe18c36afa3752df80cd853511882f8b3d46df4d8caa9ab69"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.894746 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-29ff-account-create-update-vj982" event={"ID":"3d904184-0bae-4dee-b7ce-b5e315763287","Type":"ContainerStarted","Data":"5c740b8f48083a205dbc78e36790fcc31d2e6d6d2bd6b7443972daeffc991a8d"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.894796 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-29ff-account-create-update-vj982" event={"ID":"3d904184-0bae-4dee-b7ce-b5e315763287","Type":"ContainerStarted","Data":"542cf118c3d26716b163e86947222b20284033ee196b3d1b947e4b9e92531af1"} Mar 18 15:59:30 crc kubenswrapper[4939]: I0318 15:59:30.918953 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.074248 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-d2bg8" podStartSLOduration=6.074229498 podStartE2EDuration="6.074229498s" podCreationTimestamp="2026-03-18 15:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:31.054918702 +0000 UTC m=+1335.654106323" watchObservedRunningTime="2026-03-18 15:59:31.074229498 +0000 UTC m=+1335.673417119" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.483787 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.528547 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.546091 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zpqk\" (UniqueName: \"kubernetes.io/projected/71d349c7-2307-47dd-a696-adfdfba42e1e-kube-api-access-2zpqk\") pod \"71d349c7-2307-47dd-a696-adfdfba42e1e\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.546148 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71d349c7-2307-47dd-a696-adfdfba42e1e-operator-scripts\") pod \"71d349c7-2307-47dd-a696-adfdfba42e1e\" (UID: \"71d349c7-2307-47dd-a696-adfdfba42e1e\") " Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.547050 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71d349c7-2307-47dd-a696-adfdfba42e1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71d349c7-2307-47dd-a696-adfdfba42e1e" (UID: "71d349c7-2307-47dd-a696-adfdfba42e1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.553229 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d349c7-2307-47dd-a696-adfdfba42e1e-kube-api-access-2zpqk" (OuterVolumeSpecName: "kube-api-access-2zpqk") pod "71d349c7-2307-47dd-a696-adfdfba42e1e" (UID: "71d349c7-2307-47dd-a696-adfdfba42e1e"). InnerVolumeSpecName "kube-api-access-2zpqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.651388 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkhhp\" (UniqueName: \"kubernetes.io/projected/961b9311-60b6-40e9-839a-a40bb6859bb3-kube-api-access-qkhhp\") pod \"961b9311-60b6-40e9-839a-a40bb6859bb3\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.651546 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/961b9311-60b6-40e9-839a-a40bb6859bb3-operator-scripts\") pod \"961b9311-60b6-40e9-839a-a40bb6859bb3\" (UID: \"961b9311-60b6-40e9-839a-a40bb6859bb3\") " Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.651811 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zpqk\" (UniqueName: \"kubernetes.io/projected/71d349c7-2307-47dd-a696-adfdfba42e1e-kube-api-access-2zpqk\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.651836 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71d349c7-2307-47dd-a696-adfdfba42e1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.652235 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/961b9311-60b6-40e9-839a-a40bb6859bb3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "961b9311-60b6-40e9-839a-a40bb6859bb3" (UID: "961b9311-60b6-40e9-839a-a40bb6859bb3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.658878 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/961b9311-60b6-40e9-839a-a40bb6859bb3-kube-api-access-qkhhp" (OuterVolumeSpecName: "kube-api-access-qkhhp") pod "961b9311-60b6-40e9-839a-a40bb6859bb3" (UID: "961b9311-60b6-40e9-839a-a40bb6859bb3"). InnerVolumeSpecName "kube-api-access-qkhhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.692438 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cpqt5"] Mar 18 15:59:31 crc kubenswrapper[4939]: W0318 15:59:31.693286 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3abfa68d_17ec_467b_848d_6186a43de0b9.slice/crio-066b9f9e4c007442c209bebaec6dcc794fe2e9f854c05306f25df4d08bc4357a WatchSource:0}: Error finding container 066b9f9e4c007442c209bebaec6dcc794fe2e9f854c05306f25df4d08bc4357a: Status 404 returned error can't find the container with id 066b9f9e4c007442c209bebaec6dcc794fe2e9f854c05306f25df4d08bc4357a Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.754332 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/961b9311-60b6-40e9-839a-a40bb6859bb3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.754372 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkhhp\" (UniqueName: \"kubernetes.io/projected/961b9311-60b6-40e9-839a-a40bb6859bb3-kube-api-access-qkhhp\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.908708 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f7sh4" event={"ID":"961b9311-60b6-40e9-839a-a40bb6859bb3","Type":"ContainerDied","Data":"c8d7ff19a8cfe70f801770258c277066110e6ef16807e812cbfe557ae03421a8"} Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.908742 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f7sh4" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.908762 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d7ff19a8cfe70f801770258c277066110e6ef16807e812cbfe557ae03421a8" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.925103 4939 generic.go:334] "Generic (PLEG): container finished" podID="3d904184-0bae-4dee-b7ce-b5e315763287" containerID="5c740b8f48083a205dbc78e36790fcc31d2e6d6d2bd6b7443972daeffc991a8d" exitCode=0 Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.925156 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-29ff-account-create-update-vj982" event={"ID":"3d904184-0bae-4dee-b7ce-b5e315763287","Type":"ContainerDied","Data":"5c740b8f48083a205dbc78e36790fcc31d2e6d6d2bd6b7443972daeffc991a8d"} Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.926945 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cpqt5" event={"ID":"3abfa68d-17ec-467b-848d-6186a43de0b9","Type":"ContainerStarted","Data":"2fb46578d3eb85ceda48ac95f0b4ee5515773eab56143cabd3f30c0f08b2042e"} Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.926974 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cpqt5" event={"ID":"3abfa68d-17ec-467b-848d-6186a43de0b9","Type":"ContainerStarted","Data":"066b9f9e4c007442c209bebaec6dcc794fe2e9f854c05306f25df4d08bc4357a"} Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.929646 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hlnxc" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.929692 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hlnxc" event={"ID":"71d349c7-2307-47dd-a696-adfdfba42e1e","Type":"ContainerDied","Data":"9dd6e93e98a1691589611c4eec360191726b80dd894edfbcb6016cb73d2e1377"} Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.929925 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd6e93e98a1691589611c4eec360191726b80dd894edfbcb6016cb73d2e1377" Mar 18 15:59:31 crc kubenswrapper[4939]: I0318 15:59:31.960534 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-cpqt5" podStartSLOduration=1.9605171989999999 podStartE2EDuration="1.960517199s" podCreationTimestamp="2026-03-18 15:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:31.944849966 +0000 UTC m=+1336.544037597" watchObservedRunningTime="2026-03-18 15:59:31.960517199 +0000 UTC m=+1336.559704820" Mar 18 15:59:32 crc kubenswrapper[4939]: I0318 15:59:32.938546 4939 generic.go:334] "Generic (PLEG): container finished" podID="3abfa68d-17ec-467b-848d-6186a43de0b9" containerID="2fb46578d3eb85ceda48ac95f0b4ee5515773eab56143cabd3f30c0f08b2042e" exitCode=0 Mar 18 15:59:32 crc kubenswrapper[4939]: I0318 15:59:32.938743 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cpqt5" event={"ID":"3abfa68d-17ec-467b-848d-6186a43de0b9","Type":"ContainerDied","Data":"2fb46578d3eb85ceda48ac95f0b4ee5515773eab56143cabd3f30c0f08b2042e"} Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.417161 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.487428 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706a1ac1-f38c-4710-a889-b2799e52a652-operator-scripts\") pod \"706a1ac1-f38c-4710-a889-b2799e52a652\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.487576 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl995\" (UniqueName: \"kubernetes.io/projected/706a1ac1-f38c-4710-a889-b2799e52a652-kube-api-access-rl995\") pod \"706a1ac1-f38c-4710-a889-b2799e52a652\" (UID: \"706a1ac1-f38c-4710-a889-b2799e52a652\") " Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.487911 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706a1ac1-f38c-4710-a889-b2799e52a652-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "706a1ac1-f38c-4710-a889-b2799e52a652" (UID: "706a1ac1-f38c-4710-a889-b2799e52a652"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.521159 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706a1ac1-f38c-4710-a889-b2799e52a652-kube-api-access-rl995" (OuterVolumeSpecName: "kube-api-access-rl995") pod "706a1ac1-f38c-4710-a889-b2799e52a652" (UID: "706a1ac1-f38c-4710-a889-b2799e52a652"). InnerVolumeSpecName "kube-api-access-rl995". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.589561 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706a1ac1-f38c-4710-a889-b2799e52a652-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.589592 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl995\" (UniqueName: \"kubernetes.io/projected/706a1ac1-f38c-4710-a889-b2799e52a652-kube-api-access-rl995\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.952556 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-aa27-account-create-update-zrtw8" Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.952566 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-aa27-account-create-update-zrtw8" event={"ID":"706a1ac1-f38c-4710-a889-b2799e52a652","Type":"ContainerDied","Data":"102879e7e6c1fad41be5a3fc6e50e765ed8d2b7ee0b202e934da14df34235fee"} Mar 18 15:59:33 crc kubenswrapper[4939]: I0318 15:59:33.952629 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102879e7e6c1fad41be5a3fc6e50e765ed8d2b7ee0b202e934da14df34235fee" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.603204 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0" Mar 18 15:59:34 crc kubenswrapper[4939]: E0318 15:59:34.603470 4939 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:59:34 crc kubenswrapper[4939]: E0318 15:59:34.603718 4939 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:59:34 crc kubenswrapper[4939]: E0318 15:59:34.603769 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift podName:ee0f94f0-d475-4921-9d83-357a8e436f33 nodeName:}" failed. No retries permitted until 2026-03-18 15:59:42.60375111 +0000 UTC m=+1347.202938721 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift") pod "swift-storage-0" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33") : configmap "swift-ring-files" not found Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.887831 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.897387 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.908044 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d904184-0bae-4dee-b7ce-b5e315763287-operator-scripts\") pod \"3d904184-0bae-4dee-b7ce-b5e315763287\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.908206 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96xrm\" (UniqueName: \"kubernetes.io/projected/3d904184-0bae-4dee-b7ce-b5e315763287-kube-api-access-96xrm\") pod \"3d904184-0bae-4dee-b7ce-b5e315763287\" (UID: \"3d904184-0bae-4dee-b7ce-b5e315763287\") " Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.908241 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9kjh\" (UniqueName: \"kubernetes.io/projected/3247c802-2337-43e1-b292-56c7f5c520c2-kube-api-access-v9kjh\") pod \"3247c802-2337-43e1-b292-56c7f5c520c2\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.908266 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3247c802-2337-43e1-b292-56c7f5c520c2-operator-scripts\") pod \"3247c802-2337-43e1-b292-56c7f5c520c2\" (UID: \"3247c802-2337-43e1-b292-56c7f5c520c2\") " Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.909310 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3247c802-2337-43e1-b292-56c7f5c520c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3247c802-2337-43e1-b292-56c7f5c520c2" (UID: "3247c802-2337-43e1-b292-56c7f5c520c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.910412 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d904184-0bae-4dee-b7ce-b5e315763287-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d904184-0bae-4dee-b7ce-b5e315763287" (UID: "3d904184-0bae-4dee-b7ce-b5e315763287"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.916061 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3247c802-2337-43e1-b292-56c7f5c520c2-kube-api-access-v9kjh" (OuterVolumeSpecName: "kube-api-access-v9kjh") pod "3247c802-2337-43e1-b292-56c7f5c520c2" (UID: "3247c802-2337-43e1-b292-56c7f5c520c2"). InnerVolumeSpecName "kube-api-access-v9kjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.918549 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d904184-0bae-4dee-b7ce-b5e315763287-kube-api-access-96xrm" (OuterVolumeSpecName: "kube-api-access-96xrm") pod "3d904184-0bae-4dee-b7ce-b5e315763287" (UID: "3d904184-0bae-4dee-b7ce-b5e315763287"). InnerVolumeSpecName "kube-api-access-96xrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.964292 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fdf2-account-create-update-5wwtx" event={"ID":"3247c802-2337-43e1-b292-56c7f5c520c2","Type":"ContainerDied","Data":"dfaa1d5274ae3a52d2daa5b36bb108871f5aa90d9caef7604fc994fe36eb6d68"} Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.965427 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfaa1d5274ae3a52d2daa5b36bb108871f5aa90d9caef7604fc994fe36eb6d68" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.964524 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-5wwtx" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.965873 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-29ff-account-create-update-vj982" event={"ID":"3d904184-0bae-4dee-b7ce-b5e315763287","Type":"ContainerDied","Data":"542cf118c3d26716b163e86947222b20284033ee196b3d1b947e4b9e92531af1"} Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.965909 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="542cf118c3d26716b163e86947222b20284033ee196b3d1b947e4b9e92531af1" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.965949 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-29ff-account-create-update-vj982" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.972575 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fb9n8" event={"ID":"2512655d-ad91-4302-9136-15c7ff20e928","Type":"ContainerDied","Data":"5535c249e0a32cd231b23b30b966a89c373f78e5d8946a5f8a84a7b743696731"} Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.972647 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5535c249e0a32cd231b23b30b966a89c373f78e5d8946a5f8a84a7b743696731" Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.978187 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cpqt5" event={"ID":"3abfa68d-17ec-467b-848d-6186a43de0b9","Type":"ContainerDied","Data":"066b9f9e4c007442c209bebaec6dcc794fe2e9f854c05306f25df4d08bc4357a"} Mar 18 15:59:34 crc kubenswrapper[4939]: I0318 15:59:34.978222 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="066b9f9e4c007442c209bebaec6dcc794fe2e9f854c05306f25df4d08bc4357a" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.010417 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96xrm\" (UniqueName: \"kubernetes.io/projected/3d904184-0bae-4dee-b7ce-b5e315763287-kube-api-access-96xrm\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.010451 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9kjh\" (UniqueName: \"kubernetes.io/projected/3247c802-2337-43e1-b292-56c7f5c520c2-kube-api-access-v9kjh\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.010461 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3247c802-2337-43e1-b292-56c7f5c520c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.010469 4939 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d904184-0bae-4dee-b7ce-b5e315763287-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.024273 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.048688 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cpqt5" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.111600 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j5pk\" (UniqueName: \"kubernetes.io/projected/3abfa68d-17ec-467b-848d-6186a43de0b9-kube-api-access-2j5pk\") pod \"3abfa68d-17ec-467b-848d-6186a43de0b9\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.111720 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vbfn\" (UniqueName: \"kubernetes.io/projected/2512655d-ad91-4302-9136-15c7ff20e928-kube-api-access-8vbfn\") pod \"2512655d-ad91-4302-9136-15c7ff20e928\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.111833 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3abfa68d-17ec-467b-848d-6186a43de0b9-operator-scripts\") pod \"3abfa68d-17ec-467b-848d-6186a43de0b9\" (UID: \"3abfa68d-17ec-467b-848d-6186a43de0b9\") " Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.111872 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2512655d-ad91-4302-9136-15c7ff20e928-operator-scripts\") pod \"2512655d-ad91-4302-9136-15c7ff20e928\" (UID: \"2512655d-ad91-4302-9136-15c7ff20e928\") " Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.112381 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3abfa68d-17ec-467b-848d-6186a43de0b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3abfa68d-17ec-467b-848d-6186a43de0b9" (UID: "3abfa68d-17ec-467b-848d-6186a43de0b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.112381 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2512655d-ad91-4302-9136-15c7ff20e928-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2512655d-ad91-4302-9136-15c7ff20e928" (UID: "2512655d-ad91-4302-9136-15c7ff20e928"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.115797 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2512655d-ad91-4302-9136-15c7ff20e928-kube-api-access-8vbfn" (OuterVolumeSpecName: "kube-api-access-8vbfn") pod "2512655d-ad91-4302-9136-15c7ff20e928" (UID: "2512655d-ad91-4302-9136-15c7ff20e928"). InnerVolumeSpecName "kube-api-access-8vbfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.116672 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abfa68d-17ec-467b-848d-6186a43de0b9-kube-api-access-2j5pk" (OuterVolumeSpecName: "kube-api-access-2j5pk") pod "3abfa68d-17ec-467b-848d-6186a43de0b9" (UID: "3abfa68d-17ec-467b-848d-6186a43de0b9"). InnerVolumeSpecName "kube-api-access-2j5pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.214686 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j5pk\" (UniqueName: \"kubernetes.io/projected/3abfa68d-17ec-467b-848d-6186a43de0b9-kube-api-access-2j5pk\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.214993 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vbfn\" (UniqueName: \"kubernetes.io/projected/2512655d-ad91-4302-9136-15c7ff20e928-kube-api-access-8vbfn\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.215059 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3abfa68d-17ec-467b-848d-6186a43de0b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.215129 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2512655d-ad91-4302-9136-15c7ff20e928-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.735158 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.793260 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-tqr9f"] Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.793553 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" podUID="cb736cd5-af39-49e8-8439-5402913687b1" containerName="dnsmasq-dns" containerID="cri-o://01ef4d987b8440069b0987a2fdbd2ef4152e322b8eb7d478c1906bb4341623bb" gracePeriod=10 Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.990307 4939 generic.go:334] "Generic (PLEG): container finished" podID="cb736cd5-af39-49e8-8439-5402913687b1" containerID="01ef4d987b8440069b0987a2fdbd2ef4152e322b8eb7d478c1906bb4341623bb" exitCode=0 Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.990381 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" event={"ID":"cb736cd5-af39-49e8-8439-5402913687b1","Type":"ContainerDied","Data":"01ef4d987b8440069b0987a2fdbd2ef4152e322b8eb7d478c1906bb4341623bb"} Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.993566 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25g5r" event={"ID":"eac75776-4245-474d-89c7-7002645a64c5","Type":"ContainerStarted","Data":"a657786e9cd27e7663fd8b3a78dd98fccda900df892a6912244ea73650607bf3"} Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.993613 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fb9n8" Mar 18 15:59:35 crc kubenswrapper[4939]: I0318 15:59:35.993663 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cpqt5"
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.017123 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-25g5r" podStartSLOduration=4.718674843 podStartE2EDuration="9.017106148s" podCreationTimestamp="2026-03-18 15:59:27 +0000 UTC" firstStartedPulling="2026-03-18 15:59:30.503349282 +0000 UTC m=+1335.102536903" lastFinishedPulling="2026-03-18 15:59:34.801780587 +0000 UTC m=+1339.400968208" observedRunningTime="2026-03-18 15:59:36.015444781 +0000 UTC m=+1340.614632412" watchObservedRunningTime="2026-03-18 15:59:36.017106148 +0000 UTC m=+1340.616293769"
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.381332 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f"
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.534799 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d69f\" (UniqueName: \"kubernetes.io/projected/cb736cd5-af39-49e8-8439-5402913687b1-kube-api-access-6d69f\") pod \"cb736cd5-af39-49e8-8439-5402913687b1\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") "
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.534894 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-nb\") pod \"cb736cd5-af39-49e8-8439-5402913687b1\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") "
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.535041 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-sb\") pod \"cb736cd5-af39-49e8-8439-5402913687b1\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") "
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.535079 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-dns-svc\") pod \"cb736cd5-af39-49e8-8439-5402913687b1\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") "
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.535126 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-config\") pod \"cb736cd5-af39-49e8-8439-5402913687b1\" (UID: \"cb736cd5-af39-49e8-8439-5402913687b1\") "
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.539838 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb736cd5-af39-49e8-8439-5402913687b1-kube-api-access-6d69f" (OuterVolumeSpecName: "kube-api-access-6d69f") pod "cb736cd5-af39-49e8-8439-5402913687b1" (UID: "cb736cd5-af39-49e8-8439-5402913687b1"). InnerVolumeSpecName "kube-api-access-6d69f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.571654 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-config" (OuterVolumeSpecName: "config") pod "cb736cd5-af39-49e8-8439-5402913687b1" (UID: "cb736cd5-af39-49e8-8439-5402913687b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.576651 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb736cd5-af39-49e8-8439-5402913687b1" (UID: "cb736cd5-af39-49e8-8439-5402913687b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.584820 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb736cd5-af39-49e8-8439-5402913687b1" (UID: "cb736cd5-af39-49e8-8439-5402913687b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.590441 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb736cd5-af39-49e8-8439-5402913687b1" (UID: "cb736cd5-af39-49e8-8439-5402913687b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.636471 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.636546 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.636557 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.636568 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d69f\" (UniqueName: \"kubernetes.io/projected/cb736cd5-af39-49e8-8439-5402913687b1-kube-api-access-6d69f\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.636581 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb736cd5-af39-49e8-8439-5402913687b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.809813 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cpqt5"]
Mar 18 15:59:36 crc kubenswrapper[4939]: I0318 15:59:36.817907 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cpqt5"]
Mar 18 15:59:37 crc kubenswrapper[4939]: I0318 15:59:37.001713 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f" event={"ID":"cb736cd5-af39-49e8-8439-5402913687b1","Type":"ContainerDied","Data":"6f17af1424f14b0e72b51e8dd2282dc6917c2178b6ebbb42cdb417e4db006f3d"}
Mar 18 15:59:37 crc kubenswrapper[4939]: I0318 15:59:37.001751 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-tqr9f"
Mar 18 15:59:37 crc kubenswrapper[4939]: I0318 15:59:37.001777 4939 scope.go:117] "RemoveContainer" containerID="01ef4d987b8440069b0987a2fdbd2ef4152e322b8eb7d478c1906bb4341623bb"
Mar 18 15:59:37 crc kubenswrapper[4939]: I0318 15:59:37.031094 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-tqr9f"]
Mar 18 15:59:37 crc kubenswrapper[4939]: I0318 15:59:37.034435 4939 scope.go:117] "RemoveContainer" containerID="58362e26f9aa16b37dc396a86b360760d736a0ae7110dc366d3fffde8af0c835"
Mar 18 15:59:37 crc kubenswrapper[4939]: I0318 15:59:37.038188 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-tqr9f"]
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.144267 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3abfa68d-17ec-467b-848d-6186a43de0b9" path="/var/lib/kubelet/pods/3abfa68d-17ec-467b-848d-6186a43de0b9/volumes"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.144795 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb736cd5-af39-49e8-8439-5402913687b1" path="/var/lib/kubelet/pods/cb736cd5-af39-49e8-8439-5402913687b1/volumes"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.880803 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sss6f"]
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881191 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d349c7-2307-47dd-a696-adfdfba42e1e" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881210 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d349c7-2307-47dd-a696-adfdfba42e1e" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881221 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abfa68d-17ec-467b-848d-6186a43de0b9" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881229 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abfa68d-17ec-467b-848d-6186a43de0b9" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881242 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3247c802-2337-43e1-b292-56c7f5c520c2" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881250 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3247c802-2337-43e1-b292-56c7f5c520c2" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881265 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d904184-0bae-4dee-b7ce-b5e315763287" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881272 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d904184-0bae-4dee-b7ce-b5e315763287" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881284 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb736cd5-af39-49e8-8439-5402913687b1" containerName="init"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881293 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb736cd5-af39-49e8-8439-5402913687b1" containerName="init"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881311 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="961b9311-60b6-40e9-839a-a40bb6859bb3" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881320 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="961b9311-60b6-40e9-839a-a40bb6859bb3" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881329 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2512655d-ad91-4302-9136-15c7ff20e928" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881336 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2512655d-ad91-4302-9136-15c7ff20e928" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881354 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706a1ac1-f38c-4710-a889-b2799e52a652" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881361 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="706a1ac1-f38c-4710-a889-b2799e52a652" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: E0318 15:59:38.881380 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb736cd5-af39-49e8-8439-5402913687b1" containerName="dnsmasq-dns"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881387 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb736cd5-af39-49e8-8439-5402913687b1" containerName="dnsmasq-dns"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881584 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3abfa68d-17ec-467b-848d-6186a43de0b9" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881596 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="961b9311-60b6-40e9-839a-a40bb6859bb3" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881608 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="706a1ac1-f38c-4710-a889-b2799e52a652" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881624 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d349c7-2307-47dd-a696-adfdfba42e1e" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881632 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2512655d-ad91-4302-9136-15c7ff20e928" containerName="mariadb-database-create"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881643 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb736cd5-af39-49e8-8439-5402913687b1" containerName="dnsmasq-dns"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881655 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d904184-0bae-4dee-b7ce-b5e315763287" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.881670 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3247c802-2337-43e1-b292-56c7f5c520c2" containerName="mariadb-account-create-update"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.882303 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.884526 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.887327 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7rbvp"
Mar 18 15:59:38 crc kubenswrapper[4939]: I0318 15:59:38.888679 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sss6f"]
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.074358 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-combined-ca-bundle\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.074457 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-db-sync-config-data\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.074475 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwlx2\" (UniqueName: \"kubernetes.io/projected/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-kube-api-access-fwlx2\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.074569 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-config-data\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.175931 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-combined-ca-bundle\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.176041 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwlx2\" (UniqueName: \"kubernetes.io/projected/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-kube-api-access-fwlx2\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.177884 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-db-sync-config-data\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.178101 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-config-data\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.182187 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-db-sync-config-data\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.182637 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-config-data\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.183715 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-combined-ca-bundle\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.197259 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwlx2\" (UniqueName: \"kubernetes.io/projected/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-kube-api-access-fwlx2\") pod \"glance-db-sync-sss6f\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.207850 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sss6f"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.465976 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 18 15:59:39 crc kubenswrapper[4939]: I0318 15:59:39.848822 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sss6f"]
Mar 18 15:59:39 crc kubenswrapper[4939]: W0318 15:59:39.874395 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59e6de1a_22a0_4166_9bf0_f8844e3e89c2.slice/crio-24edcccb4b25dd53d090b5dbe73a12203a3ca8da4863ad57321499b7cecf3d7d WatchSource:0}: Error finding container 24edcccb4b25dd53d090b5dbe73a12203a3ca8da4863ad57321499b7cecf3d7d: Status 404 returned error can't find the container with id 24edcccb4b25dd53d090b5dbe73a12203a3ca8da4863ad57321499b7cecf3d7d
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.025522 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sss6f" event={"ID":"59e6de1a-22a0-4166-9bf0-f8844e3e89c2","Type":"ContainerStarted","Data":"24edcccb4b25dd53d090b5dbe73a12203a3ca8da4863ad57321499b7cecf3d7d"}
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.490974 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-p8lbk"]
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.495995 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.498604 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.507931 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p8lbk"]
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.618697 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7kd8\" (UniqueName: \"kubernetes.io/projected/d7fb4ca4-fe24-43de-8098-5d1b0effa406-kube-api-access-h7kd8\") pod \"root-account-create-update-p8lbk\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") " pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.618777 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fb4ca4-fe24-43de-8098-5d1b0effa406-operator-scripts\") pod \"root-account-create-update-p8lbk\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") " pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.720589 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7kd8\" (UniqueName: \"kubernetes.io/projected/d7fb4ca4-fe24-43de-8098-5d1b0effa406-kube-api-access-h7kd8\") pod \"root-account-create-update-p8lbk\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") " pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.720666 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fb4ca4-fe24-43de-8098-5d1b0effa406-operator-scripts\") pod \"root-account-create-update-p8lbk\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") " pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.721640 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fb4ca4-fe24-43de-8098-5d1b0effa406-operator-scripts\") pod \"root-account-create-update-p8lbk\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") " pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.738802 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7kd8\" (UniqueName: \"kubernetes.io/projected/d7fb4ca4-fe24-43de-8098-5d1b0effa406-kube-api-access-h7kd8\") pod \"root-account-create-update-p8lbk\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") " pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:40 crc kubenswrapper[4939]: I0318 15:59:40.825395 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:41 crc kubenswrapper[4939]: I0318 15:59:41.324275 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p8lbk"]
Mar 18 15:59:42 crc kubenswrapper[4939]: I0318 15:59:42.047628 4939 generic.go:334] "Generic (PLEG): container finished" podID="d7fb4ca4-fe24-43de-8098-5d1b0effa406" containerID="5a028acbdf9be78d30ad8dac81f37e1a2ea94ee44f9dbb1a9a9f4685f2f7d586" exitCode=0
Mar 18 15:59:42 crc kubenswrapper[4939]: I0318 15:59:42.047670 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p8lbk" event={"ID":"d7fb4ca4-fe24-43de-8098-5d1b0effa406","Type":"ContainerDied","Data":"5a028acbdf9be78d30ad8dac81f37e1a2ea94ee44f9dbb1a9a9f4685f2f7d586"}
Mar 18 15:59:42 crc kubenswrapper[4939]: I0318 15:59:42.047697 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p8lbk" event={"ID":"d7fb4ca4-fe24-43de-8098-5d1b0effa406","Type":"ContainerStarted","Data":"75e586f842fb1042d4945733904d0bf6820b0fbd03ae789ad122d4e5f47008c0"}
Mar 18 15:59:42 crc kubenswrapper[4939]: I0318 15:59:42.653965 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0"
Mar 18 15:59:42 crc kubenswrapper[4939]: I0318 15:59:42.667624 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"swift-storage-0\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " pod="openstack/swift-storage-0"
Mar 18 15:59:42 crc kubenswrapper[4939]: I0318 15:59:42.842129 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.061405 4939 generic.go:334] "Generic (PLEG): container finished" podID="eac75776-4245-474d-89c7-7002645a64c5" containerID="a657786e9cd27e7663fd8b3a78dd98fccda900df892a6912244ea73650607bf3" exitCode=0
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.061472 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25g5r" event={"ID":"eac75776-4245-474d-89c7-7002645a64c5","Type":"ContainerDied","Data":"a657786e9cd27e7663fd8b3a78dd98fccda900df892a6912244ea73650607bf3"}
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.354303 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.385252 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.467997 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7kd8\" (UniqueName: \"kubernetes.io/projected/d7fb4ca4-fe24-43de-8098-5d1b0effa406-kube-api-access-h7kd8\") pod \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") "
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.469046 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fb4ca4-fe24-43de-8098-5d1b0effa406-operator-scripts\") pod \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\" (UID: \"d7fb4ca4-fe24-43de-8098-5d1b0effa406\") "
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.470255 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7fb4ca4-fe24-43de-8098-5d1b0effa406-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7fb4ca4-fe24-43de-8098-5d1b0effa406" (UID: "d7fb4ca4-fe24-43de-8098-5d1b0effa406"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.473937 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7fb4ca4-fe24-43de-8098-5d1b0effa406-kube-api-access-h7kd8" (OuterVolumeSpecName: "kube-api-access-h7kd8") pod "d7fb4ca4-fe24-43de-8098-5d1b0effa406" (UID: "d7fb4ca4-fe24-43de-8098-5d1b0effa406"). InnerVolumeSpecName "kube-api-access-h7kd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.528601 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mp2sj" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 15:59:43 crc kubenswrapper[4939]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 15:59:43 crc kubenswrapper[4939]: >
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.571780 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7kd8\" (UniqueName: \"kubernetes.io/projected/d7fb4ca4-fe24-43de-8098-5d1b0effa406-kube-api-access-h7kd8\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:43 crc kubenswrapper[4939]: I0318 15:59:43.571816 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fb4ca4-fe24-43de-8098-5d1b0effa406-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.070320 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"1cf432016526d7d1c3c49d62eaf38e5832ce82056b8358cb26670f44db469f5d"}
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.071770 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p8lbk" event={"ID":"d7fb4ca4-fe24-43de-8098-5d1b0effa406","Type":"ContainerDied","Data":"75e586f842fb1042d4945733904d0bf6820b0fbd03ae789ad122d4e5f47008c0"}
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.071827 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75e586f842fb1042d4945733904d0bf6820b0fbd03ae789ad122d4e5f47008c0"
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.071791 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p8lbk"
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.717281 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-25g5r"
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.895238 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eac75776-4245-474d-89c7-7002645a64c5-etc-swift\") pod \"eac75776-4245-474d-89c7-7002645a64c5\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") "
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.895619 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-scripts\") pod \"eac75776-4245-474d-89c7-7002645a64c5\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") "
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.895645 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-ring-data-devices\") pod \"eac75776-4245-474d-89c7-7002645a64c5\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") "
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.895741 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49r96\" (UniqueName: \"kubernetes.io/projected/eac75776-4245-474d-89c7-7002645a64c5-kube-api-access-49r96\") pod \"eac75776-4245-474d-89c7-7002645a64c5\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") "
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.896324 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "eac75776-4245-474d-89c7-7002645a64c5" (UID: "eac75776-4245-474d-89c7-7002645a64c5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.896321 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac75776-4245-474d-89c7-7002645a64c5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "eac75776-4245-474d-89c7-7002645a64c5" (UID: "eac75776-4245-474d-89c7-7002645a64c5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.896407 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-swiftconf\") pod \"eac75776-4245-474d-89c7-7002645a64c5\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") "
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.896483 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-dispersionconf\") pod \"eac75776-4245-474d-89c7-7002645a64c5\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") "
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.896778 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-combined-ca-bundle\") pod \"eac75776-4245-474d-89c7-7002645a64c5\" (UID: \"eac75776-4245-474d-89c7-7002645a64c5\") "
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.897156 4939 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eac75776-4245-474d-89c7-7002645a64c5-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.897171 4939 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.898992 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac75776-4245-474d-89c7-7002645a64c5-kube-api-access-49r96" (OuterVolumeSpecName: "kube-api-access-49r96") pod "eac75776-4245-474d-89c7-7002645a64c5" (UID: "eac75776-4245-474d-89c7-7002645a64c5"). InnerVolumeSpecName "kube-api-access-49r96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.901467 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "eac75776-4245-474d-89c7-7002645a64c5" (UID: "eac75776-4245-474d-89c7-7002645a64c5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.922786 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-scripts" (OuterVolumeSpecName: "scripts") pod "eac75776-4245-474d-89c7-7002645a64c5" (UID: "eac75776-4245-474d-89c7-7002645a64c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.924459 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "eac75776-4245-474d-89c7-7002645a64c5" (UID: "eac75776-4245-474d-89c7-7002645a64c5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.927310 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eac75776-4245-474d-89c7-7002645a64c5" (UID: "eac75776-4245-474d-89c7-7002645a64c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.998743 4939 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.998772 4939 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.998782 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac75776-4245-474d-89c7-7002645a64c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.998792 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eac75776-4245-474d-89c7-7002645a64c5-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:44 crc kubenswrapper[4939]: I0318 15:59:44.998801 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49r96\" (UniqueName: \"kubernetes.io/projected/eac75776-4245-474d-89c7-7002645a64c5-kube-api-access-49r96\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:45 crc kubenswrapper[4939]: I0318 15:59:45.090710 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-25g5r"
Mar 18 15:59:45 crc kubenswrapper[4939]: I0318 15:59:45.090709 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25g5r" event={"ID":"eac75776-4245-474d-89c7-7002645a64c5","Type":"ContainerDied","Data":"6cef057769c826daefed8587f8c73384b8b034797873ba90239fd3e72175bec0"}
Mar 18 15:59:45 crc kubenswrapper[4939]: I0318 15:59:45.090845 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cef057769c826daefed8587f8c73384b8b034797873ba90239fd3e72175bec0"
Mar 18 15:59:45 crc kubenswrapper[4939]: I0318 15:59:45.092645 4939 generic.go:334] "Generic (PLEG): container finished" podID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerID="55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429" exitCode=0
Mar 18 15:59:45 crc kubenswrapper[4939]: I0318 15:59:45.092706 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d850ac81-a29e-4e93-9fab-72b6325de52e","Type":"ContainerDied","Data":"55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429"}
Mar 18 15:59:45 crc kubenswrapper[4939]: I0318 15:59:45.098765 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"91a292f6fe7a26c5c29f8f381a1c70bc1a4b7445389fcb1c53cbb76f807c045d"}
Mar 18 15:59:46 crc kubenswrapper[4939]: I0318 15:59:46.108877 4939 generic.go:334] "Generic (PLEG): container finished" podID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerID="7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52" exitCode=0
Mar 18 15:59:46 crc kubenswrapper[4939]: I0318 15:59:46.108970 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26f60b5c-7d32-4fea-b3ca-a8132f3ed026","Type":"ContainerDied","Data":"7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52"}
Mar 18 15:59:46 crc kubenswrapper[4939]: I0318 15:59:46.112735 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"d54c7d302d6af9bea4dac8164b0ce249c3aa366a90bdd01e9a0627f62c76b69d"}
Mar 18 15:59:46 crc kubenswrapper[4939]: I0318 15:59:46.823489 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-p8lbk"]
Mar 18 15:59:46 crc kubenswrapper[4939]: I0318 15:59:46.829683 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-p8lbk"]
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.142151 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7fb4ca4-fe24-43de-8098-5d1b0effa406" path="/var/lib/kubelet/pods/d7fb4ca4-fe24-43de-8098-5d1b0effa406/volumes"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.526023 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mp2sj" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 15:59:48 crc kubenswrapper[4939]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 15:59:48 crc kubenswrapper[4939]: >
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.567421 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-56pdq"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.618856 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-56pdq"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.827702 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mp2sj-config-7c2tf"]
Mar 18 15:59:48 crc kubenswrapper[4939]: E0318 15:59:48.828183 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7fb4ca4-fe24-43de-8098-5d1b0effa406" containerName="mariadb-account-create-update"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.828200 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7fb4ca4-fe24-43de-8098-5d1b0effa406" containerName="mariadb-account-create-update"
Mar 18 15:59:48 crc kubenswrapper[4939]: E0318 15:59:48.828220 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac75776-4245-474d-89c7-7002645a64c5" containerName="swift-ring-rebalance"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.828229 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac75776-4245-474d-89c7-7002645a64c5" containerName="swift-ring-rebalance"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.828449 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac75776-4245-474d-89c7-7002645a64c5" containerName="swift-ring-rebalance"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.828482 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7fb4ca4-fe24-43de-8098-5d1b0effa406" containerName="mariadb-account-create-update"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.829171 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.831808 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.849008 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mp2sj-config-7c2tf"]
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.959296 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zxrm\" (UniqueName: \"kubernetes.io/projected/e7342df2-81b0-44ac-a232-fee5911b813a-kube-api-access-2zxrm\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.959354 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-log-ovn\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.959454 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.959649 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-additional-scripts\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.959684 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run-ovn\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:48 crc kubenswrapper[4939]: I0318 15:59:48.959739 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-scripts\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061302 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zxrm\" (UniqueName: \"kubernetes.io/projected/e7342df2-81b0-44ac-a232-fee5911b813a-kube-api-access-2zxrm\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061351 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-log-ovn\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061377 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061690 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-log-ovn\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061777 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-additional-scripts\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061803 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run-ovn\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061831 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-scripts\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061774 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.061995 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run-ovn\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.062582 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-additional-scripts\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.064199 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-scripts\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.080306 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zxrm\" (UniqueName: \"kubernetes.io/projected/e7342df2-81b0-44ac-a232-fee5911b813a-kube-api-access-2zxrm\") pod \"ovn-controller-mp2sj-config-7c2tf\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") " pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:49 crc kubenswrapper[4939]: I0318 15:59:49.158693 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:51 crc kubenswrapper[4939]: I0318 15:59:51.830696 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8kz2r"]
Mar 18 15:59:51 crc kubenswrapper[4939]: I0318 15:59:51.832091 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:51 crc kubenswrapper[4939]: I0318 15:59:51.834543 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 18 15:59:51 crc kubenswrapper[4939]: I0318 15:59:51.853878 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8kz2r"]
Mar 18 15:59:51 crc kubenswrapper[4939]: I0318 15:59:51.930698 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gljsz\" (UniqueName: \"kubernetes.io/projected/21ff3991-7d14-48ca-ae7b-049eba4736e4-kube-api-access-gljsz\") pod \"root-account-create-update-8kz2r\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") " pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:51 crc kubenswrapper[4939]: I0318 15:59:51.930997 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21ff3991-7d14-48ca-ae7b-049eba4736e4-operator-scripts\") pod \"root-account-create-update-8kz2r\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") " pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:52 crc kubenswrapper[4939]: I0318 15:59:52.032356 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21ff3991-7d14-48ca-ae7b-049eba4736e4-operator-scripts\") pod \"root-account-create-update-8kz2r\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") " pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:52 crc kubenswrapper[4939]: I0318 15:59:52.032742 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gljsz\" (UniqueName: \"kubernetes.io/projected/21ff3991-7d14-48ca-ae7b-049eba4736e4-kube-api-access-gljsz\") pod \"root-account-create-update-8kz2r\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") " pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:52 crc kubenswrapper[4939]: I0318 15:59:52.033176 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21ff3991-7d14-48ca-ae7b-049eba4736e4-operator-scripts\") pod \"root-account-create-update-8kz2r\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") " pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:52 crc kubenswrapper[4939]: I0318 15:59:52.049694 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gljsz\" (UniqueName: \"kubernetes.io/projected/21ff3991-7d14-48ca-ae7b-049eba4736e4-kube-api-access-gljsz\") pod \"root-account-create-update-8kz2r\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") " pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:52 crc kubenswrapper[4939]: I0318 15:59:52.164377 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:53 crc kubenswrapper[4939]: I0318 15:59:53.678163 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mp2sj" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 15:59:53 crc kubenswrapper[4939]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 15:59:53 crc kubenswrapper[4939]: >
Mar 18 15:59:55 crc kubenswrapper[4939]: E0318 15:59:55.870550 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified"
Mar 18 15:59:55 crc kubenswrapper[4939]: E0318 15:59:55.871265 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwlx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-sss6f_openstack(59e6de1a-22a0-4166-9bf0-f8844e3e89c2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 15:59:55 crc kubenswrapper[4939]: E0318 15:59:55.872634 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-sss6f" podUID="59e6de1a-22a0-4166-9bf0-f8844e3e89c2"
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.196881 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26f60b5c-7d32-4fea-b3ca-a8132f3ed026","Type":"ContainerStarted","Data":"1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514"}
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.197399 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.200457 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"8577f19335a709c20a6140281f93e100fa6126302c44206e5de996f40595a70b"}
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.203070 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d850ac81-a29e-4e93-9fab-72b6325de52e","Type":"ContainerStarted","Data":"a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f"}
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.203627 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 15:59:56 crc kubenswrapper[4939]: E0318 15:59:56.203743 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-sss6f" podUID="59e6de1a-22a0-4166-9bf0-f8844e3e89c2"
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.226436 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=58.511794091 podStartE2EDuration="1m7.226418701s" podCreationTimestamp="2026-03-18 15:58:49 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.032215936 +0000 UTC m=+1306.631403567" lastFinishedPulling="2026-03-18 15:59:10.746840556 +0000 UTC m=+1315.346028177" observedRunningTime="2026-03-18 15:59:56.224176421 +0000 UTC m=+1360.823364072" watchObservedRunningTime="2026-03-18 15:59:56.226418701 +0000 UTC m=+1360.825606322"
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.261214 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=58.544659467 podStartE2EDuration="1m7.26118873s" podCreationTimestamp="2026-03-18 15:58:49 +0000 UTC" firstStartedPulling="2026-03-18 15:59:02.032368521 +0000 UTC m=+1306.631556142" lastFinishedPulling="2026-03-18 15:59:10.748897784 +0000 UTC m=+1315.348085405" observedRunningTime="2026-03-18 15:59:56.25080769 +0000 UTC m=+1360.849995311" watchObservedRunningTime="2026-03-18 15:59:56.26118873 +0000 UTC m=+1360.860376351"
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.306731 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mp2sj-config-7c2tf"]
Mar 18 15:59:56 crc kubenswrapper[4939]: W0318 15:59:56.308323 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7342df2_81b0_44ac_a232_fee5911b813a.slice/crio-8f91fa4d85c03a117b0a9cb9d99eb9deab361e866993600c499ff23d7c8cee63 WatchSource:0}: Error finding container 8f91fa4d85c03a117b0a9cb9d99eb9deab361e866993600c499ff23d7c8cee63: Status 404 returned error can't find the container with id 8f91fa4d85c03a117b0a9cb9d99eb9deab361e866993600c499ff23d7c8cee63
Mar 18 15:59:56 crc kubenswrapper[4939]: I0318 15:59:56.374020 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8kz2r"]
Mar 18 15:59:56 crc kubenswrapper[4939]: W0318 15:59:56.378317 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ff3991_7d14_48ca_ae7b_049eba4736e4.slice/crio-ef8dadeda036d6e6cff92f2c155b85fe22e2a4020ef454f66063e705857f8cc2 WatchSource:0}: Error finding container ef8dadeda036d6e6cff92f2c155b85fe22e2a4020ef454f66063e705857f8cc2: Status 404 returned error can't find the container with id ef8dadeda036d6e6cff92f2c155b85fe22e2a4020ef454f66063e705857f8cc2
Mar 18 15:59:57 crc kubenswrapper[4939]: I0318 15:59:57.211994 4939 generic.go:334] "Generic (PLEG): container finished" podID="e7342df2-81b0-44ac-a232-fee5911b813a" containerID="e1e5c6f6321fc70ca2c3f675371229a1d6d737697c3f271c25173d6a19cad769" exitCode=0
Mar 18 15:59:57 crc kubenswrapper[4939]: I0318 15:59:57.212305 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj-config-7c2tf" event={"ID":"e7342df2-81b0-44ac-a232-fee5911b813a","Type":"ContainerDied","Data":"e1e5c6f6321fc70ca2c3f675371229a1d6d737697c3f271c25173d6a19cad769"}
Mar 18 15:59:57 crc kubenswrapper[4939]: I0318 15:59:57.212334 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj-config-7c2tf" event={"ID":"e7342df2-81b0-44ac-a232-fee5911b813a","Type":"ContainerStarted","Data":"8f91fa4d85c03a117b0a9cb9d99eb9deab361e866993600c499ff23d7c8cee63"}
Mar 18 15:59:57 crc kubenswrapper[4939]: I0318 15:59:57.218015 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"eb4726099480d94bfcadfcc8ac5e8c7e0a22e0445d91b54b8bd947be6096f473"}
Mar 18 15:59:57 crc kubenswrapper[4939]: I0318 15:59:57.219422 4939 generic.go:334] "Generic (PLEG): container finished" podID="21ff3991-7d14-48ca-ae7b-049eba4736e4" containerID="66593bb867c511d8f535e9c33ce158533f67eeac1a915004bfc396a9f8eaf965" exitCode=0
Mar 18 15:59:57 crc kubenswrapper[4939]: I0318 15:59:57.220605 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8kz2r" event={"ID":"21ff3991-7d14-48ca-ae7b-049eba4736e4","Type":"ContainerDied","Data":"66593bb867c511d8f535e9c33ce158533f67eeac1a915004bfc396a9f8eaf965"}
Mar 18 15:59:57 crc kubenswrapper[4939]: I0318 15:59:57.220636 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8kz2r" event={"ID":"21ff3991-7d14-48ca-ae7b-049eba4736e4","Type":"ContainerStarted","Data":"ef8dadeda036d6e6cff92f2c155b85fe22e2a4020ef454f66063e705857f8cc2"}
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.231008 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"f258992948b744aaf9d67e3c6a706143ae30356b748f5de908345ee552ac4c49"}
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.577660 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mp2sj"
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.610688 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-7c2tf"
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.654654 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8kz2r"
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760462 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run\") pod \"e7342df2-81b0-44ac-a232-fee5911b813a\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760527 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-additional-scripts\") pod \"e7342df2-81b0-44ac-a232-fee5911b813a\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760599 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zxrm\" (UniqueName: \"kubernetes.io/projected/e7342df2-81b0-44ac-a232-fee5911b813a-kube-api-access-2zxrm\") pod \"e7342df2-81b0-44ac-a232-fee5911b813a\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760644 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21ff3991-7d14-48ca-ae7b-049eba4736e4-operator-scripts\") pod \"21ff3991-7d14-48ca-ae7b-049eba4736e4\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760673 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gljsz\" (UniqueName: \"kubernetes.io/projected/21ff3991-7d14-48ca-ae7b-049eba4736e4-kube-api-access-gljsz\") pod \"21ff3991-7d14-48ca-ae7b-049eba4736e4\" (UID: \"21ff3991-7d14-48ca-ae7b-049eba4736e4\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760694 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run-ovn\") pod \"e7342df2-81b0-44ac-a232-fee5911b813a\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760786 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-scripts\") pod \"e7342df2-81b0-44ac-a232-fee5911b813a\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760833 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-log-ovn\") pod \"e7342df2-81b0-44ac-a232-fee5911b813a\" (UID: \"e7342df2-81b0-44ac-a232-fee5911b813a\") "
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760586 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run" (OuterVolumeSpecName: "var-run") pod "e7342df2-81b0-44ac-a232-fee5911b813a" (UID: "e7342df2-81b0-44ac-a232-fee5911b813a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.760928 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e7342df2-81b0-44ac-a232-fee5911b813a" (UID: "e7342df2-81b0-44ac-a232-fee5911b813a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.761037 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e7342df2-81b0-44ac-a232-fee5911b813a" (UID: "e7342df2-81b0-44ac-a232-fee5911b813a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.761293 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e7342df2-81b0-44ac-a232-fee5911b813a" (UID: "e7342df2-81b0-44ac-a232-fee5911b813a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.761330 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ff3991-7d14-48ca-ae7b-049eba4736e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21ff3991-7d14-48ca-ae7b-049eba4736e4" (UID: "21ff3991-7d14-48ca-ae7b-049eba4736e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.761363 4939 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.761381 4939 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.761394 4939 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7342df2-81b0-44ac-a232-fee5911b813a-var-run\") on node \"crc\" DevicePath \"\""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.761729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-scripts" (OuterVolumeSpecName: "scripts") pod "e7342df2-81b0-44ac-a232-fee5911b813a" (UID: "e7342df2-81b0-44ac-a232-fee5911b813a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.765016 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7342df2-81b0-44ac-a232-fee5911b813a-kube-api-access-2zxrm" (OuterVolumeSpecName: "kube-api-access-2zxrm") pod "e7342df2-81b0-44ac-a232-fee5911b813a" (UID: "e7342df2-81b0-44ac-a232-fee5911b813a"). InnerVolumeSpecName "kube-api-access-2zxrm".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.765247 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ff3991-7d14-48ca-ae7b-049eba4736e4-kube-api-access-gljsz" (OuterVolumeSpecName: "kube-api-access-gljsz") pod "21ff3991-7d14-48ca-ae7b-049eba4736e4" (UID: "21ff3991-7d14-48ca-ae7b-049eba4736e4"). InnerVolumeSpecName "kube-api-access-gljsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.862897 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.862939 4939 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7342df2-81b0-44ac-a232-fee5911b813a-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.862953 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zxrm\" (UniqueName: \"kubernetes.io/projected/e7342df2-81b0-44ac-a232-fee5911b813a-kube-api-access-2zxrm\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.862967 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21ff3991-7d14-48ca-ae7b-049eba4736e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:58 crc kubenswrapper[4939]: I0318 15:59:58.862978 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gljsz\" (UniqueName: \"kubernetes.io/projected/21ff3991-7d14-48ca-ae7b-049eba4736e4-kube-api-access-gljsz\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.239778 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8kz2r" event={"ID":"21ff3991-7d14-48ca-ae7b-049eba4736e4","Type":"ContainerDied","Data":"ef8dadeda036d6e6cff92f2c155b85fe22e2a4020ef454f66063e705857f8cc2"} Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.240135 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef8dadeda036d6e6cff92f2c155b85fe22e2a4020ef454f66063e705857f8cc2" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.239787 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8kz2r" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.241641 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj-config-7c2tf" event={"ID":"e7342df2-81b0-44ac-a232-fee5911b813a","Type":"ContainerDied","Data":"8f91fa4d85c03a117b0a9cb9d99eb9deab361e866993600c499ff23d7c8cee63"} Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.241664 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-7c2tf" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.241682 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f91fa4d85c03a117b0a9cb9d99eb9deab361e866993600c499ff23d7c8cee63" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.246215 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"81945648d6ccfe19dbfc6da6c8d0e335483a55dcd4b1766b29052a5ed772cad1"} Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.246259 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"43de3851a996ae5fb148b392f668b55f5e52a20759062bbd85dc2119439767c6"} Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.246274 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"f363212ee4f25a7395df3d0d667028ad88acc708868cd9d1bc2c2e84543c3dda"} Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.751181 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mp2sj-config-7c2tf"] Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.761026 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mp2sj-config-7c2tf"] Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.824633 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mp2sj-config-kr65w"] Mar 18 15:59:59 crc kubenswrapper[4939]: E0318 15:59:59.825044 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ff3991-7d14-48ca-ae7b-049eba4736e4" containerName="mariadb-account-create-update" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.825065 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ff3991-7d14-48ca-ae7b-049eba4736e4" containerName="mariadb-account-create-update" Mar 18 15:59:59 crc kubenswrapper[4939]: E0318 15:59:59.825093 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7342df2-81b0-44ac-a232-fee5911b813a" containerName="ovn-config" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.825101 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7342df2-81b0-44ac-a232-fee5911b813a" containerName="ovn-config" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.825283 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7342df2-81b0-44ac-a232-fee5911b813a" containerName="ovn-config" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.825311 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ff3991-7d14-48ca-ae7b-049eba4736e4" containerName="mariadb-account-create-update" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.826062 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.830798 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.836167 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mp2sj-config-kr65w"] Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.980909 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.980981 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-scripts\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.981006 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run-ovn\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.981033 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-log-ovn\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.981074 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvxj\" (UniqueName: \"kubernetes.io/projected/b73d98e7-da33-40e5-9e08-6ecc4195bd32-kube-api-access-qrvxj\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 15:59:59 crc kubenswrapper[4939]: I0318 15:59:59.981103 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-additional-scripts\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.082019 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-additional-scripts\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.082401 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.082445 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-scripts\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.082463 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run-ovn\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.082490 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-log-ovn\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.082548 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvxj\" (UniqueName: \"kubernetes.io/projected/b73d98e7-da33-40e5-9e08-6ecc4195bd32-kube-api-access-qrvxj\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.082871 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-additional-scripts\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.083078 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.083078 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-log-ovn\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.083281 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run-ovn\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.084561 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-scripts\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.098116 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvxj\" (UniqueName: \"kubernetes.io/projected/b73d98e7-da33-40e5-9e08-6ecc4195bd32-kube-api-access-qrvxj\") pod \"ovn-controller-mp2sj-config-kr65w\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.134365 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564160-k2bl4"] Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.139484 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-k2bl4" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.146092 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.146533 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7342df2-81b0-44ac-a232-fee5911b813a" path="/var/lib/kubelet/pods/e7342df2-81b0-44ac-a232-fee5911b813a/volumes" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.148170 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-k2bl4"] Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.173567 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.173660 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.174186 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.247699 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7"] Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.256187 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.259245 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.260373 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.262461 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7"] Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.290860 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8g6\" (UniqueName: \"kubernetes.io/projected/9f623300-216b-4e06-88bc-9e7443e5bd62-kube-api-access-pb8g6\") pod \"auto-csr-approver-29564160-k2bl4\" (UID: \"9f623300-216b-4e06-88bc-9e7443e5bd62\") " pod="openshift-infra/auto-csr-approver-29564160-k2bl4" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.392687 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-secret-volume\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.392919 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-config-volume\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.392987 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljf5c\" (UniqueName: \"kubernetes.io/projected/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-kube-api-access-ljf5c\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.393016 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8g6\" (UniqueName: \"kubernetes.io/projected/9f623300-216b-4e06-88bc-9e7443e5bd62-kube-api-access-pb8g6\") pod \"auto-csr-approver-29564160-k2bl4\" (UID: \"9f623300-216b-4e06-88bc-9e7443e5bd62\") " pod="openshift-infra/auto-csr-approver-29564160-k2bl4" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.408916 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8g6\" (UniqueName: \"kubernetes.io/projected/9f623300-216b-4e06-88bc-9e7443e5bd62-kube-api-access-pb8g6\") pod \"auto-csr-approver-29564160-k2bl4\" (UID: \"9f623300-216b-4e06-88bc-9e7443e5bd62\") " pod="openshift-infra/auto-csr-approver-29564160-k2bl4" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.494429 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-secret-volume\") pod 
\"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.494492 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-config-volume\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.494601 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljf5c\" (UniqueName: \"kubernetes.io/projected/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-kube-api-access-ljf5c\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.496315 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-config-volume\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.499050 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-secret-volume\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.513124 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljf5c\" (UniqueName: \"kubernetes.io/projected/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-kube-api-access-ljf5c\") pod \"collect-profiles-29564160-mgvq7\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.607324 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-k2bl4" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.627012 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.703399 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mp2sj-config-kr65w"] Mar 18 16:00:00 crc kubenswrapper[4939]: I0318 16:00:00.969377 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7"] Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.022035 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-k2bl4"] Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.268175 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"6516dfe8b15001572d08172fcd22e95cd68f78a081a555cea105c5a94f42e2b7"} Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.268582 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"49a4e678ca0aa2dc3a78eb3ed7a1fd937781bddd10e5dd6b8eefa87915967bf0"} Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.268599 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"5c713ae91799f792040bf3d961ee93903e389af2820041b641974aeae0138dc0"} Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.268612 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"d04e69f4c87e17dbd5c2a71159726ed1fbba2fdd619b794a36df627a482cb171"} Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.269993 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj-config-kr65w" event={"ID":"b73d98e7-da33-40e5-9e08-6ecc4195bd32","Type":"ContainerStarted","Data":"ac2355cd40bf3621e31d049754e654f7abb1cb568a567ccfcc24ad2dd9a0396f"} Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.271140 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-k2bl4" event={"ID":"9f623300-216b-4e06-88bc-9e7443e5bd62","Type":"ContainerStarted","Data":"37da0d7b670868325ffc7e30725690e32d59083b31ade031066fe41b5e8cf2f2"} Mar 18 16:00:01 crc kubenswrapper[4939]: I0318 16:00:01.272614 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" event={"ID":"fa194de0-0ca0-4455-8b05-bc0c4f4bb012","Type":"ContainerStarted","Data":"0c44ce27e27d6475d1e59d3b316f7c4c633619de524d709ea5e4a2a6faf3e736"} Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.284230 4939 generic.go:334] "Generic (PLEG): container finished" podID="b73d98e7-da33-40e5-9e08-6ecc4195bd32" containerID="e5f58622c9811a365c2a3af21b7ebad75c3439fdc7d6e01b133e9a79046b8726" exitCode=0 Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.284299 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj-config-kr65w" event={"ID":"b73d98e7-da33-40e5-9e08-6ecc4195bd32","Type":"ContainerDied","Data":"e5f58622c9811a365c2a3af21b7ebad75c3439fdc7d6e01b133e9a79046b8726"} Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.287702 4939 generic.go:334] "Generic (PLEG): container finished" 
podID="fa194de0-0ca0-4455-8b05-bc0c4f4bb012" containerID="5518cf49e77b3fd1887a505d35c9a4dc8f3f7d00238b2b5bbfc9686b720de62b" exitCode=0 Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.287746 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" event={"ID":"fa194de0-0ca0-4455-8b05-bc0c4f4bb012","Type":"ContainerDied","Data":"5518cf49e77b3fd1887a505d35c9a4dc8f3f7d00238b2b5bbfc9686b720de62b"} Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.294341 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"e9186d27f8cbffc2e014f48debc8242aa5db203078eb8960a71ebcd03018d88d"} Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.294387 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"17b37072545ad8d250412a3ed598381f9883f45412fd6cc5eb64b9e9b471819c"} Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.294399 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerStarted","Data":"57c5515f8a5b3530a17121b168d8678bb05de6d5ca4a2d702308291f1fcd2d80"} Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.342982 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.736333296 podStartE2EDuration="37.342960349s" podCreationTimestamp="2026-03-18 15:59:25 +0000 UTC" firstStartedPulling="2026-03-18 15:59:43.372355234 +0000 UTC m=+1347.971542855" lastFinishedPulling="2026-03-18 15:59:59.978982287 +0000 UTC m=+1364.578169908" observedRunningTime="2026-03-18 16:00:02.333873827 +0000 UTC m=+1366.933061458" watchObservedRunningTime="2026-03-18 16:00:02.342960349 +0000 UTC m=+1366.942147970" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.642648 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-jjghp"] Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.643954 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.646102 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.665946 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-jjghp"] Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.732495 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-config\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.732624 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.732699 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.732762 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.732788 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdslh\" (UniqueName: \"kubernetes.io/projected/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-kube-api-access-vdslh\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.732816 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.834206 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.834270 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdslh\" (UniqueName: \"kubernetes.io/projected/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-kube-api-access-vdslh\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: 
\"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.834301 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.834330 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-config\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.834383 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.834626 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.835534 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.835540 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.835577 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.835773 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.835834 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-config\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: 
I0318 16:00:02.853390 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdslh\" (UniqueName: \"kubernetes.io/projected/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-kube-api-access-vdslh\") pod \"dnsmasq-dns-764c5664d7-jjghp\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:02 crc kubenswrapper[4939]: I0318 16:00:02.962601 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.440740 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-jjghp"] Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.806105 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.813321 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852121 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run-ovn\") pod \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852166 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-additional-scripts\") pod \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852189 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrvxj\" (UniqueName: \"kubernetes.io/projected/b73d98e7-da33-40e5-9e08-6ecc4195bd32-kube-api-access-qrvxj\") pod \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852326 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run\") pod \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852349 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-scripts\") pod \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852369 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-log-ovn\") pod \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\" (UID: \"b73d98e7-da33-40e5-9e08-6ecc4195bd32\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852761 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b73d98e7-da33-40e5-9e08-6ecc4195bd32" (UID: 
"b73d98e7-da33-40e5-9e08-6ecc4195bd32"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852796 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b73d98e7-da33-40e5-9e08-6ecc4195bd32" (UID: "b73d98e7-da33-40e5-9e08-6ecc4195bd32"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.852810 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run" (OuterVolumeSpecName: "var-run") pod "b73d98e7-da33-40e5-9e08-6ecc4195bd32" (UID: "b73d98e7-da33-40e5-9e08-6ecc4195bd32"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.854411 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b73d98e7-da33-40e5-9e08-6ecc4195bd32" (UID: "b73d98e7-da33-40e5-9e08-6ecc4195bd32"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.854533 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-scripts" (OuterVolumeSpecName: "scripts") pod "b73d98e7-da33-40e5-9e08-6ecc4195bd32" (UID: "b73d98e7-da33-40e5-9e08-6ecc4195bd32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.860445 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73d98e7-da33-40e5-9e08-6ecc4195bd32-kube-api-access-qrvxj" (OuterVolumeSpecName: "kube-api-access-qrvxj") pod "b73d98e7-da33-40e5-9e08-6ecc4195bd32" (UID: "b73d98e7-da33-40e5-9e08-6ecc4195bd32"). InnerVolumeSpecName "kube-api-access-qrvxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.954150 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-config-volume\") pod \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.954496 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-secret-volume\") pod \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.954714 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljf5c\" (UniqueName: \"kubernetes.io/projected/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-kube-api-access-ljf5c\") pod \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\" (UID: \"fa194de0-0ca0-4455-8b05-bc0c4f4bb012\") " Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.954874 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa194de0-0ca0-4455-8b05-bc0c4f4bb012" (UID: "fa194de0-0ca0-4455-8b05-bc0c4f4bb012"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.955317 4939 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.955349 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.955369 4939 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.955386 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.955406 4939 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73d98e7-da33-40e5-9e08-6ecc4195bd32-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.955423 4939 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b73d98e7-da33-40e5-9e08-6ecc4195bd32-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.955441 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrvxj\" (UniqueName: \"kubernetes.io/projected/b73d98e7-da33-40e5-9e08-6ecc4195bd32-kube-api-access-qrvxj\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.961770 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa194de0-0ca0-4455-8b05-bc0c4f4bb012" (UID: "fa194de0-0ca0-4455-8b05-bc0c4f4bb012"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:03 crc kubenswrapper[4939]: I0318 16:00:03.964074 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-kube-api-access-ljf5c" (OuterVolumeSpecName: "kube-api-access-ljf5c") pod "fa194de0-0ca0-4455-8b05-bc0c4f4bb012" (UID: "fa194de0-0ca0-4455-8b05-bc0c4f4bb012"). InnerVolumeSpecName "kube-api-access-ljf5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.057030 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.057074 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljf5c\" (UniqueName: \"kubernetes.io/projected/fa194de0-0ca0-4455-8b05-bc0c4f4bb012-kube-api-access-ljf5c\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.313572 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj-config-kr65w" event={"ID":"b73d98e7-da33-40e5-9e08-6ecc4195bd32","Type":"ContainerDied","Data":"ac2355cd40bf3621e31d049754e654f7abb1cb568a567ccfcc24ad2dd9a0396f"} Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.313624 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac2355cd40bf3621e31d049754e654f7abb1cb568a567ccfcc24ad2dd9a0396f" Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.313749 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj-config-kr65w" Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.318366 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" event={"ID":"fa194de0-0ca0-4455-8b05-bc0c4f4bb012","Type":"ContainerDied","Data":"0c44ce27e27d6475d1e59d3b316f7c4c633619de524d709ea5e4a2a6faf3e736"} Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.318410 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c44ce27e27d6475d1e59d3b316f7c4c633619de524d709ea5e4a2a6faf3e736" Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.318474 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7" Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.322463 4939 generic.go:334] "Generic (PLEG): container finished" podID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerID="dd35eeb7824c7e7eb1a9e0b90606e31e95956ab43d48d2229cafbed117b809ba" exitCode=0 Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.322519 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" event={"ID":"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e","Type":"ContainerDied","Data":"dd35eeb7824c7e7eb1a9e0b90606e31e95956ab43d48d2229cafbed117b809ba"} Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.322571 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" event={"ID":"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e","Type":"ContainerStarted","Data":"7f2a26d1322ea1c09f52b71348e8ea45a6bcb3ce76ff9faa3c90338458468e16"} Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.904664 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mp2sj-config-kr65w"] Mar 18 16:00:04 crc kubenswrapper[4939]: I0318 16:00:04.931260 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mp2sj-config-kr65w"] Mar 18 16:00:05 crc kubenswrapper[4939]: I0318 16:00:05.332770 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" event={"ID":"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e","Type":"ContainerStarted","Data":"a03eccb7e9916ce5fe6ed7588687d3a05db26cb96894e7505653140de5f70298"} Mar 18 16:00:05 crc kubenswrapper[4939]: I0318 16:00:05.332860 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:05 crc kubenswrapper[4939]: I0318 16:00:05.335309 4939 generic.go:334] "Generic (PLEG): container finished" podID="9f623300-216b-4e06-88bc-9e7443e5bd62" containerID="3cc0ce0b489cf01df6bd48ac72f09591afcecee614c2b58e8147daf7f614c2d1" exitCode=0 Mar 18 16:00:05 crc kubenswrapper[4939]: I0318 16:00:05.335350 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-k2bl4" event={"ID":"9f623300-216b-4e06-88bc-9e7443e5bd62","Type":"ContainerDied","Data":"3cc0ce0b489cf01df6bd48ac72f09591afcecee614c2b58e8147daf7f614c2d1"} Mar 18 16:00:05 crc kubenswrapper[4939]: I0318 16:00:05.356319 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" podStartSLOduration=3.356297958 podStartE2EDuration="3.356297958s" podCreationTimestamp="2026-03-18 16:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:05.352458876 +0000 UTC m=+1369.951646507" watchObservedRunningTime="2026-03-18 16:00:05.356297958 +0000 UTC m=+1369.955485579" Mar 18 16:00:06 crc kubenswrapper[4939]: I0318 16:00:06.150394 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73d98e7-da33-40e5-9e08-6ecc4195bd32" path="/var/lib/kubelet/pods/b73d98e7-da33-40e5-9e08-6ecc4195bd32/volumes" Mar 18 16:00:06 crc kubenswrapper[4939]: I0318 16:00:06.659221 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-k2bl4" Mar 18 16:00:06 crc kubenswrapper[4939]: I0318 16:00:06.700055 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8g6\" (UniqueName: \"kubernetes.io/projected/9f623300-216b-4e06-88bc-9e7443e5bd62-kube-api-access-pb8g6\") pod \"9f623300-216b-4e06-88bc-9e7443e5bd62\" (UID: \"9f623300-216b-4e06-88bc-9e7443e5bd62\") " Mar 18 16:00:06 crc kubenswrapper[4939]: I0318 16:00:06.708059 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f623300-216b-4e06-88bc-9e7443e5bd62-kube-api-access-pb8g6" (OuterVolumeSpecName: "kube-api-access-pb8g6") pod "9f623300-216b-4e06-88bc-9e7443e5bd62" (UID: "9f623300-216b-4e06-88bc-9e7443e5bd62"). InnerVolumeSpecName "kube-api-access-pb8g6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:06 crc kubenswrapper[4939]: I0318 16:00:06.801680 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb8g6\" (UniqueName: \"kubernetes.io/projected/9f623300-216b-4e06-88bc-9e7443e5bd62-kube-api-access-pb8g6\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:07 crc kubenswrapper[4939]: I0318 16:00:07.357996 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-k2bl4" event={"ID":"9f623300-216b-4e06-88bc-9e7443e5bd62","Type":"ContainerDied","Data":"37da0d7b670868325ffc7e30725690e32d59083b31ade031066fe41b5e8cf2f2"} Mar 18 16:00:07 crc kubenswrapper[4939]: I0318 16:00:07.358068 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37da0d7b670868325ffc7e30725690e32d59083b31ade031066fe41b5e8cf2f2" Mar 18 16:00:07 crc kubenswrapper[4939]: I0318 16:00:07.358091 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-k2bl4" Mar 18 16:00:07 crc kubenswrapper[4939]: I0318 16:00:07.730049 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-7qb2m"] Mar 18 16:00:07 crc kubenswrapper[4939]: I0318 16:00:07.736779 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-7qb2m"] Mar 18 16:00:08 crc kubenswrapper[4939]: I0318 16:00:08.149400 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0153aa-53a2-47f6-9aa8-91d1dde946ec" path="/var/lib/kubelet/pods/fe0153aa-53a2-47f6-9aa8-91d1dde946ec/volumes" Mar 18 16:00:10 crc kubenswrapper[4939]: I0318 16:00:10.598777 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:00:10 crc kubenswrapper[4939]: I0318 16:00:10.894719 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 16:00:11 crc kubenswrapper[4939]: I0318 16:00:11.393637 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sss6f" event={"ID":"59e6de1a-22a0-4166-9bf0-f8844e3e89c2","Type":"ContainerStarted","Data":"4e7870f3e02bd7c6cf97984679ad07a4746c1bbcf22555811321ad8627e7b595"} Mar 18 16:00:11 crc kubenswrapper[4939]: I0318 16:00:11.415218 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sss6f" podStartSLOduration=2.599460285 podStartE2EDuration="33.415201269s" podCreationTimestamp="2026-03-18 15:59:38 +0000 UTC" firstStartedPulling="2026-03-18 15:59:39.882944775 +0000 UTC m=+1344.482132396" lastFinishedPulling="2026-03-18 16:00:10.698685759 +0000 UTC m=+1375.297873380" observedRunningTime="2026-03-18 16:00:11.411266606 +0000 UTC m=+1376.010454227" watchObservedRunningTime="2026-03-18 16:00:11.415201269 +0000 UTC m=+1376.014388890" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.447855 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-c29v8"] Mar 18 16:00:12 crc kubenswrapper[4939]: E0318 16:00:12.448588 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa194de0-0ca0-4455-8b05-bc0c4f4bb012" containerName="collect-profiles" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.448601 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa194de0-0ca0-4455-8b05-bc0c4f4bb012" containerName="collect-profiles" Mar 18 16:00:12 crc kubenswrapper[4939]: E0318 16:00:12.448630 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73d98e7-da33-40e5-9e08-6ecc4195bd32" containerName="ovn-config" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.448636 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73d98e7-da33-40e5-9e08-6ecc4195bd32" containerName="ovn-config" Mar 18 16:00:12 crc kubenswrapper[4939]: E0318 16:00:12.448654 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f623300-216b-4e06-88bc-9e7443e5bd62" containerName="oc" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.448661 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f623300-216b-4e06-88bc-9e7443e5bd62" containerName="oc" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.448835 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa194de0-0ca0-4455-8b05-bc0c4f4bb012" containerName="collect-profiles" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.448851 4939 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b73d98e7-da33-40e5-9e08-6ecc4195bd32" containerName="ovn-config" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.448869 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f623300-216b-4e06-88bc-9e7443e5bd62" containerName="oc" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.449364 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.468840 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-c29v8"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.520533 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181cba20-17ba-4fdd-9843-e452e9e2cce9-operator-scripts\") pod \"cinder-db-create-c29v8\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.520593 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gdz\" (UniqueName: \"kubernetes.io/projected/181cba20-17ba-4fdd-9843-e452e9e2cce9-kube-api-access-q6gdz\") pod \"cinder-db-create-c29v8\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.623056 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181cba20-17ba-4fdd-9843-e452e9e2cce9-operator-scripts\") pod \"cinder-db-create-c29v8\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.623119 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6gdz\" (UniqueName: \"kubernetes.io/projected/181cba20-17ba-4fdd-9843-e452e9e2cce9-kube-api-access-q6gdz\") pod \"cinder-db-create-c29v8\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.624106 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181cba20-17ba-4fdd-9843-e452e9e2cce9-operator-scripts\") pod \"cinder-db-create-c29v8\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.646224 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-xwzl9"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.647526 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.650632 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6gdz\" (UniqueName: \"kubernetes.io/projected/181cba20-17ba-4fdd-9843-e452e9e2cce9-kube-api-access-q6gdz\") pod \"cinder-db-create-c29v8\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.668219 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xwzl9"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.698355 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-94b1-account-create-update-pv4z6"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.699715 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.704881 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.705157 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-94b1-account-create-update-pv4z6"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.724297 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-operator-scripts\") pod \"cinder-94b1-account-create-update-pv4z6\" (UID: \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.724390 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4b4\" (UniqueName: \"kubernetes.io/projected/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-kube-api-access-jh4b4\") pod \"barbican-db-create-xwzl9\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.724416 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-operator-scripts\") pod \"barbican-db-create-xwzl9\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.724525 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdktf\" (UniqueName: \"kubernetes.io/projected/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-kube-api-access-pdktf\") pod \"cinder-94b1-account-create-update-pv4z6\" (UID: \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.766804 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2ncc9"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.767919 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.793791 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2ncc9"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.809148 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.828000 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac644e3b-085b-406a-965c-6c68407003e5-operator-scripts\") pod \"neutron-db-create-2ncc9\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.828076 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfjp\" (UniqueName: \"kubernetes.io/projected/ac644e3b-085b-406a-965c-6c68407003e5-kube-api-access-xnfjp\") pod \"neutron-db-create-2ncc9\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.828155 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdktf\" (UniqueName: \"kubernetes.io/projected/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-kube-api-access-pdktf\") pod \"cinder-94b1-account-create-update-pv4z6\" (UID: \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.828252 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-operator-scripts\") pod \"cinder-94b1-account-create-update-pv4z6\" (UID: \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.828321 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh4b4\" (UniqueName: \"kubernetes.io/projected/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-kube-api-access-jh4b4\") pod \"barbican-db-create-xwzl9\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.828347 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-operator-scripts\") pod \"barbican-db-create-xwzl9\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.829183 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-operator-scripts\") pod \"barbican-db-create-xwzl9\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.830164 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-operator-scripts\") pod \"cinder-94b1-account-create-update-pv4z6\" (UID: 
\"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.853517 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdktf\" (UniqueName: \"kubernetes.io/projected/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-kube-api-access-pdktf\") pod \"cinder-94b1-account-create-update-pv4z6\" (UID: \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.864925 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4b4\" (UniqueName: \"kubernetes.io/projected/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-kube-api-access-jh4b4\") pod \"barbican-db-create-xwzl9\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.873916 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-86d4-account-create-update-ztlll"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.876944 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.887945 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.916423 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86d4-account-create-update-ztlll"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.929597 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac644e3b-085b-406a-965c-6c68407003e5-operator-scripts\") pod \"neutron-db-create-2ncc9\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.929654 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnfjp\" (UniqueName: \"kubernetes.io/projected/ac644e3b-085b-406a-965c-6c68407003e5-kube-api-access-xnfjp\") pod \"neutron-db-create-2ncc9\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.929710 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlrz4\" (UniqueName: \"kubernetes.io/projected/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-kube-api-access-vlrz4\") pod \"barbican-86d4-account-create-update-ztlll\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.929785 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-operator-scripts\") pod \"barbican-86d4-account-create-update-ztlll\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.930813 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac644e3b-085b-406a-965c-6c68407003e5-operator-scripts\") pod 
\"neutron-db-create-2ncc9\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.969669 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.973269 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfjp\" (UniqueName: \"kubernetes.io/projected/ac644e3b-085b-406a-965c-6c68407003e5-kube-api-access-xnfjp\") pod \"neutron-db-create-2ncc9\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.989978 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ac46-account-create-update-wf798"] Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.990991 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:12 crc kubenswrapper[4939]: I0318 16:00:12.997865 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.009496 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ac46-account-create-update-wf798"] Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.029872 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.031191 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be63e5e-e444-4c39-adac-35698d2bb045-operator-scripts\") pod \"neutron-ac46-account-create-update-wf798\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.031276 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlrz4\" (UniqueName: \"kubernetes.io/projected/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-kube-api-access-vlrz4\") pod \"barbican-86d4-account-create-update-ztlll\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.031341 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-operator-scripts\") pod \"barbican-86d4-account-create-update-ztlll\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.031372 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4qb\" (UniqueName: \"kubernetes.io/projected/6be63e5e-e444-4c39-adac-35698d2bb045-kube-api-access-9x4qb\") pod \"neutron-ac46-account-create-update-wf798\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.040229 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-operator-scripts\") 
pod \"barbican-86d4-account-create-update-ztlll\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.051150 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.063514 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vw2p8"] Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.064651 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.069386 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5prk2" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.069659 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.069691 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.069786 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.078291 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-d2bg8"] Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.085948 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlrz4\" (UniqueName: \"kubernetes.io/projected/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-kube-api-access-vlrz4\") pod \"barbican-86d4-account-create-update-ztlll\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.092739 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.093873 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-d2bg8" podUID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerName="dnsmasq-dns" containerID="cri-o://00cd63fb3658088fecc683853dba0e69afd4b7e3fdc1e25dabea499aedeb4916" gracePeriod=10 Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.107099 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vw2p8"] Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.132579 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be63e5e-e444-4c39-adac-35698d2bb045-operator-scripts\") pod \"neutron-ac46-account-create-update-wf798\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.132633 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-config-data\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.132662 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5z2c\" (UniqueName: \"kubernetes.io/projected/635a2d97-8489-4a21-87ef-a30663aa441e-kube-api-access-q5z2c\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.132888 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4qb\" (UniqueName: \"kubernetes.io/projected/6be63e5e-e444-4c39-adac-35698d2bb045-kube-api-access-9x4qb\") pod \"neutron-ac46-account-create-update-wf798\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.132918 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-combined-ca-bundle\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.133379 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be63e5e-e444-4c39-adac-35698d2bb045-operator-scripts\") pod \"neutron-ac46-account-create-update-wf798\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.168128 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4qb\" (UniqueName: \"kubernetes.io/projected/6be63e5e-e444-4c39-adac-35698d2bb045-kube-api-access-9x4qb\") pod \"neutron-ac46-account-create-update-wf798\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.237413 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-combined-ca-bundle\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.237541 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-config-data\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.237563 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5z2c\" (UniqueName: \"kubernetes.io/projected/635a2d97-8489-4a21-87ef-a30663aa441e-kube-api-access-q5z2c\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.245741 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-config-data\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.246370 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-combined-ca-bundle\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.258558 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.263478 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5z2c\" (UniqueName: \"kubernetes.io/projected/635a2d97-8489-4a21-87ef-a30663aa441e-kube-api-access-q5z2c\") pod \"keystone-db-sync-vw2p8\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.343782 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.430901 4939 generic.go:334] "Generic (PLEG): container finished" podID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerID="00cd63fb3658088fecc683853dba0e69afd4b7e3fdc1e25dabea499aedeb4916" exitCode=0 Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.430947 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-d2bg8" event={"ID":"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5","Type":"ContainerDied","Data":"00cd63fb3658088fecc683853dba0e69afd4b7e3fdc1e25dabea499aedeb4916"} Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.470878 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.581842 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-c29v8"] Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.773923 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-94b1-account-create-update-pv4z6"] Mar 18 16:00:13 crc kubenswrapper[4939]: W0318 16:00:13.785709 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9c18dc2_e9c2_4d01_b076_207c4c21eb12.slice/crio-c14b68e02bbdfb4b779f924875ad5b35febab56ddfa760e359111ad42d8e8fa4 WatchSource:0}: Error finding container c14b68e02bbdfb4b779f924875ad5b35febab56ddfa760e359111ad42d8e8fa4: Status 404 returned error can't find the container with id c14b68e02bbdfb4b779f924875ad5b35febab56ddfa760e359111ad42d8e8fa4 Mar 18 16:00:13 crc kubenswrapper[4939]: I0318 16:00:13.958686 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.013144 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xwzl9"] Mar 18 16:00:14 crc kubenswrapper[4939]: W0318 16:00:14.021241 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecfc0fae_1947_4c07_9be5_6ce1b49d0d15.slice/crio-255cf16af4d8fc564881d9f8bac74d39f97abcd9305c248eab3fb68c43afacd0 WatchSource:0}: Error finding container 255cf16af4d8fc564881d9f8bac74d39f97abcd9305c248eab3fb68c43afacd0: Status 404 returned error can't find the container with id 255cf16af4d8fc564881d9f8bac74d39f97abcd9305c248eab3fb68c43afacd0 Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.070152 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5jkg\" (UniqueName: \"kubernetes.io/projected/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-kube-api-access-p5jkg\") pod \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.070193 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-dns-svc\") pod \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.070300 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-config\") pod \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.070341 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-nb\") pod \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\" (UID: \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.070377 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-sb\") pod \"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\" (UID: 
\"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5\") " Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.080545 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-kube-api-access-p5jkg" (OuterVolumeSpecName: "kube-api-access-p5jkg") pod "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" (UID: "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5"). InnerVolumeSpecName "kube-api-access-p5jkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.134963 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" (UID: "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.135359 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" (UID: "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.135646 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-config" (OuterVolumeSpecName: "config") pod "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" (UID: "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.151205 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" (UID: "d1c6bcfa-cf4f-4389-b090-48b9b798fbe5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.175619 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.175647 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.175662 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.175673 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5jkg\" (UniqueName: \"kubernetes.io/projected/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-kube-api-access-p5jkg\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.175682 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.237907 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ac46-account-create-update-wf798"] Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.247051 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vw2p8"] Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.256010 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2ncc9"] Mar 18 16:00:14 crc kubenswrapper[4939]: W0318 16:00:14.260605 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac644e3b_085b_406a_965c_6c68407003e5.slice/crio-c4accad93ff722f7e06096faf34d6598c356e62e80e4cfd4348690873ba3f879 WatchSource:0}: Error finding container c4accad93ff722f7e06096faf34d6598c356e62e80e4cfd4348690873ba3f879: Status 404 returned error can't find the container with id c4accad93ff722f7e06096faf34d6598c356e62e80e4cfd4348690873ba3f879 Mar 18 16:00:14 crc kubenswrapper[4939]: W0318 16:00:14.260889 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f6d01d9_17ba_47c1_8251_3f37cc126f2e.slice/crio-3f4cffc2906c56d42bfde39e60dd77119e68a45422fac60f9e6090638952e3f0 WatchSource:0}: Error finding container 3f4cffc2906c56d42bfde39e60dd77119e68a45422fac60f9e6090638952e3f0: Status 404 returned error can't find the container with id 3f4cffc2906c56d42bfde39e60dd77119e68a45422fac60f9e6090638952e3f0 Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.261875 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86d4-account-create-update-ztlll"] Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.439857 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xwzl9" event={"ID":"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15","Type":"ContainerStarted","Data":"e808f3043f3658457d8c56cf00be5a6d8cb36f7c5acd13d041c82f6da663fc56"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.441731 4939 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-db-create-xwzl9" event={"ID":"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15","Type":"ContainerStarted","Data":"255cf16af4d8fc564881d9f8bac74d39f97abcd9305c248eab3fb68c43afacd0"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.441743 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86d4-account-create-update-ztlll" event={"ID":"5f6d01d9-17ba-47c1-8251-3f37cc126f2e","Type":"ContainerStarted","Data":"3f4cffc2906c56d42bfde39e60dd77119e68a45422fac60f9e6090638952e3f0"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.443565 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-c29v8" event={"ID":"181cba20-17ba-4fdd-9843-e452e9e2cce9","Type":"ContainerStarted","Data":"ece5c49283a682d6e2a0125d0eab1ffd911ea301004194e9df8bdab25de3e56c"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.443611 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-c29v8" event={"ID":"181cba20-17ba-4fdd-9843-e452e9e2cce9","Type":"ContainerStarted","Data":"a27a5f002cb2f4d407fe3c9378e702f0f1d272bd99d8fbd22ed8c7b324f02ee3"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.449002 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vw2p8" event={"ID":"635a2d97-8489-4a21-87ef-a30663aa441e","Type":"ContainerStarted","Data":"22e189f4c28025d48431544cf671813ebaf6560be6dc2b5929bc81ddcad582a9"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.450376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ac46-account-create-update-wf798" event={"ID":"6be63e5e-e444-4c39-adac-35698d2bb045","Type":"ContainerStarted","Data":"bc192bd50e6acea41627a8ae380a74c9af169c2e6131cf648e895f964837b754"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.453152 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2ncc9" event={"ID":"ac644e3b-085b-406a-965c-6c68407003e5","Type":"ContainerStarted","Data":"c4accad93ff722f7e06096faf34d6598c356e62e80e4cfd4348690873ba3f879"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.461375 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-xwzl9" podStartSLOduration=2.4613573300000002 podStartE2EDuration="2.46135733s" podCreationTimestamp="2026-03-18 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:14.454487797 +0000 UTC m=+1379.053675438" watchObservedRunningTime="2026-03-18 16:00:14.46135733 +0000 UTC m=+1379.060544951" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.462029 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-d2bg8" event={"ID":"d1c6bcfa-cf4f-4389-b090-48b9b798fbe5","Type":"ContainerDied","Data":"6a4490c2948a81ac56194edd5e9044723e0bcbac8c143dadd334ed8d8772fb8b"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.462099 4939 scope.go:117] "RemoveContainer" containerID="00cd63fb3658088fecc683853dba0e69afd4b7e3fdc1e25dabea499aedeb4916" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.462310 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-d2bg8" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.471219 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-c29v8" podStartSLOduration=2.471204623 podStartE2EDuration="2.471204623s" podCreationTimestamp="2026-03-18 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:14.466296967 +0000 UTC m=+1379.065484588" watchObservedRunningTime="2026-03-18 16:00:14.471204623 +0000 UTC m=+1379.070392254" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.491284 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-94b1-account-create-update-pv4z6" event={"ID":"e9c18dc2-e9c2-4d01-b076-207c4c21eb12","Type":"ContainerStarted","Data":"41b415a6382e1cdc649dd7cd5f4fb88187f2b722a477e519dd3fb62d57dedb80"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.491328 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-94b1-account-create-update-pv4z6" event={"ID":"e9c18dc2-e9c2-4d01-b076-207c4c21eb12","Type":"ContainerStarted","Data":"c14b68e02bbdfb4b779f924875ad5b35febab56ddfa760e359111ad42d8e8fa4"} Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.493808 4939 scope.go:117] "RemoveContainer" containerID="e05d4d95b37f1971f8f50e3c2124bf55b0dd6aa437c58136721c1a9e30107d40" Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.501167 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-d2bg8"] Mar 18 16:00:14 crc kubenswrapper[4939]: I0318 16:00:14.513797 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-d2bg8"] Mar 18 16:00:15 crc kubenswrapper[4939]: I0318 16:00:15.502716 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2ncc9" event={"ID":"ac644e3b-085b-406a-965c-6c68407003e5","Type":"ContainerStarted","Data":"d2f402963aa9055e0082d68e353fb1235dab137baf36c27cc3d912fdc911ff21"} Mar 18 16:00:15 crc kubenswrapper[4939]: I0318 16:00:15.511689 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86d4-account-create-update-ztlll" event={"ID":"5f6d01d9-17ba-47c1-8251-3f37cc126f2e","Type":"ContainerStarted","Data":"6104ab35790b188976b122aece4eff76014389afebabf2beccd24e6059569113"} Mar 18 16:00:15 crc kubenswrapper[4939]: I0318 16:00:15.515040 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ac46-account-create-update-wf798" event={"ID":"6be63e5e-e444-4c39-adac-35698d2bb045","Type":"ContainerStarted","Data":"c84b03121f75f4cf62c653101d35986e013f48bafadd206d506406493b8e9c90"} Mar 18 16:00:15 crc kubenswrapper[4939]: I0318 16:00:15.558864 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-2ncc9" podStartSLOduration=3.558837348 podStartE2EDuration="3.558837348s" podCreationTimestamp="2026-03-18 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:15.531289206 +0000 UTC m=+1380.130476827" watchObservedRunningTime="2026-03-18 16:00:15.558837348 +0000 UTC m=+1380.158024989" Mar 18 16:00:15 crc kubenswrapper[4939]: I0318 16:00:15.577175 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ac46-account-create-update-wf798" podStartSLOduration=3.5771558949999998 
podStartE2EDuration="3.577155895s" podCreationTimestamp="2026-03-18 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:15.567471114 +0000 UTC m=+1380.166658745" watchObservedRunningTime="2026-03-18 16:00:15.577155895 +0000 UTC m=+1380.176343536" Mar 18 16:00:15 crc kubenswrapper[4939]: I0318 16:00:15.578951 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-86d4-account-create-update-ztlll" podStartSLOduration=3.578943329 podStartE2EDuration="3.578943329s" podCreationTimestamp="2026-03-18 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:15.556880161 +0000 UTC m=+1380.156067802" watchObservedRunningTime="2026-03-18 16:00:15.578943329 +0000 UTC m=+1380.178130960" Mar 18 16:00:15 crc kubenswrapper[4939]: I0318 16:00:15.589593 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-94b1-account-create-update-pv4z6" podStartSLOduration=3.589564062 podStartE2EDuration="3.589564062s" podCreationTimestamp="2026-03-18 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:15.586919626 +0000 UTC m=+1380.186107257" watchObservedRunningTime="2026-03-18 16:00:15.589564062 +0000 UTC m=+1380.188751683" Mar 18 16:00:16 crc kubenswrapper[4939]: I0318 16:00:16.153843 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" path="/var/lib/kubelet/pods/d1c6bcfa-cf4f-4389-b090-48b9b798fbe5/volumes" Mar 18 16:00:16 crc kubenswrapper[4939]: I0318 16:00:16.528837 4939 generic.go:334] "Generic (PLEG): container finished" podID="181cba20-17ba-4fdd-9843-e452e9e2cce9" containerID="ece5c49283a682d6e2a0125d0eab1ffd911ea301004194e9df8bdab25de3e56c" exitCode=0 Mar 18 16:00:16 crc kubenswrapper[4939]: I0318 16:00:16.529062 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-c29v8" event={"ID":"181cba20-17ba-4fdd-9843-e452e9e2cce9","Type":"ContainerDied","Data":"ece5c49283a682d6e2a0125d0eab1ffd911ea301004194e9df8bdab25de3e56c"} Mar 18 16:00:16 crc kubenswrapper[4939]: I0318 16:00:16.536402 4939 generic.go:334] "Generic (PLEG): container finished" podID="ac644e3b-085b-406a-965c-6c68407003e5" containerID="d2f402963aa9055e0082d68e353fb1235dab137baf36c27cc3d912fdc911ff21" exitCode=0 Mar 18 16:00:16 crc kubenswrapper[4939]: I0318 16:00:16.536532 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2ncc9" event={"ID":"ac644e3b-085b-406a-965c-6c68407003e5","Type":"ContainerDied","Data":"d2f402963aa9055e0082d68e353fb1235dab137baf36c27cc3d912fdc911ff21"} Mar 18 16:00:16 crc kubenswrapper[4939]: I0318 16:00:16.540280 4939 generic.go:334] "Generic (PLEG): container finished" podID="ecfc0fae-1947-4c07-9be5-6ce1b49d0d15" containerID="e808f3043f3658457d8c56cf00be5a6d8cb36f7c5acd13d041c82f6da663fc56" exitCode=0 Mar 18 16:00:16 crc kubenswrapper[4939]: I0318 16:00:16.541458 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xwzl9" event={"ID":"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15","Type":"ContainerDied","Data":"e808f3043f3658457d8c56cf00be5a6d8cb36f7c5acd13d041c82f6da663fc56"} Mar 18 16:00:17 crc kubenswrapper[4939]: I0318 16:00:17.556442 4939 generic.go:334] "Generic 
(PLEG): container finished" podID="e9c18dc2-e9c2-4d01-b076-207c4c21eb12" containerID="41b415a6382e1cdc649dd7cd5f4fb88187f2b722a477e519dd3fb62d57dedb80" exitCode=0 Mar 18 16:00:17 crc kubenswrapper[4939]: I0318 16:00:17.556476 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-94b1-account-create-update-pv4z6" event={"ID":"e9c18dc2-e9c2-4d01-b076-207c4c21eb12","Type":"ContainerDied","Data":"41b415a6382e1cdc649dd7cd5f4fb88187f2b722a477e519dd3fb62d57dedb80"} Mar 18 16:00:17 crc kubenswrapper[4939]: I0318 16:00:17.558363 4939 generic.go:334] "Generic (PLEG): container finished" podID="5f6d01d9-17ba-47c1-8251-3f37cc126f2e" containerID="6104ab35790b188976b122aece4eff76014389afebabf2beccd24e6059569113" exitCode=0 Mar 18 16:00:17 crc kubenswrapper[4939]: I0318 16:00:17.558457 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86d4-account-create-update-ztlll" event={"ID":"5f6d01d9-17ba-47c1-8251-3f37cc126f2e","Type":"ContainerDied","Data":"6104ab35790b188976b122aece4eff76014389afebabf2beccd24e6059569113"} Mar 18 16:00:17 crc kubenswrapper[4939]: I0318 16:00:17.560279 4939 generic.go:334] "Generic (PLEG): container finished" podID="6be63e5e-e444-4c39-adac-35698d2bb045" containerID="c84b03121f75f4cf62c653101d35986e013f48bafadd206d506406493b8e9c90" exitCode=0 Mar 18 16:00:17 crc kubenswrapper[4939]: I0318 16:00:17.560369 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ac46-account-create-update-wf798" event={"ID":"6be63e5e-e444-4c39-adac-35698d2bb045","Type":"ContainerDied","Data":"c84b03121f75f4cf62c653101d35986e013f48bafadd206d506406493b8e9c90"} Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.578713 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xwzl9" event={"ID":"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15","Type":"ContainerDied","Data":"255cf16af4d8fc564881d9f8bac74d39f97abcd9305c248eab3fb68c43afacd0"} Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.579163 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="255cf16af4d8fc564881d9f8bac74d39f97abcd9305c248eab3fb68c43afacd0" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.581798 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-c29v8" event={"ID":"181cba20-17ba-4fdd-9843-e452e9e2cce9","Type":"ContainerDied","Data":"a27a5f002cb2f4d407fe3c9378e702f0f1d272bd99d8fbd22ed8c7b324f02ee3"} Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.581827 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a27a5f002cb2f4d407fe3c9378e702f0f1d272bd99d8fbd22ed8c7b324f02ee3" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.583468 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2ncc9" event={"ID":"ac644e3b-085b-406a-965c-6c68407003e5","Type":"ContainerDied","Data":"c4accad93ff722f7e06096faf34d6598c356e62e80e4cfd4348690873ba3f879"} Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.583535 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4accad93ff722f7e06096faf34d6598c356e62e80e4cfd4348690873ba3f879" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.727528 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.743554 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.771598 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.872765 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4b4\" (UniqueName: \"kubernetes.io/projected/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-kube-api-access-jh4b4\") pod \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.872837 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6gdz\" (UniqueName: \"kubernetes.io/projected/181cba20-17ba-4fdd-9843-e452e9e2cce9-kube-api-access-q6gdz\") pod \"181cba20-17ba-4fdd-9843-e452e9e2cce9\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.872890 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-operator-scripts\") pod \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\" (UID: \"ecfc0fae-1947-4c07-9be5-6ce1b49d0d15\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.872932 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181cba20-17ba-4fdd-9843-e452e9e2cce9-operator-scripts\") pod \"181cba20-17ba-4fdd-9843-e452e9e2cce9\" (UID: \"181cba20-17ba-4fdd-9843-e452e9e2cce9\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.873010 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac644e3b-085b-406a-965c-6c68407003e5-operator-scripts\") pod \"ac644e3b-085b-406a-965c-6c68407003e5\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.873036 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnfjp\" (UniqueName: \"kubernetes.io/projected/ac644e3b-085b-406a-965c-6c68407003e5-kube-api-access-xnfjp\") pod \"ac644e3b-085b-406a-965c-6c68407003e5\" (UID: \"ac644e3b-085b-406a-965c-6c68407003e5\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.874244 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecfc0fae-1947-4c07-9be5-6ce1b49d0d15" (UID: "ecfc0fae-1947-4c07-9be5-6ce1b49d0d15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.875155 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac644e3b-085b-406a-965c-6c68407003e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac644e3b-085b-406a-965c-6c68407003e5" (UID: "ac644e3b-085b-406a-965c-6c68407003e5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.875395 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181cba20-17ba-4fdd-9843-e452e9e2cce9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "181cba20-17ba-4fdd-9843-e452e9e2cce9" (UID: "181cba20-17ba-4fdd-9843-e452e9e2cce9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.879132 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac644e3b-085b-406a-965c-6c68407003e5-kube-api-access-xnfjp" (OuterVolumeSpecName: "kube-api-access-xnfjp") pod "ac644e3b-085b-406a-965c-6c68407003e5" (UID: "ac644e3b-085b-406a-965c-6c68407003e5"). InnerVolumeSpecName "kube-api-access-xnfjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.879830 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-kube-api-access-jh4b4" (OuterVolumeSpecName: "kube-api-access-jh4b4") pod "ecfc0fae-1947-4c07-9be5-6ce1b49d0d15" (UID: "ecfc0fae-1947-4c07-9be5-6ce1b49d0d15"). InnerVolumeSpecName "kube-api-access-jh4b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.880991 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181cba20-17ba-4fdd-9843-e452e9e2cce9-kube-api-access-q6gdz" (OuterVolumeSpecName: "kube-api-access-q6gdz") pod "181cba20-17ba-4fdd-9843-e452e9e2cce9" (UID: "181cba20-17ba-4fdd-9843-e452e9e2cce9"). InnerVolumeSpecName "kube-api-access-q6gdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.891350 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.925960 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.949335 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.974110 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x4qb\" (UniqueName: \"kubernetes.io/projected/6be63e5e-e444-4c39-adac-35698d2bb045-kube-api-access-9x4qb\") pod \"6be63e5e-e444-4c39-adac-35698d2bb045\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.974194 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdktf\" (UniqueName: \"kubernetes.io/projected/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-kube-api-access-pdktf\") pod \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\" (UID: \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.974310 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-operator-scripts\") pod \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\" (UID: \"e9c18dc2-e9c2-4d01-b076-207c4c21eb12\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.974541 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be63e5e-e444-4c39-adac-35698d2bb045-operator-scripts\") pod \"6be63e5e-e444-4c39-adac-35698d2bb045\" (UID: \"6be63e5e-e444-4c39-adac-35698d2bb045\") " Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.974985 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac644e3b-085b-406a-965c-6c68407003e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.975008 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnfjp\" (UniqueName: \"kubernetes.io/projected/ac644e3b-085b-406a-965c-6c68407003e5-kube-api-access-xnfjp\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.975021 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4b4\" (UniqueName: \"kubernetes.io/projected/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-kube-api-access-jh4b4\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.975031 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6gdz\" (UniqueName: \"kubernetes.io/projected/181cba20-17ba-4fdd-9843-e452e9e2cce9-kube-api-access-q6gdz\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.975040 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.975050 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181cba20-17ba-4fdd-9843-e452e9e2cce9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.975448 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be63e5e-e444-4c39-adac-35698d2bb045-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6be63e5e-e444-4c39-adac-35698d2bb045" (UID: "6be63e5e-e444-4c39-adac-35698d2bb045"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.978556 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-kube-api-access-pdktf" (OuterVolumeSpecName: "kube-api-access-pdktf") pod "e9c18dc2-e9c2-4d01-b076-207c4c21eb12" (UID: "e9c18dc2-e9c2-4d01-b076-207c4c21eb12"). InnerVolumeSpecName "kube-api-access-pdktf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.978604 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9c18dc2-e9c2-4d01-b076-207c4c21eb12" (UID: "e9c18dc2-e9c2-4d01-b076-207c4c21eb12"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4939]: I0318 16:00:18.986643 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be63e5e-e444-4c39-adac-35698d2bb045-kube-api-access-9x4qb" (OuterVolumeSpecName: "kube-api-access-9x4qb") pod "6be63e5e-e444-4c39-adac-35698d2bb045" (UID: "6be63e5e-e444-4c39-adac-35698d2bb045"). InnerVolumeSpecName "kube-api-access-9x4qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.075824 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-operator-scripts\") pod \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.075905 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlrz4\" (UniqueName: \"kubernetes.io/projected/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-kube-api-access-vlrz4\") pod \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\" (UID: \"5f6d01d9-17ba-47c1-8251-3f37cc126f2e\") " Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.076271 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f6d01d9-17ba-47c1-8251-3f37cc126f2e" (UID: "5f6d01d9-17ba-47c1-8251-3f37cc126f2e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.076480 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be63e5e-e444-4c39-adac-35698d2bb045-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.076555 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x4qb\" (UniqueName: \"kubernetes.io/projected/6be63e5e-e444-4c39-adac-35698d2bb045-kube-api-access-9x4qb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.076570 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdktf\" (UniqueName: \"kubernetes.io/projected/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-kube-api-access-pdktf\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.076580 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.076590 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c18dc2-e9c2-4d01-b076-207c4c21eb12-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.078732 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-kube-api-access-vlrz4" (OuterVolumeSpecName: "kube-api-access-vlrz4") pod "5f6d01d9-17ba-47c1-8251-3f37cc126f2e" (UID: "5f6d01d9-17ba-47c1-8251-3f37cc126f2e"). InnerVolumeSpecName "kube-api-access-vlrz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.178646 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlrz4\" (UniqueName: \"kubernetes.io/projected/5f6d01d9-17ba-47c1-8251-3f37cc126f2e-kube-api-access-vlrz4\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.593371 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-94b1-account-create-update-pv4z6" event={"ID":"e9c18dc2-e9c2-4d01-b076-207c4c21eb12","Type":"ContainerDied","Data":"c14b68e02bbdfb4b779f924875ad5b35febab56ddfa760e359111ad42d8e8fa4"} Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.593674 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14b68e02bbdfb4b779f924875ad5b35febab56ddfa760e359111ad42d8e8fa4" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.593388 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-94b1-account-create-update-pv4z6" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.595427 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86d4-account-create-update-ztlll" event={"ID":"5f6d01d9-17ba-47c1-8251-3f37cc126f2e","Type":"ContainerDied","Data":"3f4cffc2906c56d42bfde39e60dd77119e68a45422fac60f9e6090638952e3f0"} Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.595444 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86d4-account-create-update-ztlll" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.595451 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f4cffc2906c56d42bfde39e60dd77119e68a45422fac60f9e6090638952e3f0" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.597369 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vw2p8" event={"ID":"635a2d97-8489-4a21-87ef-a30663aa441e","Type":"ContainerStarted","Data":"907270ca149a5b20bb690bdfd59ee524001eefc89ea51cbcea166053f5e651ac"} Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.599025 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ac46-account-create-update-wf798" event={"ID":"6be63e5e-e444-4c39-adac-35698d2bb045","Type":"ContainerDied","Data":"bc192bd50e6acea41627a8ae380a74c9af169c2e6131cf648e895f964837b754"} Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.599053 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2ncc9" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.599115 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-c29v8" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.599118 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ac46-account-create-update-wf798" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.599113 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xwzl9" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.599059 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc192bd50e6acea41627a8ae380a74c9af169c2e6131cf648e895f964837b754" Mar 18 16:00:19 crc kubenswrapper[4939]: I0318 16:00:19.644923 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vw2p8" podStartSLOduration=2.314243108 podStartE2EDuration="6.644905008s" podCreationTimestamp="2026-03-18 16:00:13 +0000 UTC" firstStartedPulling="2026-03-18 16:00:14.225629172 +0000 UTC m=+1378.824816793" lastFinishedPulling="2026-03-18 16:00:18.556291072 +0000 UTC m=+1383.155478693" observedRunningTime="2026-03-18 16:00:19.618743876 +0000 UTC m=+1384.217931507" watchObservedRunningTime="2026-03-18 16:00:19.644905008 +0000 UTC m=+1384.244092629" Mar 18 16:00:22 crc kubenswrapper[4939]: I0318 16:00:22.630658 4939 generic.go:334] "Generic (PLEG): container finished" podID="635a2d97-8489-4a21-87ef-a30663aa441e" containerID="907270ca149a5b20bb690bdfd59ee524001eefc89ea51cbcea166053f5e651ac" exitCode=0 Mar 18 16:00:22 crc kubenswrapper[4939]: I0318 16:00:22.630762 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vw2p8" event={"ID":"635a2d97-8489-4a21-87ef-a30663aa441e","Type":"ContainerDied","Data":"907270ca149a5b20bb690bdfd59ee524001eefc89ea51cbcea166053f5e651ac"} Mar 18 16:00:23 crc kubenswrapper[4939]: I0318 16:00:23.640430 4939 generic.go:334] "Generic (PLEG): container finished" podID="59e6de1a-22a0-4166-9bf0-f8844e3e89c2" containerID="4e7870f3e02bd7c6cf97984679ad07a4746c1bbcf22555811321ad8627e7b595" exitCode=0 Mar 18 16:00:23 crc kubenswrapper[4939]: I0318 16:00:23.640534 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sss6f" 
event={"ID":"59e6de1a-22a0-4166-9bf0-f8844e3e89c2","Type":"ContainerDied","Data":"4e7870f3e02bd7c6cf97984679ad07a4746c1bbcf22555811321ad8627e7b595"} Mar 18 16:00:23 crc kubenswrapper[4939]: I0318 16:00:23.939437 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.060111 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5z2c\" (UniqueName: \"kubernetes.io/projected/635a2d97-8489-4a21-87ef-a30663aa441e-kube-api-access-q5z2c\") pod \"635a2d97-8489-4a21-87ef-a30663aa441e\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.060256 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-config-data\") pod \"635a2d97-8489-4a21-87ef-a30663aa441e\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.060300 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-combined-ca-bundle\") pod \"635a2d97-8489-4a21-87ef-a30663aa441e\" (UID: \"635a2d97-8489-4a21-87ef-a30663aa441e\") " Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.065965 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635a2d97-8489-4a21-87ef-a30663aa441e-kube-api-access-q5z2c" (OuterVolumeSpecName: "kube-api-access-q5z2c") pod "635a2d97-8489-4a21-87ef-a30663aa441e" (UID: "635a2d97-8489-4a21-87ef-a30663aa441e"). InnerVolumeSpecName "kube-api-access-q5z2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.085681 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "635a2d97-8489-4a21-87ef-a30663aa441e" (UID: "635a2d97-8489-4a21-87ef-a30663aa441e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.103020 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-config-data" (OuterVolumeSpecName: "config-data") pod "635a2d97-8489-4a21-87ef-a30663aa441e" (UID: "635a2d97-8489-4a21-87ef-a30663aa441e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.162264 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5z2c\" (UniqueName: \"kubernetes.io/projected/635a2d97-8489-4a21-87ef-a30663aa441e-kube-api-access-q5z2c\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.162297 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.162310 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635a2d97-8489-4a21-87ef-a30663aa441e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.650906 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vw2p8" event={"ID":"635a2d97-8489-4a21-87ef-a30663aa441e","Type":"ContainerDied","Data":"22e189f4c28025d48431544cf671813ebaf6560be6dc2b5929bc81ddcad582a9"} Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.651208 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22e189f4c28025d48431544cf671813ebaf6560be6dc2b5929bc81ddcad582a9" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.650937 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vw2p8" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.897950 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8bs62"] Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898290 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac644e3b-085b-406a-965c-6c68407003e5" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898307 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac644e3b-085b-406a-965c-6c68407003e5" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898319 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be63e5e-e444-4c39-adac-35698d2bb045" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898325 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be63e5e-e444-4c39-adac-35698d2bb045" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898337 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635a2d97-8489-4a21-87ef-a30663aa441e" containerName="keystone-db-sync" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898343 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="635a2d97-8489-4a21-87ef-a30663aa441e" containerName="keystone-db-sync" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898354 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c18dc2-e9c2-4d01-b076-207c4c21eb12" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898361 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c18dc2-e9c2-4d01-b076-207c4c21eb12" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898370 4939 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ecfc0fae-1947-4c07-9be5-6ce1b49d0d15" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898375 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfc0fae-1947-4c07-9be5-6ce1b49d0d15" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898388 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerName="init" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898394 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerName="init" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898402 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181cba20-17ba-4fdd-9843-e452e9e2cce9" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898408 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="181cba20-17ba-4fdd-9843-e452e9e2cce9" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898415 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6d01d9-17ba-47c1-8251-3f37cc126f2e" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898421 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6d01d9-17ba-47c1-8251-3f37cc126f2e" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: E0318 16:00:24.898437 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerName="dnsmasq-dns" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898443 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerName="dnsmasq-dns" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898634 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c18dc2-e9c2-4d01-b076-207c4c21eb12" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898647 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c6bcfa-cf4f-4389-b090-48b9b798fbe5" containerName="dnsmasq-dns" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898658 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be63e5e-e444-4c39-adac-35698d2bb045" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898667 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac644e3b-085b-406a-965c-6c68407003e5" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898679 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="181cba20-17ba-4fdd-9843-e452e9e2cce9" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898685 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfc0fae-1947-4c07-9be5-6ce1b49d0d15" containerName="mariadb-database-create" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898692 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f6d01d9-17ba-47c1-8251-3f37cc126f2e" containerName="mariadb-account-create-update" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.898698 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="635a2d97-8489-4a21-87ef-a30663aa441e" 
containerName="keystone-db-sync" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.899470 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.912617 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8bs62"] Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.960233 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mnbf9"] Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.961564 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.963938 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5prk2" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.964495 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.964566 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.964661 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.964917 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.981898 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znvgh\" (UniqueName: \"kubernetes.io/projected/cd5111f3-3624-4854-9e57-c5b469d0f8bd-kube-api-access-znvgh\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.981989 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.982033 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.982105 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-config\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.982163 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " 
pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.982185 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-svc\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:24 crc kubenswrapper[4939]: I0318 16:00:24.982996 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mnbf9"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.084846 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znvgh\" (UniqueName: \"kubernetes.io/projected/cd5111f3-3624-4854-9e57-c5b469d0f8bd-kube-api-access-znvgh\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.084922 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-combined-ca-bundle\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.084949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-credential-keys\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085006 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085066 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4pd4\" (UniqueName: \"kubernetes.io/projected/ed370de0-3dea-4bd6-aea2-6de62f548683-kube-api-access-r4pd4\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085094 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085120 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-fernet-keys\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085156 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-config-data\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085198 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-config\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085223 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-scripts\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085270 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.085295 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-svc\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.086151 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.086210 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-svc\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.087049 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.087047 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.087228 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-config\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: 
\"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.108418 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znvgh\" (UniqueName: \"kubernetes.io/projected/cd5111f3-3624-4854-9e57-c5b469d0f8bd-kube-api-access-znvgh\") pod \"dnsmasq-dns-5959f8865f-8bs62\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.151738 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.153858 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.161397 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.163394 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pxrk6"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.164647 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.180369 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pxrk6"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.187322 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-fernet-keys\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.187397 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-config-data\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.187455 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-scripts\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.187565 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-combined-ca-bundle\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.187593 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-credential-keys\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.187682 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4pd4\" (UniqueName: 
\"kubernetes.io/projected/ed370de0-3dea-4bd6-aea2-6de62f548683-kube-api-access-r4pd4\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.194570 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-scripts\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.200177 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.204048 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.204459 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-config-data\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.204613 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2vzwd" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.204829 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.205136 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.210437 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-combined-ca-bundle\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.216644 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.220252 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-credential-keys\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.220275 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-fernet-keys\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.241092 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4pd4\" (UniqueName: \"kubernetes.io/projected/ed370de0-3dea-4bd6-aea2-6de62f548683-kube-api-access-r4pd4\") pod \"keystone-bootstrap-mnbf9\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.280961 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.283408 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qkn72"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.284606 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.286378 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.286866 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zdtsm" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.287030 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293173 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293241 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-config\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293303 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-run-httpd\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293331 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9zhl\" (UniqueName: \"kubernetes.io/projected/ea13ebc5-52f6-4371-9261-92ebd07f0663-kube-api-access-j9zhl\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293376 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-scripts\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293463 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6rs4\" (UniqueName: \"kubernetes.io/projected/10912abd-378d-4dd0-abf1-092a5e7d7043-kube-api-access-z6rs4\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293533 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-log-httpd\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 
crc kubenswrapper[4939]: I0318 16:00:25.293580 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-combined-ca-bundle\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293680 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.293711 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-config-data\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.374918 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qkn72"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.388822 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-qvp84"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.390198 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.391234 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sss6f" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.395357 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.395552 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-98jnl" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398128 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed545871-ed70-4d38-830a-8a6131455769-etc-machine-id\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398174 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6rs4\" (UniqueName: \"kubernetes.io/projected/10912abd-378d-4dd0-abf1-092a5e7d7043-kube-api-access-z6rs4\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398208 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-log-httpd\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398236 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-combined-ca-bundle\") pod 
\"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398253 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-combined-ca-bundle\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398288 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-scripts\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398317 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398337 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-config-data\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398361 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-db-sync-config-data\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398393 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398528 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-config\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398551 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2ffv\" (UniqueName: \"kubernetes.io/projected/ed545871-ed70-4d38-830a-8a6131455769-kube-api-access-k2ffv\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398578 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-run-httpd\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398621 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9zhl\" (UniqueName: \"kubernetes.io/projected/ea13ebc5-52f6-4371-9261-92ebd07f0663-kube-api-access-j9zhl\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398649 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-scripts\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.398696 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-config-data\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.402665 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-log-httpd\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.411323 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-run-httpd\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.414768 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.415655 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-combined-ca-bundle\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.415685 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-config\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.418866 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-scripts\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.419034 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.419914 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-config-data\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.425469 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ms4t8"] Mar 18 16:00:25 crc kubenswrapper[4939]: E0318 16:00:25.425938 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e6de1a-22a0-4166-9bf0-f8844e3e89c2" containerName="glance-db-sync" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.425963 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e6de1a-22a0-4166-9bf0-f8844e3e89c2" containerName="glance-db-sync" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.426203 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e6de1a-22a0-4166-9bf0-f8844e3e89c2" containerName="glance-db-sync" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.426761 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.436144 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.436304 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.436397 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sgjh9" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.443836 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6rs4\" (UniqueName: \"kubernetes.io/projected/10912abd-378d-4dd0-abf1-092a5e7d7043-kube-api-access-z6rs4\") pod \"neutron-db-sync-pxrk6\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.446634 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9zhl\" (UniqueName: \"kubernetes.io/projected/ea13ebc5-52f6-4371-9261-92ebd07f0663-kube-api-access-j9zhl\") pod \"ceilometer-0\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.446712 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qvp84"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.458779 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.479887 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ms4t8"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.487777 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8bs62"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.499493 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-config-data\") pod \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.499635 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-combined-ca-bundle\") pod \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.499726 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-db-sync-config-data\") pod \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.499799 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwlx2\" (UniqueName: \"kubernetes.io/projected/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-kube-api-access-fwlx2\") pod \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\" (UID: \"59e6de1a-22a0-4166-9bf0-f8844e3e89c2\") " Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500099 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-combined-ca-bundle\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500133 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-combined-ca-bundle\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500169 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-combined-ca-bundle\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500195 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zglw8\" (UniqueName: \"kubernetes.io/projected/0ca6e14d-75ec-40af-9670-c413af1391df-kube-api-access-zglw8\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500226 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-db-sync-config-data\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500250 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-logs\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500276 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-scripts\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500362 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-db-sync-config-data\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500411 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8wr\" (UniqueName: \"kubernetes.io/projected/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-kube-api-access-hx8wr\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500439 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-scripts\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500491 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2ffv\" (UniqueName: \"kubernetes.io/projected/ed545871-ed70-4d38-830a-8a6131455769-kube-api-access-k2ffv\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500557 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-config-data\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500609 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-config-data\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500643 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/ed545871-ed70-4d38-830a-8a6131455769-etc-machine-id\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.500768 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed545871-ed70-4d38-830a-8a6131455769-etc-machine-id\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.506189 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "59e6de1a-22a0-4166-9bf0-f8844e3e89c2" (UID: "59e6de1a-22a0-4166-9bf0-f8844e3e89c2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.508841 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-bkf6s"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.510302 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.512538 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-config-data\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.514181 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-combined-ca-bundle\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.514614 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-db-sync-config-data\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.516789 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-scripts\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.518597 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-bkf6s"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.527560 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-kube-api-access-fwlx2" (OuterVolumeSpecName: "kube-api-access-fwlx2") pod "59e6de1a-22a0-4166-9bf0-f8844e3e89c2" (UID: "59e6de1a-22a0-4166-9bf0-f8844e3e89c2"). InnerVolumeSpecName "kube-api-access-fwlx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.531137 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2ffv\" (UniqueName: \"kubernetes.io/projected/ed545871-ed70-4d38-830a-8a6131455769-kube-api-access-k2ffv\") pod \"cinder-db-sync-qkn72\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.561854 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59e6de1a-22a0-4166-9bf0-f8844e3e89c2" (UID: "59e6de1a-22a0-4166-9bf0-f8844e3e89c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.595598 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-config-data" (OuterVolumeSpecName: "config-data") pod "59e6de1a-22a0-4166-9bf0-f8844e3e89c2" (UID: "59e6de1a-22a0-4166-9bf0-f8844e3e89c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.602609 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.602693 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.602988 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-config-data\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603022 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-config\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603131 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-combined-ca-bundle\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603170 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-combined-ca-bundle\") pod 
\"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603196 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zglw8\" (UniqueName: \"kubernetes.io/projected/0ca6e14d-75ec-40af-9670-c413af1391df-kube-api-access-zglw8\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603221 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603247 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-db-sync-config-data\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603269 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-logs\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603304 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rrs\" (UniqueName: \"kubernetes.io/projected/2221cdac-d183-4363-ba92-a2de3f333b1d-kube-api-access-j4rrs\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603342 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603400 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8wr\" (UniqueName: \"kubernetes.io/projected/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-kube-api-access-hx8wr\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603425 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-scripts\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603479 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603495 
4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603528 4939 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.603540 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwlx2\" (UniqueName: \"kubernetes.io/projected/59e6de1a-22a0-4166-9bf0-f8844e3e89c2-kube-api-access-fwlx2\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.605572 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-logs\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.606800 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-config-data\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.621283 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-db-sync-config-data\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.621439 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-combined-ca-bundle\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.621531 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-scripts\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.623945 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-combined-ca-bundle\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.626373 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8wr\" (UniqueName: \"kubernetes.io/projected/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-kube-api-access-hx8wr\") pod \"placement-db-sync-ms4t8\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.628099 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zglw8\" (UniqueName: 
\"kubernetes.io/projected/0ca6e14d-75ec-40af-9670-c413af1391df-kube-api-access-zglw8\") pod \"barbican-db-sync-qvp84\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") " pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.664145 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sss6f" event={"ID":"59e6de1a-22a0-4166-9bf0-f8844e3e89c2","Type":"ContainerDied","Data":"24edcccb4b25dd53d090b5dbe73a12203a3ca8da4863ad57321499b7cecf3d7d"} Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.664187 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24edcccb4b25dd53d090b5dbe73a12203a3ca8da4863ad57321499b7cecf3d7d" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.664239 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sss6f" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.705033 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rrs\" (UniqueName: \"kubernetes.io/projected/2221cdac-d183-4363-ba92-a2de3f333b1d-kube-api-access-j4rrs\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.705087 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.705134 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.705152 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.705196 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-config\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.705258 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.706067 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: 
\"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.706989 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.707557 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.708069 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.708381 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-config\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.731695 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.732981 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rrs\" (UniqueName: \"kubernetes.io/projected/2221cdac-d183-4363-ba92-a2de3f333b1d-kube-api-access-j4rrs\") pod \"dnsmasq-dns-58dd9ff6bc-bkf6s\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.785859 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qkn72" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.811540 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qvp84" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.813343 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.856070 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.907380 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8bs62"] Mar 18 16:00:25 crc kubenswrapper[4939]: I0318 16:00:25.993117 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-bkf6s"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.013877 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kx7lw"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.036565 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kx7lw"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.036758 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.076588 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mnbf9"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.125241 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96hf2\" (UniqueName: \"kubernetes.io/projected/2695722d-fc7c-4965-ae10-f20e016051d3-kube-api-access-96hf2\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.125288 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.125397 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.125427 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-config\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.125465 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.125517 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: 
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.197238 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pxrk6"]
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.227484 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-config\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.227554 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.227596 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.227660 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96hf2\" (UniqueName: \"kubernetes.io/projected/2695722d-fc7c-4965-ae10-f20e016051d3-kube-api-access-96hf2\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.235312 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.235850 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.229554 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.228610 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.228784 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-config\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") "
pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.237285 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.246084 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.278055 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96hf2\" (UniqueName: \"kubernetes.io/projected/2695722d-fc7c-4965-ae10-f20e016051d3-kube-api-access-96hf2\") pod \"dnsmasq-dns-785d8bcb8c-kx7lw\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.353971 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:26 crc kubenswrapper[4939]: W0318 16:00:26.366966 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea13ebc5_52f6_4371_9261_92ebd07f0663.slice/crio-e3a9ed5e21e7fc313dbcb5d5cb0946fa68c34884d2b5d6b093e1e03958b33699 WatchSource:0}: Error finding container e3a9ed5e21e7fc313dbcb5d5cb0946fa68c34884d2b5d6b093e1e03958b33699: Status 404 returned error can't find the container with id e3a9ed5e21e7fc313dbcb5d5cb0946fa68c34884d2b5d6b093e1e03958b33699 Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.397396 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.528930 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qvp84"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.620057 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ms4t8"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.627362 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qkn72"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.689343 4939 generic.go:334] "Generic (PLEG): container finished" podID="cd5111f3-3624-4854-9e57-c5b469d0f8bd" containerID="a1515ec4b6f6d8de8b38eaa2ff6fa53b1565199a758deb7fbb2e0d7967bfa7d3" exitCode=0 Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.689393 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-8bs62" event={"ID":"cd5111f3-3624-4854-9e57-c5b469d0f8bd","Type":"ContainerDied","Data":"a1515ec4b6f6d8de8b38eaa2ff6fa53b1565199a758deb7fbb2e0d7967bfa7d3"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.689418 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-8bs62" event={"ID":"cd5111f3-3624-4854-9e57-c5b469d0f8bd","Type":"ContainerStarted","Data":"eb4975c10092718de854898b048c339b134b9050bee891ceac7f872b5786bd1b"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.700475 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkn72" event={"ID":"ed545871-ed70-4d38-830a-8a6131455769","Type":"ContainerStarted","Data":"2485534b9efbd45946c75134fd955acfc10aeb5318dfb04e27e2538e9fc40655"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.704018 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerStarted","Data":"e3a9ed5e21e7fc313dbcb5d5cb0946fa68c34884d2b5d6b093e1e03958b33699"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.720270 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-bkf6s"] Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.720529 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ms4t8" event={"ID":"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86","Type":"ContainerStarted","Data":"d1682b569ffbc9039ec9e59db7dd07e1590a1f31aeb36ad596bc6d7614ce2355"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.731231 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pxrk6" event={"ID":"10912abd-378d-4dd0-abf1-092a5e7d7043","Type":"ContainerStarted","Data":"18de2267ef895322ef9b35671a21963af69c81f3a74b753b0ee5412e9a974231"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.731270 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pxrk6" event={"ID":"10912abd-378d-4dd0-abf1-092a5e7d7043","Type":"ContainerStarted","Data":"a292f0588022315ec3799772c7c47204c2d8e77d82f46c79ae7339eca7ea21c3"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.735620 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mnbf9" event={"ID":"ed370de0-3dea-4bd6-aea2-6de62f548683","Type":"ContainerStarted","Data":"39da98db91664b5c7f2fea44fad6849e5c9b51f8e829793975aadf713831a221"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.735656 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-mnbf9" event={"ID":"ed370de0-3dea-4bd6-aea2-6de62f548683","Type":"ContainerStarted","Data":"2aa4fe72f2d20469a342e094835d40998f83ea85e6a5b0e6db200850497ba013"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.741118 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qvp84" event={"ID":"0ca6e14d-75ec-40af-9670-c413af1391df","Type":"ContainerStarted","Data":"067367095e2219c250ba9054b67c161991e660a981b9519d856d27fe734f79ac"} Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.747341 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pxrk6" podStartSLOduration=1.747327439 podStartE2EDuration="1.747327439s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:26.744554972 +0000 UTC m=+1391.343742633" watchObservedRunningTime="2026-03-18 16:00:26.747327439 +0000 UTC m=+1391.346515060" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.794202 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mnbf9" podStartSLOduration=2.794180291 podStartE2EDuration="2.794180291s" podCreationTimestamp="2026-03-18 16:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:26.773682515 +0000 UTC m=+1391.372870156" watchObservedRunningTime="2026-03-18 16:00:26.794180291 +0000 UTC m=+1391.393367912" Mar 18 16:00:26 crc kubenswrapper[4939]: I0318 16:00:26.929590 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kx7lw"] Mar 18 16:00:26 crc kubenswrapper[4939]: W0318 16:00:26.967889 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2695722d_fc7c_4965_ae10_f20e016051d3.slice/crio-0bbbdf779f5a4b3cbd40fccdfd67b1e473397c9c78c9b427644367e365eecaf7 WatchSource:0}: Error finding container 0bbbdf779f5a4b3cbd40fccdfd67b1e473397c9c78c9b427644367e365eecaf7: Status 404 returned error can't find the container with id 0bbbdf779f5a4b3cbd40fccdfd67b1e473397c9c78c9b427644367e365eecaf7 Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.008035 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.015272 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.018750 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7rbvp" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.019300 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.023227 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.039681 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.053936 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.163827 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-svc\") pod \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164234 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-config\") pod \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-sb\") pod \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164539 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znvgh\" (UniqueName: \"kubernetes.io/projected/cd5111f3-3624-4854-9e57-c5b469d0f8bd-kube-api-access-znvgh\") pod \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164569 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-swift-storage-0\") pod \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164617 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-nb\") pod \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\" (UID: \"cd5111f3-3624-4854-9e57-c5b469d0f8bd\") " Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164877 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-logs\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164915 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.164985 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.165014 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.165042 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.165090 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8bn\" (UniqueName: \"kubernetes.io/projected/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-kube-api-access-vg8bn\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.165143 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.193831 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5111f3-3624-4854-9e57-c5b469d0f8bd-kube-api-access-znvgh" (OuterVolumeSpecName: "kube-api-access-znvgh") pod "cd5111f3-3624-4854-9e57-c5b469d0f8bd" (UID: "cd5111f3-3624-4854-9e57-c5b469d0f8bd"). InnerVolumeSpecName "kube-api-access-znvgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.210105 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd5111f3-3624-4854-9e57-c5b469d0f8bd" (UID: "cd5111f3-3624-4854-9e57-c5b469d0f8bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.214727 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:27 crc kubenswrapper[4939]: E0318 16:00:27.215122 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5111f3-3624-4854-9e57-c5b469d0f8bd" containerName="init" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.215138 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5111f3-3624-4854-9e57-c5b469d0f8bd" containerName="init" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.216889 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5111f3-3624-4854-9e57-c5b469d0f8bd" containerName="init" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.217987 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.220831 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.222842 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-config" (OuterVolumeSpecName: "config") pod "cd5111f3-3624-4854-9e57-c5b469d0f8bd" (UID: "cd5111f3-3624-4854-9e57-c5b469d0f8bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.226853 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd5111f3-3624-4854-9e57-c5b469d0f8bd" (UID: "cd5111f3-3624-4854-9e57-c5b469d0f8bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.227808 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.244367 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd5111f3-3624-4854-9e57-c5b469d0f8bd" (UID: "cd5111f3-3624-4854-9e57-c5b469d0f8bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.245227 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cd5111f3-3624-4854-9e57-c5b469d0f8bd" (UID: "cd5111f3-3624-4854-9e57-c5b469d0f8bd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.273627 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.273717 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.273814 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8bn\" (UniqueName: \"kubernetes.io/projected/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-kube-api-access-vg8bn\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.273915 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274087 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-logs\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274128 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274255 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274327 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znvgh\" (UniqueName: \"kubernetes.io/projected/cd5111f3-3624-4854-9e57-c5b469d0f8bd-kube-api-access-znvgh\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274345 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274359 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-nb\") on node \"crc\" DevicePath 
\"\"" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274372 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274384 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274394 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd5111f3-3624-4854-9e57-c5b469d0f8bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.274728 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.276613 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-logs\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.279157 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.283918 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.284336 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.297915 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8bn\" (UniqueName: \"kubernetes.io/projected/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-kube-api-access-vg8bn\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.309330 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.324901 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.353226 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.376134 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.376205 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.376226 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.376259 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-logs\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.376287 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.377719 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5nlg\" (UniqueName: \"kubernetes.io/projected/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-kube-api-access-n5nlg\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.377842 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480032 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5nlg\" (UniqueName: \"kubernetes.io/projected/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-kube-api-access-n5nlg\") pod 
\"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480092 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480146 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480171 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480193 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480215 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-logs\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480231 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.480357 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.482164 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.482970 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-logs\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 
16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.496662 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.527438 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.537868 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.539290 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5nlg\" (UniqueName: \"kubernetes.io/projected/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-kube-api-access-n5nlg\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.553554 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.704582 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.758917 4939 generic.go:334] "Generic (PLEG): container finished" podID="2221cdac-d183-4363-ba92-a2de3f333b1d" containerID="dadd80ebd4d705ba1d6796292f7e6f66d11f6cedc0e6123ef23127da2ab4c72a" exitCode=0 Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.758993 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" event={"ID":"2221cdac-d183-4363-ba92-a2de3f333b1d","Type":"ContainerDied","Data":"dadd80ebd4d705ba1d6796292f7e6f66d11f6cedc0e6123ef23127da2ab4c72a"} Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.759040 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" event={"ID":"2221cdac-d183-4363-ba92-a2de3f333b1d","Type":"ContainerStarted","Data":"c1b75eb0e416d02b801f85cdbb2cb3e09ba59c71278a5de3333ce4c41e07c35f"} Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.773368 4939 generic.go:334] "Generic (PLEG): container finished" podID="2695722d-fc7c-4965-ae10-f20e016051d3" containerID="fac5b8602469e3a90d113bb1fb4bdc434c1e3ba2c6a1cd7281dc6f7d8c39ae8d" exitCode=0 Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.773416 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" event={"ID":"2695722d-fc7c-4965-ae10-f20e016051d3","Type":"ContainerDied","Data":"fac5b8602469e3a90d113bb1fb4bdc434c1e3ba2c6a1cd7281dc6f7d8c39ae8d"} Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.773467 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" event={"ID":"2695722d-fc7c-4965-ae10-f20e016051d3","Type":"ContainerStarted","Data":"0bbbdf779f5a4b3cbd40fccdfd67b1e473397c9c78c9b427644367e365eecaf7"} Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.776314 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-8bs62" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.776630 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-8bs62" event={"ID":"cd5111f3-3624-4854-9e57-c5b469d0f8bd","Type":"ContainerDied","Data":"eb4975c10092718de854898b048c339b134b9050bee891ceac7f872b5786bd1b"} Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.776670 4939 scope.go:117] "RemoveContainer" containerID="a1515ec4b6f6d8de8b38eaa2ff6fa53b1565199a758deb7fbb2e0d7967bfa7d3" Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.893384 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8bs62"] Mar 18 16:00:27 crc kubenswrapper[4939]: I0318 16:00:27.902941 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-8bs62"] Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.182623 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5111f3-3624-4854-9e57-c5b469d0f8bd" path="/var/lib/kubelet/pods/cd5111f3-3624-4854-9e57-c5b469d0f8bd/volumes" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.184685 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.257982 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.350439 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.388467 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.403698 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-svc\") pod \"2221cdac-d183-4363-ba92-a2de3f333b1d\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.403926 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-swift-storage-0\") pod \"2221cdac-d183-4363-ba92-a2de3f333b1d\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.404034 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4rrs\" (UniqueName: \"kubernetes.io/projected/2221cdac-d183-4363-ba92-a2de3f333b1d-kube-api-access-j4rrs\") pod \"2221cdac-d183-4363-ba92-a2de3f333b1d\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.404367 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-sb\") pod \"2221cdac-d183-4363-ba92-a2de3f333b1d\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.404435 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-nb\") pod \"2221cdac-d183-4363-ba92-a2de3f333b1d\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.404457 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-config\") pod \"2221cdac-d183-4363-ba92-a2de3f333b1d\" (UID: \"2221cdac-d183-4363-ba92-a2de3f333b1d\") " Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.431405 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2221cdac-d183-4363-ba92-a2de3f333b1d-kube-api-access-j4rrs" (OuterVolumeSpecName: "kube-api-access-j4rrs") pod "2221cdac-d183-4363-ba92-a2de3f333b1d" (UID: "2221cdac-d183-4363-ba92-a2de3f333b1d"). InnerVolumeSpecName "kube-api-access-j4rrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.449616 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2221cdac-d183-4363-ba92-a2de3f333b1d" (UID: "2221cdac-d183-4363-ba92-a2de3f333b1d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.450856 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.450893 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2221cdac-d183-4363-ba92-a2de3f333b1d" (UID: "2221cdac-d183-4363-ba92-a2de3f333b1d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.457194 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2221cdac-d183-4363-ba92-a2de3f333b1d" (UID: "2221cdac-d183-4363-ba92-a2de3f333b1d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.457300 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-config" (OuterVolumeSpecName: "config") pod "2221cdac-d183-4363-ba92-a2de3f333b1d" (UID: "2221cdac-d183-4363-ba92-a2de3f333b1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.474546 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2221cdac-d183-4363-ba92-a2de3f333b1d" (UID: "2221cdac-d183-4363-ba92-a2de3f333b1d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.507491 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.507542 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4rrs\" (UniqueName: \"kubernetes.io/projected/2221cdac-d183-4363-ba92-a2de3f333b1d-kube-api-access-j4rrs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.507553 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.507562 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.507574 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.507584 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2221cdac-d183-4363-ba92-a2de3f333b1d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.555317 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.830545 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" event={"ID":"2695722d-fc7c-4965-ae10-f20e016051d3","Type":"ContainerStarted","Data":"bab1bdac7d712472395e845130fa5c2396afaa42a2d98b888bbd0dcfa6ba492d"} Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.832024 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.835136 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b48cae53-0eb8-42aa-a730-98d9e33bcbe4","Type":"ContainerStarted","Data":"474787de23028704e52ac307b00c5f09fad97982468906cd4420034c5e3a67b7"} Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.860885 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" podStartSLOduration=3.860866606 podStartE2EDuration="3.860866606s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:28.855135569 +0000 UTC m=+1393.454323190" watchObservedRunningTime="2026-03-18 16:00:28.860866606 +0000 UTC m=+1393.460054227" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.868458 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" event={"ID":"2221cdac-d183-4363-ba92-a2de3f333b1d","Type":"ContainerDied","Data":"c1b75eb0e416d02b801f85cdbb2cb3e09ba59c71278a5de3333ce4c41e07c35f"} Mar 18 16:00:28 crc kubenswrapper[4939]: 
I0318 16:00:28.868601 4939 scope.go:117] "RemoveContainer" containerID="dadd80ebd4d705ba1d6796292f7e6f66d11f6cedc0e6123ef23127da2ab4c72a" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.868702 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-bkf6s" Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.872775 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0","Type":"ContainerStarted","Data":"b90927873175c61455f23aa06876b20cf0281afe9202f8841567de0e955ea979"} Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.954457 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-bkf6s"] Mar 18 16:00:28 crc kubenswrapper[4939]: I0318 16:00:28.974603 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-bkf6s"] Mar 18 16:00:29 crc kubenswrapper[4939]: I0318 16:00:29.903956 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b48cae53-0eb8-42aa-a730-98d9e33bcbe4","Type":"ContainerStarted","Data":"80eb66bc6ba29abe33932e55b6b91293422afcddc11a417a80e0c01e5ab3dba3"} Mar 18 16:00:29 crc kubenswrapper[4939]: I0318 16:00:29.911934 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0","Type":"ContainerStarted","Data":"42ee2127bc503184af53da0c011f66601403c9535b1c488d82d0383514bb55c9"} Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.154543 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2221cdac-d183-4363-ba92-a2de3f333b1d" path="/var/lib/kubelet/pods/2221cdac-d183-4363-ba92-a2de3f333b1d/volumes" Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.932933 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0","Type":"ContainerStarted","Data":"f7a8f72333780ba33a4075f66ecebd7a943df98d8425f1b2df867155f003854e"} Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.933089 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-log" containerID="cri-o://42ee2127bc503184af53da0c011f66601403c9535b1c488d82d0383514bb55c9" gracePeriod=30 Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.933376 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-httpd" containerID="cri-o://f7a8f72333780ba33a4075f66ecebd7a943df98d8425f1b2df867155f003854e" gracePeriod=30 Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.936078 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b48cae53-0eb8-42aa-a730-98d9e33bcbe4","Type":"ContainerStarted","Data":"fd2ba35cca3c4a50e7b30e58a247fc501b09d30b33f3afecf7f97d78f769408b"} Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.936400 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-httpd" containerID="cri-o://fd2ba35cca3c4a50e7b30e58a247fc501b09d30b33f3afecf7f97d78f769408b" 
gracePeriod=30 Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.936680 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-log" containerID="cri-o://80eb66bc6ba29abe33932e55b6b91293422afcddc11a417a80e0c01e5ab3dba3" gracePeriod=30 Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.985589 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.985566373 podStartE2EDuration="5.985566373s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:30.956833485 +0000 UTC m=+1395.556021096" watchObservedRunningTime="2026-03-18 16:00:30.985566373 +0000 UTC m=+1395.584754004" Mar 18 16:00:30 crc kubenswrapper[4939]: I0318 16:00:30.996970 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.996946926 podStartE2EDuration="4.996946926s" podCreationTimestamp="2026-03-18 16:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:30.983446134 +0000 UTC m=+1395.582633755" watchObservedRunningTime="2026-03-18 16:00:30.996946926 +0000 UTC m=+1395.596134547" Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.963062 4939 generic.go:334] "Generic (PLEG): container finished" podID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerID="f7a8f72333780ba33a4075f66ecebd7a943df98d8425f1b2df867155f003854e" exitCode=0 Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.963608 4939 generic.go:334] "Generic (PLEG): container finished" podID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerID="42ee2127bc503184af53da0c011f66601403c9535b1c488d82d0383514bb55c9" exitCode=143 Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.963142 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0","Type":"ContainerDied","Data":"f7a8f72333780ba33a4075f66ecebd7a943df98d8425f1b2df867155f003854e"} Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.963701 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0","Type":"ContainerDied","Data":"42ee2127bc503184af53da0c011f66601403c9535b1c488d82d0383514bb55c9"} Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.966496 4939 generic.go:334] "Generic (PLEG): container finished" podID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerID="fd2ba35cca3c4a50e7b30e58a247fc501b09d30b33f3afecf7f97d78f769408b" exitCode=0 Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.966562 4939 generic.go:334] "Generic (PLEG): container finished" podID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerID="80eb66bc6ba29abe33932e55b6b91293422afcddc11a417a80e0c01e5ab3dba3" exitCode=143 Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.966636 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b48cae53-0eb8-42aa-a730-98d9e33bcbe4","Type":"ContainerDied","Data":"fd2ba35cca3c4a50e7b30e58a247fc501b09d30b33f3afecf7f97d78f769408b"} Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.966678 4939 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b48cae53-0eb8-42aa-a730-98d9e33bcbe4","Type":"ContainerDied","Data":"80eb66bc6ba29abe33932e55b6b91293422afcddc11a417a80e0c01e5ab3dba3"} Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.968949 4939 generic.go:334] "Generic (PLEG): container finished" podID="ed370de0-3dea-4bd6-aea2-6de62f548683" containerID="39da98db91664b5c7f2fea44fad6849e5c9b51f8e829793975aadf713831a221" exitCode=0 Mar 18 16:00:31 crc kubenswrapper[4939]: I0318 16:00:31.968985 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mnbf9" event={"ID":"ed370de0-3dea-4bd6-aea2-6de62f548683","Type":"ContainerDied","Data":"39da98db91664b5c7f2fea44fad6849e5c9b51f8e829793975aadf713831a221"} Mar 18 16:00:36 crc kubenswrapper[4939]: I0318 16:00:36.398752 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:00:36 crc kubenswrapper[4939]: I0318 16:00:36.470642 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-jjghp"] Mar 18 16:00:36 crc kubenswrapper[4939]: I0318 16:00:36.470944 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="dnsmasq-dns" containerID="cri-o://a03eccb7e9916ce5fe6ed7588687d3a05db26cb96894e7505653140de5f70298" gracePeriod=10 Mar 18 16:00:37 crc kubenswrapper[4939]: I0318 16:00:37.029476 4939 generic.go:334] "Generic (PLEG): container finished" podID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerID="a03eccb7e9916ce5fe6ed7588687d3a05db26cb96894e7505653140de5f70298" exitCode=0 Mar 18 16:00:37 crc kubenswrapper[4939]: I0318 16:00:37.029545 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" event={"ID":"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e","Type":"ContainerDied","Data":"a03eccb7e9916ce5fe6ed7588687d3a05db26cb96894e7505653140de5f70298"} Mar 18 16:00:37 crc kubenswrapper[4939]: I0318 16:00:37.963727 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Mar 18 16:00:38 crc kubenswrapper[4939]: I0318 16:00:38.179462 4939 scope.go:117] "RemoveContainer" containerID="bf9f56148ff4648ecf2f36b938a50e358d0a841cfa292cf1d512d131dcd1e6e7" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.062330 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mnbf9" event={"ID":"ed370de0-3dea-4bd6-aea2-6de62f548683","Type":"ContainerDied","Data":"2aa4fe72f2d20469a342e094835d40998f83ea85e6a5b0e6db200850497ba013"} Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.062805 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa4fe72f2d20469a342e094835d40998f83ea85e6a5b0e6db200850497ba013" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.106300 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.176238 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4pd4\" (UniqueName: \"kubernetes.io/projected/ed370de0-3dea-4bd6-aea2-6de62f548683-kube-api-access-r4pd4\") pod \"ed370de0-3dea-4bd6-aea2-6de62f548683\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.176291 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-config-data\") pod \"ed370de0-3dea-4bd6-aea2-6de62f548683\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.176483 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-combined-ca-bundle\") pod \"ed370de0-3dea-4bd6-aea2-6de62f548683\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.176559 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-fernet-keys\") pod \"ed370de0-3dea-4bd6-aea2-6de62f548683\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.176677 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-credential-keys\") pod \"ed370de0-3dea-4bd6-aea2-6de62f548683\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.176712 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-scripts\") pod \"ed370de0-3dea-4bd6-aea2-6de62f548683\" (UID: \"ed370de0-3dea-4bd6-aea2-6de62f548683\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.184194 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed370de0-3dea-4bd6-aea2-6de62f548683" (UID: "ed370de0-3dea-4bd6-aea2-6de62f548683"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.184244 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed370de0-3dea-4bd6-aea2-6de62f548683" (UID: "ed370de0-3dea-4bd6-aea2-6de62f548683"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.189107 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-scripts" (OuterVolumeSpecName: "scripts") pod "ed370de0-3dea-4bd6-aea2-6de62f548683" (UID: "ed370de0-3dea-4bd6-aea2-6de62f548683"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.189410 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed370de0-3dea-4bd6-aea2-6de62f548683-kube-api-access-r4pd4" (OuterVolumeSpecName: "kube-api-access-r4pd4") pod "ed370de0-3dea-4bd6-aea2-6de62f548683" (UID: "ed370de0-3dea-4bd6-aea2-6de62f548683"). InnerVolumeSpecName "kube-api-access-r4pd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.211543 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-config-data" (OuterVolumeSpecName: "config-data") pod "ed370de0-3dea-4bd6-aea2-6de62f548683" (UID: "ed370de0-3dea-4bd6-aea2-6de62f548683"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.215531 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed370de0-3dea-4bd6-aea2-6de62f548683" (UID: "ed370de0-3dea-4bd6-aea2-6de62f548683"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.280096 4939 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.280172 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.280186 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.280197 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4pd4\" (UniqueName: \"kubernetes.io/projected/ed370de0-3dea-4bd6-aea2-6de62f548683-kube-api-access-r4pd4\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.280208 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.280217 4939 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed370de0-3dea-4bd6-aea2-6de62f548683-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: E0318 16:00:40.817385 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 18 16:00:40 crc kubenswrapper[4939]: E0318 16:00:40.818187 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zglw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-qvp84_openstack(0ca6e14d-75ec-40af-9670-c413af1391df): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 16:00:40 crc kubenswrapper[4939]: E0318 16:00:40.820028 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-qvp84" podUID="0ca6e14d-75ec-40af-9670-c413af1391df" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.853957 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920224 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-httpd-run\") pod \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920314 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-combined-ca-bundle\") pod \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920351 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg8bn\" (UniqueName: \"kubernetes.io/projected/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-kube-api-access-vg8bn\") pod \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920398 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-logs\") pod \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920595 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-scripts\") pod \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920627 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-config-data\") pod \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920672 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\" (UID: \"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0\") " Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920833 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" (UID: "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.920855 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-logs" (OuterVolumeSpecName: "logs") pod "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" (UID: "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.921408 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.921432 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.927690 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" (UID: "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.927779 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-kube-api-access-vg8bn" (OuterVolumeSpecName: "kube-api-access-vg8bn") pod "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" (UID: "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0"). InnerVolumeSpecName "kube-api-access-vg8bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.928335 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-scripts" (OuterVolumeSpecName: "scripts") pod "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" (UID: "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.949035 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" (UID: "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:40 crc kubenswrapper[4939]: I0318 16:00:40.975609 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-config-data" (OuterVolumeSpecName: "config-data") pod "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" (UID: "731e5d58-5ec1-406d-aa8b-ac91aa22b8b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.023255 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.023296 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg8bn\" (UniqueName: \"kubernetes.io/projected/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-kube-api-access-vg8bn\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.023305 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.023313 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.023346 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.042706 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.094422 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.094459 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"731e5d58-5ec1-406d-aa8b-ac91aa22b8b0","Type":"ContainerDied","Data":"b90927873175c61455f23aa06876b20cf0281afe9202f8841567de0e955ea979"} Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.094494 4939 scope.go:117] "RemoveContainer" containerID="f7a8f72333780ba33a4075f66ecebd7a943df98d8425f1b2df867155f003854e" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.095762 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mnbf9" Mar 18 16:00:41 crc kubenswrapper[4939]: E0318 16:00:41.100396 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-qvp84" podUID="0ca6e14d-75ec-40af-9670-c413af1391df" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.126411 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.149850 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.169627 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.219858 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:41 crc kubenswrapper[4939]: E0318 16:00:41.220216 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-log" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220231 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-log" Mar 18 16:00:41 crc kubenswrapper[4939]: E0318 16:00:41.220247 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221cdac-d183-4363-ba92-a2de3f333b1d" containerName="init" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220253 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221cdac-d183-4363-ba92-a2de3f333b1d" containerName="init" Mar 18 16:00:41 crc kubenswrapper[4939]: E0318 16:00:41.220268 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed370de0-3dea-4bd6-aea2-6de62f548683" containerName="keystone-bootstrap" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220274 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed370de0-3dea-4bd6-aea2-6de62f548683" containerName="keystone-bootstrap" Mar 18 16:00:41 crc kubenswrapper[4939]: E0318 16:00:41.220289 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-httpd" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220294 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-httpd" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220441 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2221cdac-d183-4363-ba92-a2de3f333b1d" containerName="init" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220453 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed370de0-3dea-4bd6-aea2-6de62f548683" containerName="keystone-bootstrap" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220459 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-log" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.220472 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" containerName="glance-httpd" Mar 
18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.221467 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.224123 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.225189 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.227177 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.236083 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mnbf9"] Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.253761 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mnbf9"] Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.309608 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8brg8"] Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.314357 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.317062 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5prk2" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.319030 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.319254 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.320608 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.327247 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.332662 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8brg8"] Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.338154 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.338202 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-logs\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.338224 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 
16:00:41.338248 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.338262 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.338286 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.338311 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wsmv\" (UniqueName: \"kubernetes.io/projected/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-kube-api-access-7wsmv\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.338352 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.440792 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.440872 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.440918 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wsmv\" (UniqueName: \"kubernetes.io/projected/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-kube-api-access-7wsmv\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.440963 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-credential-keys\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " 
pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441022 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441057 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-config-data\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441107 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-combined-ca-bundle\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441131 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2g5w\" (UniqueName: \"kubernetes.io/projected/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-kube-api-access-s2g5w\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441159 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-scripts\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441186 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-fernet-keys\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441217 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441246 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-logs\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441263 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 
crc kubenswrapper[4939]: I0318 16:00:41.441324 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441323 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441580 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.441789 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-logs\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.445709 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.446019 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.446068 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.446357 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.459624 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wsmv\" (UniqueName: \"kubernetes.io/projected/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-kube-api-access-7wsmv\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.476200 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.537876 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.542406 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-config-data\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.542490 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-combined-ca-bundle\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.542539 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2g5w\" (UniqueName: \"kubernetes.io/projected/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-kube-api-access-s2g5w\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.542567 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-scripts\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.542600 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-fernet-keys\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.542681 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-credential-keys\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.546995 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-scripts\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.547667 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-config-data\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.548308 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-combined-ca-bundle\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.549228 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-credential-keys\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.555482 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-fernet-keys\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.561674 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2g5w\" (UniqueName: \"kubernetes.io/projected/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-kube-api-access-s2g5w\") pod \"keystone-bootstrap-8brg8\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:41 crc kubenswrapper[4939]: I0318 16:00:41.638243 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:42 crc kubenswrapper[4939]: I0318 16:00:42.147407 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="731e5d58-5ec1-406d-aa8b-ac91aa22b8b0" path="/var/lib/kubelet/pods/731e5d58-5ec1-406d-aa8b-ac91aa22b8b0/volumes" Mar 18 16:00:42 crc kubenswrapper[4939]: I0318 16:00:42.148859 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed370de0-3dea-4bd6-aea2-6de62f548683" path="/var/lib/kubelet/pods/ed370de0-3dea-4bd6-aea2-6de62f548683/volumes" Mar 18 16:00:42 crc kubenswrapper[4939]: I0318 16:00:42.964178 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Mar 18 16:00:47 crc kubenswrapper[4939]: I0318 16:00:47.963907 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Mar 18 16:00:47 crc kubenswrapper[4939]: I0318 16:00:47.964636 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.541794 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.590527 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-logs\") pod \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.590698 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-config-data\") pod \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.590719 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-combined-ca-bundle\") pod \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.590745 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5nlg\" (UniqueName: \"kubernetes.io/projected/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-kube-api-access-n5nlg\") pod \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.590882 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-httpd-run\") pod \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.590911 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-scripts\") pod \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.590975 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\" (UID: \"b48cae53-0eb8-42aa-a730-98d9e33bcbe4\") " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.592163 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b48cae53-0eb8-42aa-a730-98d9e33bcbe4" (UID: "b48cae53-0eb8-42aa-a730-98d9e33bcbe4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.592485 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-logs" (OuterVolumeSpecName: "logs") pod "b48cae53-0eb8-42aa-a730-98d9e33bcbe4" (UID: "b48cae53-0eb8-42aa-a730-98d9e33bcbe4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.600737 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-scripts" (OuterVolumeSpecName: "scripts") pod "b48cae53-0eb8-42aa-a730-98d9e33bcbe4" (UID: "b48cae53-0eb8-42aa-a730-98d9e33bcbe4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.600821 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b48cae53-0eb8-42aa-a730-98d9e33bcbe4" (UID: "b48cae53-0eb8-42aa-a730-98d9e33bcbe4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.600930 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-kube-api-access-n5nlg" (OuterVolumeSpecName: "kube-api-access-n5nlg") pod "b48cae53-0eb8-42aa-a730-98d9e33bcbe4" (UID: "b48cae53-0eb8-42aa-a730-98d9e33bcbe4"). InnerVolumeSpecName "kube-api-access-n5nlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.648838 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b48cae53-0eb8-42aa-a730-98d9e33bcbe4" (UID: "b48cae53-0eb8-42aa-a730-98d9e33bcbe4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.655686 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-config-data" (OuterVolumeSpecName: "config-data") pod "b48cae53-0eb8-42aa-a730-98d9e33bcbe4" (UID: "b48cae53-0eb8-42aa-a730-98d9e33bcbe4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.693238 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.693279 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.693294 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.693308 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5nlg\" (UniqueName: \"kubernetes.io/projected/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-kube-api-access-n5nlg\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.693320 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.693331 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48cae53-0eb8-42aa-a730-98d9e33bcbe4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.693367 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.714071 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 16:00:48 crc kubenswrapper[4939]: I0318 16:00:48.794979 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.199312 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.199281 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b48cae53-0eb8-42aa-a730-98d9e33bcbe4","Type":"ContainerDied","Data":"474787de23028704e52ac307b00c5f09fad97982468906cd4420034c5e3a67b7"} Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.234499 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.241785 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.261162 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:49 crc kubenswrapper[4939]: E0318 16:00:49.261812 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-httpd" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.261887 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-httpd" Mar 18 16:00:49 crc kubenswrapper[4939]: E0318 16:00:49.261956 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-log" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.262006 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-log" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.262222 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-httpd" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.262295 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" containerName="glance-log" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.263260 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.268337 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.268547 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.269872 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306619 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306699 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306731 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzgj\" (UniqueName: \"kubernetes.io/projected/5bb7097f-d7d3-499e-b412-55f5946b5be4-kube-api-access-cmzgj\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306772 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306828 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306855 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306870 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.306906 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-logs\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.408623 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409169 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409300 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzgj\" (UniqueName: \"kubernetes.io/projected/5bb7097f-d7d3-499e-b412-55f5946b5be4-kube-api-access-cmzgj\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409404 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409530 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409625 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409637 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409701 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.409733 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-logs\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.408858 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.411272 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-logs\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.415134 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.415320 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.418829 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.432185 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.435645 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzgj\" (UniqueName: \"kubernetes.io/projected/5bb7097f-d7d3-499e-b412-55f5946b5be4-kube-api-access-cmzgj\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.447556 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.599108 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.778144 4939 scope.go:117] "RemoveContainer" containerID="42ee2127bc503184af53da0c011f66601403c9535b1c488d82d0383514bb55c9" Mar 18 16:00:49 crc kubenswrapper[4939]: E0318 16:00:49.790525 4939 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 16:00:49 crc kubenswrapper[4939]: E0318 16:00:49.790671 4939 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2ffv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qkn72_openstack(ed545871-ed70-4d38-830a-8a6131455769): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 16:00:49 crc kubenswrapper[4939]: E0318 16:00:49.791852 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qkn72" 
podUID="ed545871-ed70-4d38-830a-8a6131455769" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.854454 4939 scope.go:117] "RemoveContainer" containerID="fd2ba35cca3c4a50e7b30e58a247fc501b09d30b33f3afecf7f97d78f769408b" Mar 18 16:00:49 crc kubenswrapper[4939]: I0318 16:00:49.964991 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.020174 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-svc\") pod \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.020236 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-nb\") pod \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.020316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-swift-storage-0\") pod \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.020349 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-sb\") pod \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.020397 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdslh\" (UniqueName: \"kubernetes.io/projected/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-kube-api-access-vdslh\") pod \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.020493 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-config\") pod \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\" (UID: \"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e\") " Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.029650 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-kube-api-access-vdslh" (OuterVolumeSpecName: "kube-api-access-vdslh") pod "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" (UID: "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e"). InnerVolumeSpecName "kube-api-access-vdslh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.031026 4939 scope.go:117] "RemoveContainer" containerID="80eb66bc6ba29abe33932e55b6b91293422afcddc11a417a80e0c01e5ab3dba3" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.071353 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-config" (OuterVolumeSpecName: "config") pod "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" (UID: "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.074786 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" (UID: "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.075754 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" (UID: "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.083441 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" (UID: "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.093288 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" (UID: "41c6ec72-2dc0-41ac-a7b8-9762244b9a9e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.125752 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.125777 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.125786 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdslh\" (UniqueName: \"kubernetes.io/projected/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-kube-api-access-vdslh\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.125794 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.125803 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.125812 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.148343 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b48cae53-0eb8-42aa-a730-98d9e33bcbe4" path="/var/lib/kubelet/pods/b48cae53-0eb8-42aa-a730-98d9e33bcbe4/volumes" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.213191 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerStarted","Data":"3db423421ad358c3c52cd4a61542661b8160f11149fa834c97411881e84c2a42"} Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.215895 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ms4t8" event={"ID":"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86","Type":"ContainerStarted","Data":"ee5d5f5a29d0ff5499b96b3e87728787466fe48d24d60fe72721d0ff3f3c1528"} Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.219799 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" event={"ID":"41c6ec72-2dc0-41ac-a7b8-9762244b9a9e","Type":"ContainerDied","Data":"7f2a26d1322ea1c09f52b71348e8ea45a6bcb3ce76ff9faa3c90338458468e16"} Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.219857 4939 scope.go:117] "RemoveContainer" containerID="a03eccb7e9916ce5fe6ed7588687d3a05db26cb96894e7505653140de5f70298" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.219813 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-jjghp" Mar 18 16:00:50 crc kubenswrapper[4939]: E0318 16:00:50.224279 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qkn72" podUID="ed545871-ed70-4d38-830a-8a6131455769" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.232547 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ms4t8" podStartSLOduration=2.079146503 podStartE2EDuration="25.232528998s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="2026-03-18 16:00:26.658636313 +0000 UTC m=+1391.257823924" lastFinishedPulling="2026-03-18 16:00:49.812018798 +0000 UTC m=+1414.411206419" observedRunningTime="2026-03-18 16:00:50.229397466 +0000 UTC m=+1414.828585087" watchObservedRunningTime="2026-03-18 16:00:50.232528998 +0000 UTC m=+1414.831716619" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.262568 4939 scope.go:117] "RemoveContainer" containerID="dd35eeb7824c7e7eb1a9e0b90606e31e95956ab43d48d2229cafbed117b809ba" Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.292001 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-jjghp"] Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.300282 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-jjghp"] Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.319581 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8brg8"] Mar 18 16:00:50 crc kubenswrapper[4939]: W0318 16:00:50.338458 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0223f3cb_46bb_4bb1_aa44_b6b259a559f5.slice/crio-e9126676131f8a6459bea68af4e6bfc5aedb1279c354e7075f39102640ca60e5 WatchSource:0}: Error finding container e9126676131f8a6459bea68af4e6bfc5aedb1279c354e7075f39102640ca60e5: Status 404 returned error can't find the 
container with id e9126676131f8a6459bea68af4e6bfc5aedb1279c354e7075f39102640ca60e5 Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.433795 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:50 crc kubenswrapper[4939]: I0318 16:00:50.528626 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:51 crc kubenswrapper[4939]: I0318 16:00:51.250270 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5bb7097f-d7d3-499e-b412-55f5946b5be4","Type":"ContainerStarted","Data":"0c7fe9c4155b4831bd03228350c672b9c68571f96276cee2f3bba62cba25d888"} Mar 18 16:00:51 crc kubenswrapper[4939]: I0318 16:00:51.250893 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5bb7097f-d7d3-499e-b412-55f5946b5be4","Type":"ContainerStarted","Data":"8e7914c48b54971396969a48048593f83a308cae048bd61f7abddccdb68cb989"} Mar 18 16:00:51 crc kubenswrapper[4939]: I0318 16:00:51.252714 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b37a35fa-ce56-4934-88eb-fd24cc5aec4f","Type":"ContainerStarted","Data":"3a58bac812ff61d927ca864d82d746ab03ccf821635ee32714d7731a006b6367"} Mar 18 16:00:51 crc kubenswrapper[4939]: I0318 16:00:51.252747 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b37a35fa-ce56-4934-88eb-fd24cc5aec4f","Type":"ContainerStarted","Data":"4bf53d636b6076656173675261a0a77c047c311b5a77976f2375168d137764c2"} Mar 18 16:00:51 crc kubenswrapper[4939]: I0318 16:00:51.256286 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8brg8" event={"ID":"0223f3cb-46bb-4bb1-aa44-b6b259a559f5","Type":"ContainerStarted","Data":"c4f05427786ea0127178a8ea6b647e8626d443308882a093c4c79509731e6abf"} Mar 18 16:00:51 crc kubenswrapper[4939]: I0318 16:00:51.256313 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8brg8" event={"ID":"0223f3cb-46bb-4bb1-aa44-b6b259a559f5","Type":"ContainerStarted","Data":"e9126676131f8a6459bea68af4e6bfc5aedb1279c354e7075f39102640ca60e5"} Mar 18 16:00:51 crc kubenswrapper[4939]: I0318 16:00:51.282247 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8brg8" podStartSLOduration=10.282227551 podStartE2EDuration="10.282227551s" podCreationTimestamp="2026-03-18 16:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:51.270785437 +0000 UTC m=+1415.869973058" watchObservedRunningTime="2026-03-18 16:00:51.282227551 +0000 UTC m=+1415.881415172" Mar 18 16:00:52 crc kubenswrapper[4939]: I0318 16:00:52.153330 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" path="/var/lib/kubelet/pods/41c6ec72-2dc0-41ac-a7b8-9762244b9a9e/volumes" Mar 18 16:00:52 crc kubenswrapper[4939]: I0318 16:00:52.273037 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerStarted","Data":"be1ceb8e338959d7d82b0d9c8dbd6627d8b0cc7286da501b153090951d662b41"} Mar 18 16:00:52 crc kubenswrapper[4939]: I0318 16:00:52.277248 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"5bb7097f-d7d3-499e-b412-55f5946b5be4","Type":"ContainerStarted","Data":"63e00c85d4fcd04f40420b32206738e70a2b26b69d681fd823a84fb57bf3b4bd"} Mar 18 16:00:52 crc kubenswrapper[4939]: I0318 16:00:52.290489 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b37a35fa-ce56-4934-88eb-fd24cc5aec4f","Type":"ContainerStarted","Data":"6a1240c4afc05c1dc8d7f58258c3a1656a859c201e52d3ef0f5ba0aaf40645bd"} Mar 18 16:00:52 crc kubenswrapper[4939]: I0318 16:00:52.308631 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.308614569 podStartE2EDuration="3.308614569s" podCreationTimestamp="2026-03-18 16:00:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:52.300745133 +0000 UTC m=+1416.899932764" watchObservedRunningTime="2026-03-18 16:00:52.308614569 +0000 UTC m=+1416.907802180" Mar 18 16:00:52 crc kubenswrapper[4939]: I0318 16:00:52.331849 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.331826302 podStartE2EDuration="11.331826302s" podCreationTimestamp="2026-03-18 16:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:52.318433822 +0000 UTC m=+1416.917621443" watchObservedRunningTime="2026-03-18 16:00:52.331826302 +0000 UTC m=+1416.931013923" Mar 18 16:00:53 crc kubenswrapper[4939]: I0318 16:00:53.311982 4939 generic.go:334] "Generic (PLEG): container finished" podID="bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" containerID="ee5d5f5a29d0ff5499b96b3e87728787466fe48d24d60fe72721d0ff3f3c1528" exitCode=0 Mar 18 16:00:53 crc kubenswrapper[4939]: I0318 16:00:53.312300 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ms4t8" event={"ID":"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86","Type":"ContainerDied","Data":"ee5d5f5a29d0ff5499b96b3e87728787466fe48d24d60fe72721d0ff3f3c1528"} Mar 18 16:00:53 crc kubenswrapper[4939]: I0318 16:00:53.687722 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:00:53 crc kubenswrapper[4939]: I0318 16:00:53.688390 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:00:54 crc kubenswrapper[4939]: I0318 16:00:54.322121 4939 generic.go:334] "Generic (PLEG): container finished" podID="0223f3cb-46bb-4bb1-aa44-b6b259a559f5" containerID="c4f05427786ea0127178a8ea6b647e8626d443308882a093c4c79509731e6abf" exitCode=0 Mar 18 16:00:54 crc kubenswrapper[4939]: I0318 16:00:54.322235 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8brg8" event={"ID":"0223f3cb-46bb-4bb1-aa44-b6b259a559f5","Type":"ContainerDied","Data":"c4f05427786ea0127178a8ea6b647e8626d443308882a093c4c79509731e6abf"} Mar 18 16:00:54 crc 
kubenswrapper[4939]: I0318 16:00:54.323603 4939 generic.go:334] "Generic (PLEG): container finished" podID="10912abd-378d-4dd0-abf1-092a5e7d7043" containerID="18de2267ef895322ef9b35671a21963af69c81f3a74b753b0ee5412e9a974231" exitCode=0 Mar 18 16:00:54 crc kubenswrapper[4939]: I0318 16:00:54.323681 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pxrk6" event={"ID":"10912abd-378d-4dd0-abf1-092a5e7d7043","Type":"ContainerDied","Data":"18de2267ef895322ef9b35671a21963af69c81f3a74b753b0ee5412e9a974231"} Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.134089 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.214588 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-combined-ca-bundle\") pod \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.214702 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-scripts\") pod \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.214808 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-logs\") pod \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.214955 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-config-data\") pod \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.215007 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8wr\" (UniqueName: \"kubernetes.io/projected/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-kube-api-access-hx8wr\") pod \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\" (UID: \"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.215564 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-logs" (OuterVolumeSpecName: "logs") pod "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" (UID: "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.221036 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-kube-api-access-hx8wr" (OuterVolumeSpecName: "kube-api-access-hx8wr") pod "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" (UID: "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86"). InnerVolumeSpecName "kube-api-access-hx8wr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.221459 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-scripts" (OuterVolumeSpecName: "scripts") pod "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" (UID: "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.262042 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-config-data" (OuterVolumeSpecName: "config-data") pod "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" (UID: "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.263973 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" (UID: "bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.316649 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.316685 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8wr\" (UniqueName: \"kubernetes.io/projected/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-kube-api-access-hx8wr\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.316700 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.316710 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.316721 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.333950 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerStarted","Data":"084364a45c6c130326221d639d69d9654204e292b1eb519e3a53f951ecd492c0"} Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.335277 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ms4t8" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.335758 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ms4t8" event={"ID":"bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86","Type":"ContainerDied","Data":"d1682b569ffbc9039ec9e59db7dd07e1590a1f31aeb36ad596bc6d7614ce2355"} Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.335801 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1682b569ffbc9039ec9e59db7dd07e1590a1f31aeb36ad596bc6d7614ce2355" Mar 18 16:00:55 crc kubenswrapper[4939]: E0318 16:00:55.421636 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcfa8252_dbf4_4a2a_aab3_5e0d966e5f86.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.438721 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5d598cf76-pjrv7"] Mar 18 16:00:55 crc kubenswrapper[4939]: E0318 16:00:55.439227 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="dnsmasq-dns" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.439251 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="dnsmasq-dns" Mar 18 16:00:55 crc kubenswrapper[4939]: E0318 16:00:55.439305 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="init" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.439317 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="init" Mar 18 16:00:55 crc kubenswrapper[4939]: E0318 16:00:55.439346 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" containerName="placement-db-sync" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.439359 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" containerName="placement-db-sync" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.439708 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" containerName="placement-db-sync" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.439786 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c6ec72-2dc0-41ac-a7b8-9762244b9a9e" containerName="dnsmasq-dns" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.441943 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.450656 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.451549 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.451773 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.451928 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sgjh9" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.452059 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.466597 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d598cf76-pjrv7"] Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.623215 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-scripts\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.623285 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-config-data\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.623328 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-public-tls-certs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.623394 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-internal-tls-certs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.623424 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-logs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.623449 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rmw\" (UniqueName: \"kubernetes.io/projected/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-kube-api-access-p2rmw\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.623469 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-combined-ca-bundle\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.704667 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.725941 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-public-tls-certs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.726101 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-internal-tls-certs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.726170 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-logs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.726233 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rmw\" (UniqueName: \"kubernetes.io/projected/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-kube-api-access-p2rmw\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.726254 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-combined-ca-bundle\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.726705 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-scripts\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.726874 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-config-data\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.728724 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-logs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 
crc kubenswrapper[4939]: I0318 16:00:55.733620 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-config-data\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.739577 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-internal-tls-certs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.740389 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-combined-ca-bundle\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.741012 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-scripts\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.741940 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-public-tls-certs\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.755979 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rmw\" (UniqueName: \"kubernetes.io/projected/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-kube-api-access-p2rmw\") pod \"placement-5d598cf76-pjrv7\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.763159 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.784960 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.828307 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6rs4\" (UniqueName: \"kubernetes.io/projected/10912abd-378d-4dd0-abf1-092a5e7d7043-kube-api-access-z6rs4\") pod \"10912abd-378d-4dd0-abf1-092a5e7d7043\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.828612 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-combined-ca-bundle\") pod \"10912abd-378d-4dd0-abf1-092a5e7d7043\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.828698 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-config\") pod \"10912abd-378d-4dd0-abf1-092a5e7d7043\" (UID: \"10912abd-378d-4dd0-abf1-092a5e7d7043\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.831899 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10912abd-378d-4dd0-abf1-092a5e7d7043-kube-api-access-z6rs4" (OuterVolumeSpecName: "kube-api-access-z6rs4") pod "10912abd-378d-4dd0-abf1-092a5e7d7043" (UID: "10912abd-378d-4dd0-abf1-092a5e7d7043"). InnerVolumeSpecName "kube-api-access-z6rs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.850600 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10912abd-378d-4dd0-abf1-092a5e7d7043" (UID: "10912abd-378d-4dd0-abf1-092a5e7d7043"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.871539 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-config" (OuterVolumeSpecName: "config") pod "10912abd-378d-4dd0-abf1-092a5e7d7043" (UID: "10912abd-378d-4dd0-abf1-092a5e7d7043"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.930367 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-credential-keys\") pod \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.930428 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-fernet-keys\") pod \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.930453 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-config-data\") pod \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.930643 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-scripts\") pod \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.930673 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-combined-ca-bundle\") pod \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.930768 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2g5w\" (UniqueName: \"kubernetes.io/projected/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-kube-api-access-s2g5w\") pod \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\" (UID: \"0223f3cb-46bb-4bb1-aa44-b6b259a559f5\") " Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.931256 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6rs4\" (UniqueName: \"kubernetes.io/projected/10912abd-378d-4dd0-abf1-092a5e7d7043-kube-api-access-z6rs4\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.931274 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.931287 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/10912abd-378d-4dd0-abf1-092a5e7d7043-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.934699 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-kube-api-access-s2g5w" (OuterVolumeSpecName: "kube-api-access-s2g5w") pod "0223f3cb-46bb-4bb1-aa44-b6b259a559f5" (UID: "0223f3cb-46bb-4bb1-aa44-b6b259a559f5"). InnerVolumeSpecName "kube-api-access-s2g5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.935941 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0223f3cb-46bb-4bb1-aa44-b6b259a559f5" (UID: "0223f3cb-46bb-4bb1-aa44-b6b259a559f5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.940603 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0223f3cb-46bb-4bb1-aa44-b6b259a559f5" (UID: "0223f3cb-46bb-4bb1-aa44-b6b259a559f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.940627 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-scripts" (OuterVolumeSpecName: "scripts") pod "0223f3cb-46bb-4bb1-aa44-b6b259a559f5" (UID: "0223f3cb-46bb-4bb1-aa44-b6b259a559f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.964163 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0223f3cb-46bb-4bb1-aa44-b6b259a559f5" (UID: "0223f3cb-46bb-4bb1-aa44-b6b259a559f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:55 crc kubenswrapper[4939]: I0318 16:00:55.976795 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-config-data" (OuterVolumeSpecName: "config-data") pod "0223f3cb-46bb-4bb1-aa44-b6b259a559f5" (UID: "0223f3cb-46bb-4bb1-aa44-b6b259a559f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.033228 4939 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.033269 4939 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.033286 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.033302 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.033318 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.033335 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2g5w\" (UniqueName: \"kubernetes.io/projected/0223f3cb-46bb-4bb1-aa44-b6b259a559f5-kube-api-access-s2g5w\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.223915 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d598cf76-pjrv7"] Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.343325 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d598cf76-pjrv7" event={"ID":"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd","Type":"ContainerStarted","Data":"9d864febb00fc16986b1d08e31b5411fba9a9590343d630fe7c8a2c9fe30c1f2"} Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.344560 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8brg8" event={"ID":"0223f3cb-46bb-4bb1-aa44-b6b259a559f5","Type":"ContainerDied","Data":"e9126676131f8a6459bea68af4e6bfc5aedb1279c354e7075f39102640ca60e5"} Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.344582 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9126676131f8a6459bea68af4e6bfc5aedb1279c354e7075f39102640ca60e5" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.344601 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8brg8" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.349269 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pxrk6" event={"ID":"10912abd-378d-4dd0-abf1-092a5e7d7043","Type":"ContainerDied","Data":"a292f0588022315ec3799772c7c47204c2d8e77d82f46c79ae7339eca7ea21c3"} Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.349306 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a292f0588022315ec3799772c7c47204c2d8e77d82f46c79ae7339eca7ea21c3" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.349383 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pxrk6" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.511630 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56c996c794-vrkm4"] Mar 18 16:00:56 crc kubenswrapper[4939]: E0318 16:00:56.512309 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0223f3cb-46bb-4bb1-aa44-b6b259a559f5" containerName="keystone-bootstrap" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.512324 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0223f3cb-46bb-4bb1-aa44-b6b259a559f5" containerName="keystone-bootstrap" Mar 18 16:00:56 crc kubenswrapper[4939]: E0318 16:00:56.512338 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10912abd-378d-4dd0-abf1-092a5e7d7043" containerName="neutron-db-sync" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.512343 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="10912abd-378d-4dd0-abf1-092a5e7d7043" containerName="neutron-db-sync" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.512522 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0223f3cb-46bb-4bb1-aa44-b6b259a559f5" containerName="keystone-bootstrap" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.514821 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="10912abd-378d-4dd0-abf1-092a5e7d7043" containerName="neutron-db-sync" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.515759 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.523195 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.523429 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5prk2" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.523546 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.523484 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.523745 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.523898 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.534841 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c996c794-vrkm4"] Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.560703 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8shzv"] Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.562225 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.607858 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8shzv"] Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.647789 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7dcffbdd64-d9zm4"] Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.649578 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652726 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-public-tls-certs\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652759 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-internal-tls-certs\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652793 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-combined-ca-bundle\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652820 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-scripts\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652844 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-config-data\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652861 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72m89\" (UniqueName: \"kubernetes.io/projected/878180f2-988b-4d66-aaf0-3429900f5e77-kube-api-access-72m89\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652899 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-fernet-keys\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.652924 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-credential-keys\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.655167 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.655383 4939 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"neutron-neutron-dockercfg-2vzwd" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.655689 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.655818 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.655852 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dcffbdd64-d9zm4"] Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754167 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-fernet-keys\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754207 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-httpd-config\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754255 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-credential-keys\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754309 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkwm5\" (UniqueName: \"kubernetes.io/projected/d234e54d-b414-46be-8668-a3eeb33c7f03-kube-api-access-xkwm5\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754331 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-public-tls-certs\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754350 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-internal-tls-certs\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754379 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-config\") pod 
\"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754397 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754419 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-combined-ca-bundle\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754435 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754465 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-ovndb-tls-certs\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754483 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4vx6\" (UniqueName: \"kubernetes.io/projected/7859058e-a736-4065-bb79-8be528d5a709-kube-api-access-f4vx6\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754516 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-config\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754534 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-scripts\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754558 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-combined-ca-bundle\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754576 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-config-data\") pod 
\"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754593 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72m89\" (UniqueName: \"kubernetes.io/projected/878180f2-988b-4d66-aaf0-3429900f5e77-kube-api-access-72m89\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.754618 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.759219 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-public-tls-certs\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.759798 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-fernet-keys\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.761713 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-combined-ca-bundle\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.762757 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-scripts\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.762981 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-credential-keys\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.763837 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-config-data\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 crc kubenswrapper[4939]: I0318 16:00:56.763940 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-internal-tls-certs\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:56 
crc kubenswrapper[4939]: I0318 16:00:56.780725 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72m89\" (UniqueName: \"kubernetes.io/projected/878180f2-988b-4d66-aaf0-3429900f5e77-kube-api-access-72m89\") pod \"keystone-56c996c794-vrkm4\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.052855 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwm5\" (UniqueName: \"kubernetes.io/projected/d234e54d-b414-46be-8668-a3eeb33c7f03-kube-api-access-xkwm5\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.052926 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-config\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.052946 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.052969 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053001 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053014 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-ovndb-tls-certs\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053035 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4vx6\" (UniqueName: \"kubernetes.io/projected/7859058e-a736-4065-bb79-8be528d5a709-kube-api-access-f4vx6\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053053 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-config\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053079 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-combined-ca-bundle\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053108 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053149 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.053174 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-httpd-config\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.060081 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.061095 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-httpd-config\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 
16:00:57.065287 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.066042 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-config\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.069472 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-ovndb-tls-certs\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.072333 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.074830 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.075402 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-combined-ca-bundle\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.077535 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwm5\" (UniqueName: \"kubernetes.io/projected/d234e54d-b414-46be-8668-a3eeb33c7f03-kube-api-access-xkwm5\") pod \"dnsmasq-dns-55f844cf75-8shzv\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.078344 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-config\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.090394 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4vx6\" (UniqueName: \"kubernetes.io/projected/7859058e-a736-4065-bb79-8be528d5a709-kube-api-access-f4vx6\") pod \"neutron-7dcffbdd64-d9zm4\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.164497 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b8f86c468-qlp5n"] Mar 18 16:00:57 crc kubenswrapper[4939]: 
I0318 16:00:57.167121 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.189713 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b8f86c468-qlp5n"]
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.196840 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8shzv"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.263821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-ovndb-tls-certs\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.266349 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-combined-ca-bundle\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.266452 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-httpd-config\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.266575 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vngvt\" (UniqueName: \"kubernetes.io/projected/73e2003e-c000-4135-9c1e-556cae29d832-kube-api-access-vngvt\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.266703 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-config\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.281573 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dcffbdd64-d9zm4"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.369029 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vngvt\" (UniqueName: \"kubernetes.io/projected/73e2003e-c000-4135-9c1e-556cae29d832-kube-api-access-vngvt\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.369388 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-config\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.369459 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-ovndb-tls-certs\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.369559 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-combined-ca-bundle\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.369607 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-httpd-config\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.375537 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-httpd-config\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.377636 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-ovndb-tls-certs\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.390006 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-config\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.391116 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-combined-ca-bundle\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.393466 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d598cf76-pjrv7" event={"ID":"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd","Type":"ContainerStarted","Data":"db9387db45857f14d1bc75444606939dc75c93442d1661ce4ad987885fad4deb"}
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.393541 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d598cf76-pjrv7" event={"ID":"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd","Type":"ContainerStarted","Data":"158535dbdc0da716361bdda232739bce8feebc1b0c21fc44eafb03d99278d41a"}
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.396094 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d598cf76-pjrv7"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.396128 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d598cf76-pjrv7"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.400836 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vngvt\" (UniqueName: \"kubernetes.io/projected/73e2003e-c000-4135-9c1e-556cae29d832-kube-api-access-vngvt\") pod \"neutron-b8f86c468-qlp5n\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.403901 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.409718 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qvp84" event={"ID":"0ca6e14d-75ec-40af-9670-c413af1391df","Type":"ContainerStarted","Data":"08f1488f1748f3e79870960db845efd33610723a7e30d14a11fe46d3bd8c9767"}
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.436878 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5d598cf76-pjrv7" podStartSLOduration=2.436858993 podStartE2EDuration="2.436858993s" podCreationTimestamp="2026-03-18 16:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:57.414805146 +0000 UTC m=+1422.013992767" watchObservedRunningTime="2026-03-18 16:00:57.436858993 +0000 UTC m=+1422.036046614"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.437756 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-qvp84" podStartSLOduration=2.343087362 podStartE2EDuration="32.437749905s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="2026-03-18 16:00:26.554720692 +0000 UTC m=+1391.153908313" lastFinishedPulling="2026-03-18 16:00:56.649383235 +0000 UTC m=+1421.248570856" observedRunningTime="2026-03-18 16:00:57.435085629 +0000 UTC m=+1422.034273250" watchObservedRunningTime="2026-03-18 16:00:57.437749905 +0000 UTC m=+1422.036937526"
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.666272 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c996c794-vrkm4"]
Mar 18 16:00:57 crc kubenswrapper[4939]: W0318 16:00:57.682652 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod878180f2_988b_4d66_aaf0_3429900f5e77.slice/crio-449ec1eecbccac7f81aed70980a52ef6a60a1c555743f4e771718d4a392cead8 WatchSource:0}: Error finding container 449ec1eecbccac7f81aed70980a52ef6a60a1c555743f4e771718d4a392cead8: Status 404 returned error can't find the container with id 449ec1eecbccac7f81aed70980a52ef6a60a1c555743f4e771718d4a392cead8
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.907135 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dcffbdd64-d9zm4"]
Mar 18 16:00:57 crc kubenswrapper[4939]: I0318 16:00:57.927855 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8shzv"]
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.116143 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b8f86c468-qlp5n"]
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.439871 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dcffbdd64-d9zm4" event={"ID":"7859058e-a736-4065-bb79-8be528d5a709","Type":"ContainerStarted","Data":"1352f04be604c8cd3fdc9b3e4b2d58820675611e8951fd73ba5ae7fd614d71dd"}
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.439936 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dcffbdd64-d9zm4" event={"ID":"7859058e-a736-4065-bb79-8be528d5a709","Type":"ContainerStarted","Data":"a5788169ea06823ef4783a0fc98462e542e152b3de83f941de58cbfcf2e5012d"}
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.444251 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f86c468-qlp5n" event={"ID":"73e2003e-c000-4135-9c1e-556cae29d832","Type":"ContainerStarted","Data":"9c838307b1924d5cb4ac8f001bd93aef896f2657e89d2dea39c4f605b31ef322"}
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.446764 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c996c794-vrkm4" event={"ID":"878180f2-988b-4d66-aaf0-3429900f5e77","Type":"ContainerStarted","Data":"add0a3162f0fb4bc567ed074ad66216e38747696a6b5e808eb5932b5e0024e79"}
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.446799 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c996c794-vrkm4" event={"ID":"878180f2-988b-4d66-aaf0-3429900f5e77","Type":"ContainerStarted","Data":"449ec1eecbccac7f81aed70980a52ef6a60a1c555743f4e771718d4a392cead8"}
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.447642 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56c996c794-vrkm4"
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.456911 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" event={"ID":"d234e54d-b414-46be-8668-a3eeb33c7f03","Type":"ContainerStarted","Data":"623d50fc40a8298c4707b45313dcbc936441c2236b9dad676567394d2a7fe1ff"}
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.457001 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" event={"ID":"d234e54d-b414-46be-8668-a3eeb33c7f03","Type":"ContainerStarted","Data":"0ed0c36e8bd9a5464868000f1c1cdf5e76284a717c0c9165495ff444adc10da4"}
Mar 18 16:00:58 crc kubenswrapper[4939]: I0318 16:00:58.476711 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56c996c794-vrkm4" podStartSLOduration=2.476693583 podStartE2EDuration="2.476693583s" podCreationTimestamp="2026-03-18 16:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:58.467380428 +0000 UTC m=+1423.066568069" watchObservedRunningTime="2026-03-18 16:00:58.476693583 +0000 UTC m=+1423.075881204"
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.480178 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dcffbdd64-d9zm4" event={"ID":"7859058e-a736-4065-bb79-8be528d5a709","Type":"ContainerStarted","Data":"bc597c857f5f728e5137606cedc139017a2b4a9cbbad872803bf70dda3d4ea6f"}
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.480764 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7dcffbdd64-d9zm4"
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.508336 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f86c468-qlp5n" event={"ID":"73e2003e-c000-4135-9c1e-556cae29d832","Type":"ContainerStarted","Data":"75cb3253f0a799e23784270d1f152480e5a674daa02ecb8e6df7c31fc17d5e37"}
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.511705 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7dcffbdd64-d9zm4" podStartSLOduration=3.5116872470000002 podStartE2EDuration="3.511687247s" podCreationTimestamp="2026-03-18 16:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:59.507797415 +0000 UTC m=+1424.106985036" watchObservedRunningTime="2026-03-18 16:00:59.511687247 +0000 UTC m=+1424.110874868"
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.512256 4939 generic.go:334] "Generic (PLEG): container finished" podID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerID="623d50fc40a8298c4707b45313dcbc936441c2236b9dad676567394d2a7fe1ff" exitCode=0
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.513907 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" event={"ID":"d234e54d-b414-46be-8668-a3eeb33c7f03","Type":"ContainerDied","Data":"623d50fc40a8298c4707b45313dcbc936441c2236b9dad676567394d2a7fe1ff"}
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.599772 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.599810 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.646575 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 18 16:00:59 crc kubenswrapper[4939]: I0318 16:00:59.665952 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.151164 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564161-466kv"]
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.152423 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.168298 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7dcffbdd64-d9zm4"]
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.187693 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564161-466kv"]
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.211591 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d7467855-ml675"]
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.213179 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.219154 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.219683 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.223800 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d7467855-ml675"]
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295403 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-config-data\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295454 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-combined-ca-bundle\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295575 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-ovndb-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295600 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-internal-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295645 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-httpd-config\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295675 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wncrh\" (UniqueName: \"kubernetes.io/projected/ecde231d-a07e-4f59-81bb-fc4608e906ea-kube-api-access-wncrh\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295692 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxkz4\" (UniqueName: \"kubernetes.io/projected/834aca75-038f-4aed-8d55-bb1924b96934-kube-api-access-nxkz4\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295708 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-fernet-keys\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295735 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-public-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295776 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-combined-ca-bundle\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.295794 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-config\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.398318 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-httpd-config\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.398921 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wncrh\" (UniqueName: \"kubernetes.io/projected/ecde231d-a07e-4f59-81bb-fc4608e906ea-kube-api-access-wncrh\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.398984 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxkz4\" (UniqueName: \"kubernetes.io/projected/834aca75-038f-4aed-8d55-bb1924b96934-kube-api-access-nxkz4\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399012 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-fernet-keys\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399057 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-public-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399132 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-combined-ca-bundle\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399153 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-config\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399198 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-config-data\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399241 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-combined-ca-bundle\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399302 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-ovndb-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.399337 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-internal-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.405864 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-internal-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.405866 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-httpd-config\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.408120 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-combined-ca-bundle\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.408998 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-public-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.411050 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-config\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.414995 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-combined-ca-bundle\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.415093 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-ovndb-tls-certs\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.418619 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-config-data\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.421039 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxkz4\" (UniqueName: \"kubernetes.io/projected/834aca75-038f-4aed-8d55-bb1924b96934-kube-api-access-nxkz4\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.421437 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wncrh\" (UniqueName: \"kubernetes.io/projected/ecde231d-a07e-4f59-81bb-fc4608e906ea-kube-api-access-wncrh\") pod \"neutron-5d7467855-ml675\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.421704 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-fernet-keys\") pod \"keystone-cron-29564161-466kv\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") " pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.471554 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.532148 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f86c468-qlp5n" event={"ID":"73e2003e-c000-4135-9c1e-556cae29d832","Type":"ContainerStarted","Data":"b23c9f2c3b2d19c67b54836359307e59dd6daff6f0021c175f1c13875fd01dc3"}
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.532827 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.533431 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.544908 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" event={"ID":"d234e54d-b414-46be-8668-a3eeb33c7f03","Type":"ContainerStarted","Data":"8a901e789f36af56195efb10612dc4cf8cb42d65c23e4bf1d33387c4f7fe29e2"}
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.546170 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.546233 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-8shzv"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.547127 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.592785 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b8f86c468-qlp5n" podStartSLOduration=3.592759583 podStartE2EDuration="3.592759583s" podCreationTimestamp="2026-03-18 16:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:00.567952179 +0000 UTC m=+1425.167139820" watchObservedRunningTime="2026-03-18 16:01:00.592759583 +0000 UTC m=+1425.191947204"
Mar 18 16:01:00 crc kubenswrapper[4939]: I0318 16:01:00.593283 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" podStartSLOduration=4.59327596 podStartE2EDuration="4.59327596s" podCreationTimestamp="2026-03-18 16:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:00.586881764 +0000 UTC m=+1425.186069395" watchObservedRunningTime="2026-03-18 16:01:00.59327596 +0000 UTC m=+1425.192463581"
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.150316 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564161-466kv"]
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.224876 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d7467855-ml675"]
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.538824 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.538886 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.555186 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-466kv" event={"ID":"834aca75-038f-4aed-8d55-bb1924b96934","Type":"ContainerStarted","Data":"6b8e2499c7541a0f9cf78895b03b9389a427404109be620d47865829c7a89f85"}
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.555246 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-466kv" event={"ID":"834aca75-038f-4aed-8d55-bb1924b96934","Type":"ContainerStarted","Data":"25957d8b8f302c29d1568078ca2b615bd685d10c0f79f19f7878386bc7207f23"}
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.563767 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d7467855-ml675" event={"ID":"ecde231d-a07e-4f59-81bb-fc4608e906ea","Type":"ContainerStarted","Data":"57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2"}
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.564075 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d7467855-ml675" event={"ID":"ecde231d-a07e-4f59-81bb-fc4608e906ea","Type":"ContainerStarted","Data":"4be05dd2b01488168768f6e64a76d246c025e9debcc6432323ddeee41a5cbd61"}
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.564630 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dcffbdd64-d9zm4" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-httpd" containerID="cri-o://bc597c857f5f728e5137606cedc139017a2b4a9cbbad872803bf70dda3d4ea6f" gracePeriod=30
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.564305 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dcffbdd64-d9zm4" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-api" containerID="cri-o://1352f04be604c8cd3fdc9b3e4b2d58820675611e8951fd73ba5ae7fd614d71dd" gracePeriod=30
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.577152 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564161-466kv" podStartSLOduration=1.5771278180000001 podStartE2EDuration="1.577127818s" podCreationTimestamp="2026-03-18 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:01.572964165 +0000 UTC m=+1426.172151806" watchObservedRunningTime="2026-03-18 16:01:01.577127818 +0000 UTC m=+1426.176315439"
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.582910 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.584262 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:01 crc kubenswrapper[4939]: I0318 16:01:01.626032 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:02 crc kubenswrapper[4939]: I0318 16:01:02.575398 4939 generic.go:334] "Generic (PLEG): container finished" podID="7859058e-a736-4065-bb79-8be528d5a709" containerID="bc597c857f5f728e5137606cedc139017a2b4a9cbbad872803bf70dda3d4ea6f" exitCode=0
Mar 18 16:01:02 crc kubenswrapper[4939]: I0318 16:01:02.575466 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dcffbdd64-d9zm4" event={"ID":"7859058e-a736-4065-bb79-8be528d5a709","Type":"ContainerDied","Data":"bc597c857f5f728e5137606cedc139017a2b4a9cbbad872803bf70dda3d4ea6f"}
Mar 18 16:01:02 crc kubenswrapper[4939]: I0318 16:01:02.576407 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:03 crc kubenswrapper[4939]: I0318 16:01:03.420766 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 18 16:01:03 crc kubenswrapper[4939]: I0318 16:01:03.420893 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 16:01:03 crc kubenswrapper[4939]: I0318 16:01:03.427576 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 18 16:01:03 crc kubenswrapper[4939]: I0318 16:01:03.583731 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 16:01:03 crc kubenswrapper[4939]: I0318 16:01:03.834153 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:03 crc kubenswrapper[4939]: I0318 16:01:03.836493 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 18 16:01:04 crc kubenswrapper[4939]: I0318 16:01:04.599995 4939 generic.go:334] "Generic (PLEG): container finished" podID="834aca75-038f-4aed-8d55-bb1924b96934" containerID="6b8e2499c7541a0f9cf78895b03b9389a427404109be620d47865829c7a89f85" exitCode=0
Mar 18 16:01:04 crc kubenswrapper[4939]: I0318 16:01:04.600083 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-466kv" event={"ID":"834aca75-038f-4aed-8d55-bb1924b96934","Type":"ContainerDied","Data":"6b8e2499c7541a0f9cf78895b03b9389a427404109be620d47865829c7a89f85"}
Mar 18 16:01:05 crc kubenswrapper[4939]: I0318 16:01:05.620598 4939 generic.go:334] "Generic (PLEG): container finished" podID="0ca6e14d-75ec-40af-9670-c413af1391df" containerID="08f1488f1748f3e79870960db845efd33610723a7e30d14a11fe46d3bd8c9767" exitCode=0
Mar 18 16:01:05 crc kubenswrapper[4939]: I0318 16:01:05.621066 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qvp84" event={"ID":"0ca6e14d-75ec-40af-9670-c413af1391df","Type":"ContainerDied","Data":"08f1488f1748f3e79870960db845efd33610723a7e30d14a11fe46d3bd8c9767"}
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.322972 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.365467 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxkz4\" (UniqueName: \"kubernetes.io/projected/834aca75-038f-4aed-8d55-bb1924b96934-kube-api-access-nxkz4\") pod \"834aca75-038f-4aed-8d55-bb1924b96934\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") "
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.365570 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-config-data\") pod \"834aca75-038f-4aed-8d55-bb1924b96934\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") "
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.365632 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-combined-ca-bundle\") pod \"834aca75-038f-4aed-8d55-bb1924b96934\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") "
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.365693 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-fernet-keys\") pod \"834aca75-038f-4aed-8d55-bb1924b96934\" (UID: \"834aca75-038f-4aed-8d55-bb1924b96934\") "
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.372462 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834aca75-038f-4aed-8d55-bb1924b96934-kube-api-access-nxkz4" (OuterVolumeSpecName: "kube-api-access-nxkz4") pod "834aca75-038f-4aed-8d55-bb1924b96934" (UID: "834aca75-038f-4aed-8d55-bb1924b96934"). InnerVolumeSpecName "kube-api-access-nxkz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.377701 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "834aca75-038f-4aed-8d55-bb1924b96934" (UID: "834aca75-038f-4aed-8d55-bb1924b96934"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.405709 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "834aca75-038f-4aed-8d55-bb1924b96934" (UID: "834aca75-038f-4aed-8d55-bb1924b96934"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.432586 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-config-data" (OuterVolumeSpecName: "config-data") pod "834aca75-038f-4aed-8d55-bb1924b96934" (UID: "834aca75-038f-4aed-8d55-bb1924b96934"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.467690 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.467731 4939 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.467742 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxkz4\" (UniqueName: \"kubernetes.io/projected/834aca75-038f-4aed-8d55-bb1924b96934-kube-api-access-nxkz4\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.467753 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834aca75-038f-4aed-8d55-bb1924b96934-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.634224 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564161-466kv"
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.640053 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-466kv" event={"ID":"834aca75-038f-4aed-8d55-bb1924b96934","Type":"ContainerDied","Data":"25957d8b8f302c29d1568078ca2b615bd685d10c0f79f19f7878386bc7207f23"}
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.640178 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25957d8b8f302c29d1568078ca2b615bd685d10c0f79f19f7878386bc7207f23"
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.906929 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qvp84"
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.979019 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zglw8\" (UniqueName: \"kubernetes.io/projected/0ca6e14d-75ec-40af-9670-c413af1391df-kube-api-access-zglw8\") pod \"0ca6e14d-75ec-40af-9670-c413af1391df\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") "
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.979069 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-combined-ca-bundle\") pod \"0ca6e14d-75ec-40af-9670-c413af1391df\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") "
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.979102 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-db-sync-config-data\") pod \"0ca6e14d-75ec-40af-9670-c413af1391df\" (UID: \"0ca6e14d-75ec-40af-9670-c413af1391df\") "
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.983778 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ca6e14d-75ec-40af-9670-c413af1391df" (UID: "0ca6e14d-75ec-40af-9670-c413af1391df"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:06 crc kubenswrapper[4939]: I0318 16:01:06.983915 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca6e14d-75ec-40af-9670-c413af1391df-kube-api-access-zglw8" (OuterVolumeSpecName: "kube-api-access-zglw8") pod "0ca6e14d-75ec-40af-9670-c413af1391df" (UID: "0ca6e14d-75ec-40af-9670-c413af1391df"). InnerVolumeSpecName "kube-api-access-zglw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.006576 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ca6e14d-75ec-40af-9670-c413af1391df" (UID: "0ca6e14d-75ec-40af-9670-c413af1391df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.082018 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zglw8\" (UniqueName: \"kubernetes.io/projected/0ca6e14d-75ec-40af-9670-c413af1391df-kube-api-access-zglw8\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.082068 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.082082 4939 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ca6e14d-75ec-40af-9670-c413af1391df-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.199171 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-8shzv"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.281455 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kx7lw"]
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.282865 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" podUID="2695722d-fc7c-4965-ae10-f20e016051d3" containerName="dnsmasq-dns" containerID="cri-o://bab1bdac7d712472395e845130fa5c2396afaa42a2d98b888bbd0dcfa6ba492d" gracePeriod=10
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.657233 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qvp84" event={"ID":"0ca6e14d-75ec-40af-9670-c413af1391df","Type":"ContainerDied","Data":"067367095e2219c250ba9054b67c161991e660a981b9519d856d27fe734f79ac"}
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.657692 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067367095e2219c250ba9054b67c161991e660a981b9519d856d27fe734f79ac"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.657755 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qvp84"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.661753 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkn72" event={"ID":"ed545871-ed70-4d38-830a-8a6131455769","Type":"ContainerStarted","Data":"703e1a3bacf7c889b4eaa030926ad758deed0401540052152182a6de21a167bc"}
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.673665 4939 generic.go:334] "Generic (PLEG): container finished" podID="2695722d-fc7c-4965-ae10-f20e016051d3" containerID="bab1bdac7d712472395e845130fa5c2396afaa42a2d98b888bbd0dcfa6ba492d" exitCode=0
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.673757 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" event={"ID":"2695722d-fc7c-4965-ae10-f20e016051d3","Type":"ContainerDied","Data":"bab1bdac7d712472395e845130fa5c2396afaa42a2d98b888bbd0dcfa6ba492d"}
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.685350 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerStarted","Data":"31fe77a3dc7834d62a3db951e231c6a4d499f91b668eada5ae33113327064985"}
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.685534 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="ceilometer-central-agent" containerID="cri-o://3db423421ad358c3c52cd4a61542661b8160f11149fa834c97411881e84c2a42" gracePeriod=30
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.685610 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.685627 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="proxy-httpd" containerID="cri-o://31fe77a3dc7834d62a3db951e231c6a4d499f91b668eada5ae33113327064985" gracePeriod=30
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.685678 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="ceilometer-notification-agent" containerID="cri-o://be1ceb8e338959d7d82b0d9c8dbd6627d8b0cc7286da501b153090951d662b41" gracePeriod=30
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.685795 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="sg-core" containerID="cri-o://084364a45c6c130326221d639d69d9654204e292b1eb519e3a53f951ecd492c0" gracePeriod=30
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.700887 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qkn72" podStartSLOduration=2.576210009 podStartE2EDuration="42.700862185s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="2026-03-18 16:00:26.657428627 +0000 UTC m=+1391.256616248" lastFinishedPulling="2026-03-18 16:01:06.782080803 +0000 UTC m=+1431.381268424" observedRunningTime="2026-03-18 16:01:07.696957104 +0000 UTC m=+1432.296144725" watchObservedRunningTime="2026-03-18 16:01:07.700862185 +0000 UTC m=+1432.300049796"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.705898 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d7467855-ml675" event={"ID":"ecde231d-a07e-4f59-81bb-fc4608e906ea","Type":"ContainerStarted","Data":"e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa"}
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.706776 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d7467855-ml675"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.732831 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.261298705 podStartE2EDuration="42.732812769s" podCreationTimestamp="2026-03-18 16:00:25 +0000 UTC" firstStartedPulling="2026-03-18 16:00:26.369715178 +0000 UTC m=+1390.968902799" lastFinishedPulling="2026-03-18 16:01:06.841229242 +0000 UTC m=+1431.440416863" observedRunningTime="2026-03-18 16:01:07.720487401 +0000 UTC m=+1432.319675022" watchObservedRunningTime="2026-03-18 16:01:07.732812769 +0000 UTC m=+1432.332000390"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.770419 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d7467855-ml675" podStartSLOduration=7.770401867 podStartE2EDuration="7.770401867s" podCreationTimestamp="2026-03-18 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:07.759195538 +0000 UTC m=+1432.358383159" watchObservedRunningTime="2026-03-18 16:01:07.770401867 +0000 UTC m=+1432.369589488"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.919326 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6659dc68fd-4444w"]
Mar 18 16:01:07 crc kubenswrapper[4939]: E0318 16:01:07.920724 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca6e14d-75ec-40af-9670-c413af1391df" containerName="barbican-db-sync"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.920739 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca6e14d-75ec-40af-9670-c413af1391df" containerName="barbican-db-sync"
Mar 18 16:01:07 crc kubenswrapper[4939]: E0318 16:01:07.920750 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834aca75-038f-4aed-8d55-bb1924b96934" containerName="keystone-cron"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.920756 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="834aca75-038f-4aed-8d55-bb1924b96934" containerName="keystone-cron"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.920957 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca6e14d-75ec-40af-9670-c413af1391df" containerName="barbican-db-sync"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.920976 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="834aca75-038f-4aed-8d55-bb1924b96934" containerName="keystone-cron"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.981788 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.984834 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-98jnl"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.985078 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 18 16:01:07 crc kubenswrapper[4939]: I0318 16:01:07.986076 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.007219 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5b4578f6d7-lcqvz"]
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.009806 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.018030 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.049728 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6659dc68fd-4444w"]
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076335 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86474b5e-6fc8-4810-a083-699878062ade-logs\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076446 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q26g9\" (UniqueName: \"kubernetes.io/projected/86474b5e-6fc8-4810-a083-699878062ade-kube-api-access-q26g9\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076472 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076577 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-585cb\" (UniqueName: \"kubernetes.io/projected/8c83b398-2fa8-4862-a2fe-6f66e3200216-kube-api-access-585cb\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076611 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c83b398-2fa8-4862-a2fe-6f66e3200216-logs\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076633 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-combined-ca-bundle\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076677 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076733 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data-custom\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076757 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data-custom\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.076773 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-combined-ca-bundle\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.087713 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5b4578f6d7-lcqvz"]
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.111576 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s2qlq"]
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.113102 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.121381 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.172693 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s2qlq"]
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178116 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-svc\") pod \"2695722d-fc7c-4965-ae10-f20e016051d3\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") "
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178191 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-nb\") pod \"2695722d-fc7c-4965-ae10-f20e016051d3\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") "
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178281 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-config\") pod \"2695722d-fc7c-4965-ae10-f20e016051d3\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") "
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178332 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96hf2\" (UniqueName: \"kubernetes.io/projected/2695722d-fc7c-4965-ae10-f20e016051d3-kube-api-access-96hf2\") pod \"2695722d-fc7c-4965-ae10-f20e016051d3\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") "
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178363 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-sb\") pod \"2695722d-fc7c-4965-ae10-f20e016051d3\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") "
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178419 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-swift-storage-0\") pod \"2695722d-fc7c-4965-ae10-f20e016051d3\" (UID: \"2695722d-fc7c-4965-ae10-f20e016051d3\") "
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178642 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178677 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178722 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data-custom\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178742 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178761 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data-custom\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178779 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-combined-ca-bundle\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178808 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86474b5e-6fc8-4810-a083-699878062ade-logs\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178836 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178854 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-config\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178872 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q26g9\" (UniqueName: \"kubernetes.io/projected/86474b5e-6fc8-4810-a083-699878062ade-kube-api-access-q26g9\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178890 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w"
Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178921 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-svc\")
pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178972 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-585cb\" (UniqueName: \"kubernetes.io/projected/8c83b398-2fa8-4862-a2fe-6f66e3200216-kube-api-access-585cb\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.178995 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c83b398-2fa8-4862-a2fe-6f66e3200216-logs\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.179014 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-combined-ca-bundle\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.179037 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqfz\" (UniqueName: \"kubernetes.io/projected/29a706be-274f-404c-bbca-df92c297d83b-kube-api-access-mfqfz\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.191225 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data-custom\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.192054 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-combined-ca-bundle\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.192268 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86474b5e-6fc8-4810-a083-699878062ade-logs\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.192531 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c83b398-2fa8-4862-a2fe-6f66e3200216-logs\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.200077 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2695722d-fc7c-4965-ae10-f20e016051d3-kube-api-access-96hf2" (OuterVolumeSpecName: "kube-api-access-96hf2") pod "2695722d-fc7c-4965-ae10-f20e016051d3" (UID: "2695722d-fc7c-4965-ae10-f20e016051d3"). InnerVolumeSpecName "kube-api-access-96hf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.201100 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.210006 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q26g9\" (UniqueName: \"kubernetes.io/projected/86474b5e-6fc8-4810-a083-699878062ade-kube-api-access-q26g9\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.210458 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data-custom\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.211152 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.213596 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-combined-ca-bundle\") pod \"barbican-worker-5b4578f6d7-lcqvz\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.229097 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-585cb\" (UniqueName: \"kubernetes.io/projected/8c83b398-2fa8-4862-a2fe-6f66e3200216-kube-api-access-585cb\") pod \"barbican-keystone-listener-6659dc68fd-4444w\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.229164 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-665d48dd94-86cx7"] Mar 18 16:01:08 crc kubenswrapper[4939]: E0318 16:01:08.229555 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2695722d-fc7c-4965-ae10-f20e016051d3" containerName="init" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.229577 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2695722d-fc7c-4965-ae10-f20e016051d3" containerName="init" Mar 18 16:01:08 crc kubenswrapper[4939]: E0318 16:01:08.229593 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2695722d-fc7c-4965-ae10-f20e016051d3" containerName="dnsmasq-dns" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.229600 4939 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2695722d-fc7c-4965-ae10-f20e016051d3" containerName="dnsmasq-dns" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.229770 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2695722d-fc7c-4965-ae10-f20e016051d3" containerName="dnsmasq-dns" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.230625 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.236981 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-665d48dd94-86cx7"] Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.254303 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.283680 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-combined-ca-bundle\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.283743 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e733f9e3-4be0-4265-9404-f88366733500-logs\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.283777 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-svc\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.283848 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjrr\" (UniqueName: \"kubernetes.io/projected/e733f9e3-4be0-4265-9404-f88366733500-kube-api-access-mgjrr\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.283935 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqfz\" (UniqueName: \"kubernetes.io/projected/29a706be-274f-404c-bbca-df92c297d83b-kube-api-access-mfqfz\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.283966 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.284055 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data\") pod \"barbican-api-665d48dd94-86cx7\" (UID: 
\"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.284073 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.284157 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.284183 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-config\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.284215 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data-custom\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.284333 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96hf2\" (UniqueName: \"kubernetes.io/projected/2695722d-fc7c-4965-ae10-f20e016051d3-kube-api-access-96hf2\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.285199 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-svc\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.287213 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.287933 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.288133 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2695722d-fc7c-4965-ae10-f20e016051d3" (UID: "2695722d-fc7c-4965-ae10-f20e016051d3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.288563 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.288806 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-config\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.334631 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqfz\" (UniqueName: \"kubernetes.io/projected/29a706be-274f-404c-bbca-df92c297d83b-kube-api-access-mfqfz\") pod \"dnsmasq-dns-85ff748b95-s2qlq\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.339526 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2695722d-fc7c-4965-ae10-f20e016051d3" (UID: "2695722d-fc7c-4965-ae10-f20e016051d3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.352011 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2695722d-fc7c-4965-ae10-f20e016051d3" (UID: "2695722d-fc7c-4965-ae10-f20e016051d3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.365018 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-config" (OuterVolumeSpecName: "config") pod "2695722d-fc7c-4965-ae10-f20e016051d3" (UID: "2695722d-fc7c-4965-ae10-f20e016051d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.372492 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2695722d-fc7c-4965-ae10-f20e016051d3" (UID: "2695722d-fc7c-4965-ae10-f20e016051d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.377077 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.388421 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.389105 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data-custom\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.389230 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-combined-ca-bundle\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.389341 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e733f9e3-4be0-4265-9404-f88366733500-logs\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.389551 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjrr\" (UniqueName: \"kubernetes.io/projected/e733f9e3-4be0-4265-9404-f88366733500-kube-api-access-mgjrr\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.390132 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.390278 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.390342 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.390404 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.390457 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2695722d-fc7c-4965-ae10-f20e016051d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.394342 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e733f9e3-4be0-4265-9404-f88366733500-logs\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.397982 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-combined-ca-bundle\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.402114 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.407811 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data-custom\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.413184 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjrr\" (UniqueName: \"kubernetes.io/projected/e733f9e3-4be0-4265-9404-f88366733500-kube-api-access-mgjrr\") pod \"barbican-api-665d48dd94-86cx7\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.421622 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.456684 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.589168 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.738744 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" event={"ID":"2695722d-fc7c-4965-ae10-f20e016051d3","Type":"ContainerDied","Data":"0bbbdf779f5a4b3cbd40fccdfd67b1e473397c9c78c9b427644367e365eecaf7"} Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.738780 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kx7lw" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.738794 4939 scope.go:117] "RemoveContainer" containerID="bab1bdac7d712472395e845130fa5c2396afaa42a2d98b888bbd0dcfa6ba492d" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.759672 4939 generic.go:334] "Generic (PLEG): container finished" podID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerID="31fe77a3dc7834d62a3db951e231c6a4d499f91b668eada5ae33113327064985" exitCode=0 Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.759697 4939 generic.go:334] "Generic (PLEG): container finished" podID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerID="084364a45c6c130326221d639d69d9654204e292b1eb519e3a53f951ecd492c0" exitCode=2 Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.759706 4939 generic.go:334] "Generic (PLEG): container finished" podID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerID="3db423421ad358c3c52cd4a61542661b8160f11149fa834c97411881e84c2a42" exitCode=0 Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.759739 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerDied","Data":"31fe77a3dc7834d62a3db951e231c6a4d499f91b668eada5ae33113327064985"} Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.759763 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerDied","Data":"084364a45c6c130326221d639d69d9654204e292b1eb519e3a53f951ecd492c0"} Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.759774 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerDied","Data":"3db423421ad358c3c52cd4a61542661b8160f11149fa834c97411881e84c2a42"} Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.776438 4939 scope.go:117] "RemoveContainer" containerID="fac5b8602469e3a90d113bb1fb4bdc434c1e3ba2c6a1cd7281dc6f7d8c39ae8d" Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.785613 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kx7lw"] Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.802788 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kx7lw"] Mar 18 16:01:08 crc kubenswrapper[4939]: I0318 16:01:08.914260 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6659dc68fd-4444w"] Mar 18 16:01:09 crc kubenswrapper[4939]: W0318 16:01:09.023485 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86474b5e_6fc8_4810_a083_699878062ade.slice/crio-8587f5770eb542c99226063dfc98d2ea099eeea1e39c26e3cdce054d48eb7293 WatchSource:0}: Error finding container 8587f5770eb542c99226063dfc98d2ea099eeea1e39c26e3cdce054d48eb7293: Status 404 returned error can't find the container with id 8587f5770eb542c99226063dfc98d2ea099eeea1e39c26e3cdce054d48eb7293 Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.024414 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5b4578f6d7-lcqvz"] Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.143686 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-665d48dd94-86cx7"] Mar 18 16:01:09 crc kubenswrapper[4939]: W0318 16:01:09.145413 4939 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode733f9e3_4be0_4265_9404_f88366733500.slice/crio-8cd0c7eb7e5448d55678f8b3b8c60a3d3a3f208888492eef1dcdc9b44dec3f84 WatchSource:0}: Error finding container 8cd0c7eb7e5448d55678f8b3b8c60a3d3a3f208888492eef1dcdc9b44dec3f84: Status 404 returned error can't find the container with id 8cd0c7eb7e5448d55678f8b3b8c60a3d3a3f208888492eef1dcdc9b44dec3f84 Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.153585 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s2qlq"] Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.791929 4939 generic.go:334] "Generic (PLEG): container finished" podID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerID="be1ceb8e338959d7d82b0d9c8dbd6627d8b0cc7286da501b153090951d662b41" exitCode=0 Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.792299 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerDied","Data":"be1ceb8e338959d7d82b0d9c8dbd6627d8b0cc7286da501b153090951d662b41"} Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.797877 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665d48dd94-86cx7" event={"ID":"e733f9e3-4be0-4265-9404-f88366733500","Type":"ContainerStarted","Data":"22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac"} Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.797917 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665d48dd94-86cx7" event={"ID":"e733f9e3-4be0-4265-9404-f88366733500","Type":"ContainerStarted","Data":"8cd0c7eb7e5448d55678f8b3b8c60a3d3a3f208888492eef1dcdc9b44dec3f84"} Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.805340 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" event={"ID":"8c83b398-2fa8-4862-a2fe-6f66e3200216","Type":"ContainerStarted","Data":"f78e7cc886f047f7aedc44acf77b3d8c4ca77f8eb5d449ce7bb2f8e9d7d8e229"} Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.809235 4939 generic.go:334] "Generic (PLEG): container finished" podID="29a706be-274f-404c-bbca-df92c297d83b" containerID="3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23" exitCode=0 Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.809283 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" event={"ID":"29a706be-274f-404c-bbca-df92c297d83b","Type":"ContainerDied","Data":"3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23"} Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.809299 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" event={"ID":"29a706be-274f-404c-bbca-df92c297d83b","Type":"ContainerStarted","Data":"d2c879f7a450848817f6a1fd3ff51b1914200f0c989c6e82a30663c66a3b6127"} Mar 18 16:01:09 crc kubenswrapper[4939]: I0318 16:01:09.823711 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" event={"ID":"86474b5e-6fc8-4810-a083-699878062ade","Type":"ContainerStarted","Data":"8587f5770eb542c99226063dfc98d2ea099eeea1e39c26e3cdce054d48eb7293"} Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.148105 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2695722d-fc7c-4965-ae10-f20e016051d3" 
path="/var/lib/kubelet/pods/2695722d-fc7c-4965-ae10-f20e016051d3/volumes" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.577807 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.690453 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-scripts\") pod \"ea13ebc5-52f6-4371-9261-92ebd07f0663\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.690590 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-sg-core-conf-yaml\") pod \"ea13ebc5-52f6-4371-9261-92ebd07f0663\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.690670 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-run-httpd\") pod \"ea13ebc5-52f6-4371-9261-92ebd07f0663\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.690724 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-log-httpd\") pod \"ea13ebc5-52f6-4371-9261-92ebd07f0663\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.690746 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-config-data\") pod \"ea13ebc5-52f6-4371-9261-92ebd07f0663\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.690802 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-combined-ca-bundle\") pod \"ea13ebc5-52f6-4371-9261-92ebd07f0663\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.690836 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9zhl\" (UniqueName: \"kubernetes.io/projected/ea13ebc5-52f6-4371-9261-92ebd07f0663-kube-api-access-j9zhl\") pod \"ea13ebc5-52f6-4371-9261-92ebd07f0663\" (UID: \"ea13ebc5-52f6-4371-9261-92ebd07f0663\") " Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.691654 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea13ebc5-52f6-4371-9261-92ebd07f0663" (UID: "ea13ebc5-52f6-4371-9261-92ebd07f0663"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.692076 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea13ebc5-52f6-4371-9261-92ebd07f0663" (UID: "ea13ebc5-52f6-4371-9261-92ebd07f0663"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.697686 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-scripts" (OuterVolumeSpecName: "scripts") pod "ea13ebc5-52f6-4371-9261-92ebd07f0663" (UID: "ea13ebc5-52f6-4371-9261-92ebd07f0663"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.715726 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea13ebc5-52f6-4371-9261-92ebd07f0663-kube-api-access-j9zhl" (OuterVolumeSpecName: "kube-api-access-j9zhl") pod "ea13ebc5-52f6-4371-9261-92ebd07f0663" (UID: "ea13ebc5-52f6-4371-9261-92ebd07f0663"). InnerVolumeSpecName "kube-api-access-j9zhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.736570 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea13ebc5-52f6-4371-9261-92ebd07f0663" (UID: "ea13ebc5-52f6-4371-9261-92ebd07f0663"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767048 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-74465b498-l8mz2"] Mar 18 16:01:10 crc kubenswrapper[4939]: E0318 16:01:10.767452 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="ceilometer-central-agent" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767468 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="ceilometer-central-agent" Mar 18 16:01:10 crc kubenswrapper[4939]: E0318 16:01:10.767480 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="sg-core" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767487 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="sg-core" Mar 18 16:01:10 crc kubenswrapper[4939]: E0318 16:01:10.767499 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="proxy-httpd" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767523 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="proxy-httpd" Mar 18 16:01:10 crc kubenswrapper[4939]: E0318 16:01:10.767537 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="ceilometer-notification-agent" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767544 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="ceilometer-notification-agent" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767698 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="proxy-httpd" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767714 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" 
containerName="ceilometer-notification-agent" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767724 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="sg-core" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.767734 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" containerName="ceilometer-central-agent" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.768836 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.770935 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.771183 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.794635 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.794671 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.794685 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea13ebc5-52f6-4371-9261-92ebd07f0663-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.794695 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9zhl\" (UniqueName: \"kubernetes.io/projected/ea13ebc5-52f6-4371-9261-92ebd07f0663-kube-api-access-j9zhl\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.794707 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.804613 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74465b498-l8mz2"] Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.853050 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea13ebc5-52f6-4371-9261-92ebd07f0663" (UID: "ea13ebc5-52f6-4371-9261-92ebd07f0663"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.858797 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665d48dd94-86cx7" event={"ID":"e733f9e3-4be0-4265-9404-f88366733500","Type":"ContainerStarted","Data":"73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e"} Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.859663 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.859697 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.861945 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" event={"ID":"29a706be-274f-404c-bbca-df92c297d83b","Type":"ContainerStarted","Data":"087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a"} Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.862328 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.866748 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea13ebc5-52f6-4371-9261-92ebd07f0663","Type":"ContainerDied","Data":"e3a9ed5e21e7fc313dbcb5d5cb0946fa68c34884d2b5d6b093e1e03958b33699"} Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.866790 4939 scope.go:117] "RemoveContainer" containerID="31fe77a3dc7834d62a3db951e231c6a4d499f91b668eada5ae33113327064985" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.866921 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.872071 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-config-data" (OuterVolumeSpecName: "config-data") pod "ea13ebc5-52f6-4371-9261-92ebd07f0663" (UID: "ea13ebc5-52f6-4371-9261-92ebd07f0663"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898155 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42a70df8-1617-448d-9495-5aa55d8b97fb-logs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898216 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-456r8\" (UniqueName: \"kubernetes.io/projected/42a70df8-1617-448d-9495-5aa55d8b97fb-kube-api-access-456r8\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898276 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-combined-ca-bundle\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898309 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-public-tls-certs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898344 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-internal-tls-certs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898383 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898415 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data-custom\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898478 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.898492 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea13ebc5-52f6-4371-9261-92ebd07f0663-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.915475 4939 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" podStartSLOduration=3.915451174 podStartE2EDuration="3.915451174s" podCreationTimestamp="2026-03-18 16:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:10.908781261 +0000 UTC m=+1435.507968902" watchObservedRunningTime="2026-03-18 16:01:10.915451174 +0000 UTC m=+1435.514638795" Mar 18 16:01:10 crc kubenswrapper[4939]: I0318 16:01:10.916425 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-665d48dd94-86cx7" podStartSLOduration=2.916418802 podStartE2EDuration="2.916418802s" podCreationTimestamp="2026-03-18 16:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:10.883117168 +0000 UTC m=+1435.482304789" watchObservedRunningTime="2026-03-18 16:01:10.916418802 +0000 UTC m=+1435.515606423" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.000444 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42a70df8-1617-448d-9495-5aa55d8b97fb-logs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.000526 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-456r8\" (UniqueName: \"kubernetes.io/projected/42a70df8-1617-448d-9495-5aa55d8b97fb-kube-api-access-456r8\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.000611 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-combined-ca-bundle\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.000644 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-public-tls-certs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.000692 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-internal-tls-certs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.000738 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.000768 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data-custom\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.001945 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42a70df8-1617-448d-9495-5aa55d8b97fb-logs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.004091 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-combined-ca-bundle\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.004680 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-public-tls-certs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.005100 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-internal-tls-certs\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.006783 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.007631 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data-custom\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.020357 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-456r8\" (UniqueName: \"kubernetes.io/projected/42a70df8-1617-448d-9495-5aa55d8b97fb-kube-api-access-456r8\") pod \"barbican-api-74465b498-l8mz2\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.119986 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.201382 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.214887 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.254257 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.256937 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.260587 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.264486 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.265645 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.406709 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzjn4\" (UniqueName: \"kubernetes.io/projected/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-kube-api-access-fzjn4\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.406755 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.406795 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.406828 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-scripts\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.406853 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-config-data\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.406885 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-log-httpd\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.406937 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-run-httpd\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.508902 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzjn4\" (UniqueName: \"kubernetes.io/projected/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-kube-api-access-fzjn4\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.508941 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.508979 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.509006 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-scripts\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.509027 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-config-data\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.509056 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-log-httpd\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.509108 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-run-httpd\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.510128 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-log-httpd\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.510303 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-run-httpd\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.514698 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-scripts\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.516310 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.517195 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.518352 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-config-data\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.524102 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzjn4\" (UniqueName: \"kubernetes.io/projected/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-kube-api-access-fzjn4\") pod \"ceilometer-0\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") " pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.659969 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.912442 4939 scope.go:117] "RemoveContainer" containerID="084364a45c6c130326221d639d69d9654204e292b1eb519e3a53f951ecd492c0" Mar 18 16:01:11 crc kubenswrapper[4939]: I0318 16:01:11.978063 4939 scope.go:117] "RemoveContainer" containerID="be1ceb8e338959d7d82b0d9c8dbd6627d8b0cc7286da501b153090951d662b41" Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.117151 4939 scope.go:117] "RemoveContainer" containerID="3db423421ad358c3c52cd4a61542661b8160f11149fa834c97411881e84c2a42" Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.172838 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea13ebc5-52f6-4371-9261-92ebd07f0663" path="/var/lib/kubelet/pods/ea13ebc5-52f6-4371-9261-92ebd07f0663/volumes" Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.480767 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.489696 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74465b498-l8mz2"] Mar 18 16:01:12 crc kubenswrapper[4939]: W0318 16:01:12.498138 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc915f86b_8ed2_4d9c_8185_4f9a2a5f5f69.slice/crio-5a50f2dc74864b787341c8d901c66456f19a5a1a625cc1c7fd73ab333113cefe WatchSource:0}: Error finding container 5a50f2dc74864b787341c8d901c66456f19a5a1a625cc1c7fd73ab333113cefe: Status 404 returned error can't find the container with id 5a50f2dc74864b787341c8d901c66456f19a5a1a625cc1c7fd73ab333113cefe Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.894019 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74465b498-l8mz2" 
event={"ID":"42a70df8-1617-448d-9495-5aa55d8b97fb","Type":"ContainerStarted","Data":"c14895b390882b2c1923b6999a196557393e0c01f808b080026c9ef7f7e1d1fd"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.894071 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74465b498-l8mz2" event={"ID":"42a70df8-1617-448d-9495-5aa55d8b97fb","Type":"ContainerStarted","Data":"c8f94483b5054ecb13ba070dc607cd5aefe23c9ba71716aa8ae0e310aed70d7c"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.894087 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74465b498-l8mz2" event={"ID":"42a70df8-1617-448d-9495-5aa55d8b97fb","Type":"ContainerStarted","Data":"c76fec3e2cccd03cb051ff2225ce195ca2f461423faa05bec5aa9aa4f34355e1"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.894247 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.894450 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.898915 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" event={"ID":"8c83b398-2fa8-4862-a2fe-6f66e3200216","Type":"ContainerStarted","Data":"e02a12f9caf7aff3ab019570b585d66908b0f4cfe93a1f0a694c9e99cffe1bfe"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.898957 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" event={"ID":"8c83b398-2fa8-4862-a2fe-6f66e3200216","Type":"ContainerStarted","Data":"e96d96e17fe2569029cec18b64c447837c38db0a15c1372e48c2e74d08a2fd50"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.900616 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerStarted","Data":"5a50f2dc74864b787341c8d901c66456f19a5a1a625cc1c7fd73ab333113cefe"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.902637 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" event={"ID":"86474b5e-6fc8-4810-a083-699878062ade","Type":"ContainerStarted","Data":"50ab95e7c021b4416c51b15d476a07a0ef51bd8ab440f9053107700d4fa85ea2"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.902829 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" event={"ID":"86474b5e-6fc8-4810-a083-699878062ade","Type":"ContainerStarted","Data":"944a706537fc625973ddcc25ce21c69f44e3b1c0a43d6217239bde407bac3b36"} Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.927240 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-74465b498-l8mz2" podStartSLOduration=2.927221859 podStartE2EDuration="2.927221859s" podCreationTimestamp="2026-03-18 16:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:12.910801889 +0000 UTC m=+1437.509989510" watchObservedRunningTime="2026-03-18 16:01:12.927221859 +0000 UTC m=+1437.526409490" Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.953843 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" podStartSLOduration=2.861596191 
podStartE2EDuration="5.95382441s" podCreationTimestamp="2026-03-18 16:01:07 +0000 UTC" firstStartedPulling="2026-03-18 16:01:08.913004673 +0000 UTC m=+1433.512192294" lastFinishedPulling="2026-03-18 16:01:12.005232892 +0000 UTC m=+1436.604420513" observedRunningTime="2026-03-18 16:01:12.938012425 +0000 UTC m=+1437.537200066" watchObservedRunningTime="2026-03-18 16:01:12.95382441 +0000 UTC m=+1437.553012031" Mar 18 16:01:12 crc kubenswrapper[4939]: I0318 16:01:12.969880 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" podStartSLOduration=2.991336349 podStartE2EDuration="5.969865827s" podCreationTimestamp="2026-03-18 16:01:07 +0000 UTC" firstStartedPulling="2026-03-18 16:01:09.025733627 +0000 UTC m=+1433.624921248" lastFinishedPulling="2026-03-18 16:01:12.004263105 +0000 UTC m=+1436.603450726" observedRunningTime="2026-03-18 16:01:12.958750099 +0000 UTC m=+1437.557937720" watchObservedRunningTime="2026-03-18 16:01:12.969865827 +0000 UTC m=+1437.569053448" Mar 18 16:01:13 crc kubenswrapper[4939]: I0318 16:01:13.955334 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerStarted","Data":"2e3d2425aeb8f6c2bb47c43db4623498bf75e4e1e20f1c25eba67d5a53834933"} Mar 18 16:01:13 crc kubenswrapper[4939]: I0318 16:01:13.963245 4939 generic.go:334] "Generic (PLEG): container finished" podID="ed545871-ed70-4d38-830a-8a6131455769" containerID="703e1a3bacf7c889b4eaa030926ad758deed0401540052152182a6de21a167bc" exitCode=0 Mar 18 16:01:13 crc kubenswrapper[4939]: I0318 16:01:13.963328 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkn72" event={"ID":"ed545871-ed70-4d38-830a-8a6131455769","Type":"ContainerDied","Data":"703e1a3bacf7c889b4eaa030926ad758deed0401540052152182a6de21a167bc"} Mar 18 16:01:14 crc kubenswrapper[4939]: I0318 16:01:14.977664 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerStarted","Data":"ebaf37379702245950f544a2b4c16f11d1d7778ab4ef76c5cd3a1dff9069948b"} Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.360820 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qkn72" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.495977 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-combined-ca-bundle\") pod \"ed545871-ed70-4d38-830a-8a6131455769\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.496033 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed545871-ed70-4d38-830a-8a6131455769-etc-machine-id\") pod \"ed545871-ed70-4d38-830a-8a6131455769\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.496110 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-db-sync-config-data\") pod \"ed545871-ed70-4d38-830a-8a6131455769\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.496164 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-scripts\") pod \"ed545871-ed70-4d38-830a-8a6131455769\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.496211 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2ffv\" (UniqueName: \"kubernetes.io/projected/ed545871-ed70-4d38-830a-8a6131455769-kube-api-access-k2ffv\") pod \"ed545871-ed70-4d38-830a-8a6131455769\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.496250 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-config-data\") pod \"ed545871-ed70-4d38-830a-8a6131455769\" (UID: \"ed545871-ed70-4d38-830a-8a6131455769\") " Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.497285 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed545871-ed70-4d38-830a-8a6131455769-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ed545871-ed70-4d38-830a-8a6131455769" (UID: "ed545871-ed70-4d38-830a-8a6131455769"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.504028 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-scripts" (OuterVolumeSpecName: "scripts") pod "ed545871-ed70-4d38-830a-8a6131455769" (UID: "ed545871-ed70-4d38-830a-8a6131455769"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.504082 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed545871-ed70-4d38-830a-8a6131455769-kube-api-access-k2ffv" (OuterVolumeSpecName: "kube-api-access-k2ffv") pod "ed545871-ed70-4d38-830a-8a6131455769" (UID: "ed545871-ed70-4d38-830a-8a6131455769"). InnerVolumeSpecName "kube-api-access-k2ffv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.516375 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ed545871-ed70-4d38-830a-8a6131455769" (UID: "ed545871-ed70-4d38-830a-8a6131455769"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.539197 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed545871-ed70-4d38-830a-8a6131455769" (UID: "ed545871-ed70-4d38-830a-8a6131455769"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.562369 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-config-data" (OuterVolumeSpecName: "config-data") pod "ed545871-ed70-4d38-830a-8a6131455769" (UID: "ed545871-ed70-4d38-830a-8a6131455769"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.598705 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2ffv\" (UniqueName: \"kubernetes.io/projected/ed545871-ed70-4d38-830a-8a6131455769-kube-api-access-k2ffv\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.598979 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.599111 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.599215 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed545871-ed70-4d38-830a-8a6131455769-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.599310 4939 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:15 crc kubenswrapper[4939]: I0318 16:01:15.599411 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed545871-ed70-4d38-830a-8a6131455769-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.003852 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qkn72" event={"ID":"ed545871-ed70-4d38-830a-8a6131455769","Type":"ContainerDied","Data":"2485534b9efbd45946c75134fd955acfc10aeb5318dfb04e27e2538e9fc40655"} Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.003905 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2485534b9efbd45946c75134fd955acfc10aeb5318dfb04e27e2538e9fc40655" Mar 18 16:01:16 crc kubenswrapper[4939]: 
I0318 16:01:16.004012 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qkn72" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.008922 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerStarted","Data":"ae0415a6569a04e1aba064c17f686aa5e3a8470d986e2ac27e1fdff1e7132a29"} Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.276217 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:16 crc kubenswrapper[4939]: E0318 16:01:16.276971 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed545871-ed70-4d38-830a-8a6131455769" containerName="cinder-db-sync" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.276988 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed545871-ed70-4d38-830a-8a6131455769" containerName="cinder-db-sync" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.277175 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed545871-ed70-4d38-830a-8a6131455769" containerName="cinder-db-sync" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.281306 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.286800 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.286882 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.287039 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.287222 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zdtsm" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.299437 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.363918 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s2qlq"] Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.364155 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" podUID="29a706be-274f-404c-bbca-df92c297d83b" containerName="dnsmasq-dns" containerID="cri-o://087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a" gracePeriod=10 Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.366439 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.409060 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lcsjd"] Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.411025 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.423529 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75lvc\" (UniqueName: \"kubernetes.io/projected/ca452f0b-0ab2-4e07-b061-812392f27049-kube-api-access-75lvc\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.423587 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca452f0b-0ab2-4e07-b061-812392f27049-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.423646 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.423700 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-scripts\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.423734 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.423758 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.443163 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lcsjd"] Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.454375 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.456216 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.458111 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.488155 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.524913 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-scripts\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.524974 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525004 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525025 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525050 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525076 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525110 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-config\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525126 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5w5\" (UniqueName: \"kubernetes.io/projected/d7a2fd0c-9437-43fb-af24-c971bb657d12-kube-api-access-jw5w5\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525146 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75lvc\" (UniqueName: \"kubernetes.io/projected/ca452f0b-0ab2-4e07-b061-812392f27049-kube-api-access-75lvc\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525166 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca452f0b-0ab2-4e07-b061-812392f27049-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525185 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.525218 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.527654 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca452f0b-0ab2-4e07-b061-812392f27049-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.533036 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-scripts\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.543305 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.544044 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.544052 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data\") pod \"cinder-scheduler-0\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.546195 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75lvc\" (UniqueName: \"kubernetes.io/projected/ca452f0b-0ab2-4e07-b061-812392f27049-kube-api-access-75lvc\") pod \"cinder-scheduler-0\" 
(UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.614866 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.627889 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.627938 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-scripts\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.627979 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85f29b4f-1e64-4443-90f2-02163b49bcd0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628001 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628036 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data-custom\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628055 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f29b4f-1e64-4443-90f2-02163b49bcd0-logs\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628072 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628104 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628127 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628151 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628165 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpknf\" (UniqueName: \"kubernetes.io/projected/85f29b4f-1e64-4443-90f2-02163b49bcd0-kube-api-access-cpknf\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628199 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-config\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.628216 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5w5\" (UniqueName: \"kubernetes.io/projected/d7a2fd0c-9437-43fb-af24-c971bb657d12-kube-api-access-jw5w5\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.629643 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.629888 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.630690 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-config\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.631274 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.631589 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" 
Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.652364 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5w5\" (UniqueName: \"kubernetes.io/projected/d7a2fd0c-9437-43fb-af24-c971bb657d12-kube-api-access-jw5w5\") pod \"dnsmasq-dns-5c9776ccc5-lcsjd\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.730185 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85f29b4f-1e64-4443-90f2-02163b49bcd0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.730246 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.730291 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data-custom\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.730313 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f29b4f-1e64-4443-90f2-02163b49bcd0-logs\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.730332 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.730401 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpknf\" (UniqueName: \"kubernetes.io/projected/85f29b4f-1e64-4443-90f2-02163b49bcd0-kube-api-access-cpknf\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.730481 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-scripts\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.731282 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85f29b4f-1e64-4443-90f2-02163b49bcd0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.732748 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f29b4f-1e64-4443-90f2-02163b49bcd0-logs\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 
18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.737089 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data-custom\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.737126 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-scripts\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.740467 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.750743 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.753705 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpknf\" (UniqueName: \"kubernetes.io/projected/85f29b4f-1e64-4443-90f2-02163b49bcd0-kube-api-access-cpknf\") pod \"cinder-api-0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.842175 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.915064 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.927162 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.932959 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-config\") pod \"29a706be-274f-404c-bbca-df92c297d83b\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.933032 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfqfz\" (UniqueName: \"kubernetes.io/projected/29a706be-274f-404c-bbca-df92c297d83b-kube-api-access-mfqfz\") pod \"29a706be-274f-404c-bbca-df92c297d83b\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.933096 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-swift-storage-0\") pod \"29a706be-274f-404c-bbca-df92c297d83b\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.933196 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-nb\") pod \"29a706be-274f-404c-bbca-df92c297d83b\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.933234 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-sb\") pod \"29a706be-274f-404c-bbca-df92c297d83b\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.933266 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-svc\") pod \"29a706be-274f-404c-bbca-df92c297d83b\" (UID: \"29a706be-274f-404c-bbca-df92c297d83b\") " Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.952721 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a706be-274f-404c-bbca-df92c297d83b-kube-api-access-mfqfz" (OuterVolumeSpecName: "kube-api-access-mfqfz") pod "29a706be-274f-404c-bbca-df92c297d83b" (UID: "29a706be-274f-404c-bbca-df92c297d83b"). InnerVolumeSpecName "kube-api-access-mfqfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.997317 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29a706be-274f-404c-bbca-df92c297d83b" (UID: "29a706be-274f-404c-bbca-df92c297d83b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:16 crc kubenswrapper[4939]: I0318 16:01:16.998273 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29a706be-274f-404c-bbca-df92c297d83b" (UID: "29a706be-274f-404c-bbca-df92c297d83b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.000963 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-config" (OuterVolumeSpecName: "config") pod "29a706be-274f-404c-bbca-df92c297d83b" (UID: "29a706be-274f-404c-bbca-df92c297d83b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.003126 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29a706be-274f-404c-bbca-df92c297d83b" (UID: "29a706be-274f-404c-bbca-df92c297d83b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.016158 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29a706be-274f-404c-bbca-df92c297d83b" (UID: "29a706be-274f-404c-bbca-df92c297d83b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.024198 4939 generic.go:334] "Generic (PLEG): container finished" podID="29a706be-274f-404c-bbca-df92c297d83b" containerID="087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a" exitCode=0 Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.024240 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" event={"ID":"29a706be-274f-404c-bbca-df92c297d83b","Type":"ContainerDied","Data":"087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a"} Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.024267 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" event={"ID":"29a706be-274f-404c-bbca-df92c297d83b","Type":"ContainerDied","Data":"d2c879f7a450848817f6a1fd3ff51b1914200f0c989c6e82a30663c66a3b6127"} Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.024284 4939 scope.go:117] "RemoveContainer" containerID="087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.024413 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s2qlq" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.034659 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.034677 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.034687 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.034695 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.034703 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfqfz\" (UniqueName: \"kubernetes.io/projected/29a706be-274f-404c-bbca-df92c297d83b-kube-api-access-mfqfz\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.034712 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29a706be-274f-404c-bbca-df92c297d83b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.118710 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s2qlq"] Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.152225 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s2qlq"] Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.155165 4939 scope.go:117] "RemoveContainer" containerID="3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23" Mar 18 16:01:17 crc kubenswrapper[4939]: W0318 16:01:17.167323 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca452f0b_0ab2_4e07_b061_812392f27049.slice/crio-21b455990614ceec28cfb1dbfe64cc69a51520c514b2a9ee363fdde4a42fadfc WatchSource:0}: Error finding container 21b455990614ceec28cfb1dbfe64cc69a51520c514b2a9ee363fdde4a42fadfc: Status 404 returned error can't find the container with id 21b455990614ceec28cfb1dbfe64cc69a51520c514b2a9ee363fdde4a42fadfc Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.183126 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.244049 4939 scope.go:117] "RemoveContainer" containerID="087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a" Mar 18 16:01:17 crc kubenswrapper[4939]: E0318 16:01:17.244890 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a\": container with ID starting with 087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a not found: ID does not exist" containerID="087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a" Mar 18 16:01:17 crc 
kubenswrapper[4939]: I0318 16:01:17.244933 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a"} err="failed to get container status \"087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a\": rpc error: code = NotFound desc = could not find container \"087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a\": container with ID starting with 087e28af36c793f7f25873d7440017e9a6160305a09bd98e37158bdd7e15770a not found: ID does not exist" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.244955 4939 scope.go:117] "RemoveContainer" containerID="3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23" Mar 18 16:01:17 crc kubenswrapper[4939]: E0318 16:01:17.245699 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23\": container with ID starting with 3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23 not found: ID does not exist" containerID="3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.245726 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23"} err="failed to get container status \"3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23\": rpc error: code = NotFound desc = could not find container \"3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23\": container with ID starting with 3765cba682961b237a505a7ef2820c6b3c28639ba48081072cd126ed64633e23 not found: ID does not exist" Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.448854 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lcsjd"] Mar 18 16:01:17 crc kubenswrapper[4939]: I0318 16:01:17.468886 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:18 crc kubenswrapper[4939]: I0318 16:01:18.057064 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca452f0b-0ab2-4e07-b061-812392f27049","Type":"ContainerStarted","Data":"21b455990614ceec28cfb1dbfe64cc69a51520c514b2a9ee363fdde4a42fadfc"} Mar 18 16:01:18 crc kubenswrapper[4939]: I0318 16:01:18.059027 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85f29b4f-1e64-4443-90f2-02163b49bcd0","Type":"ContainerStarted","Data":"eeedb0814b51f0fcda714ea70ac9069f9037b4d9365c59e0d0e3ce823d4631c7"} Mar 18 16:01:18 crc kubenswrapper[4939]: I0318 16:01:18.073065 4939 generic.go:334] "Generic (PLEG): container finished" podID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerID="6951caea9ceff1764b6a6267761009eeb4fe1ee82aff191f3f729bab18efb226" exitCode=0 Mar 18 16:01:18 crc kubenswrapper[4939]: I0318 16:01:18.073118 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" event={"ID":"d7a2fd0c-9437-43fb-af24-c971bb657d12","Type":"ContainerDied","Data":"6951caea9ceff1764b6a6267761009eeb4fe1ee82aff191f3f729bab18efb226"} Mar 18 16:01:18 crc kubenswrapper[4939]: I0318 16:01:18.073175 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" 
event={"ID":"d7a2fd0c-9437-43fb-af24-c971bb657d12","Type":"ContainerStarted","Data":"0797f479740a3dd2104d7e04289d3a3dcbc3b21d25648a91602438446f25aa18"} Mar 18 16:01:18 crc kubenswrapper[4939]: I0318 16:01:18.145682 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a706be-274f-404c-bbca-df92c297d83b" path="/var/lib/kubelet/pods/29a706be-274f-404c-bbca-df92c297d83b/volumes" Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.047330 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.094771 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" event={"ID":"d7a2fd0c-9437-43fb-af24-c971bb657d12","Type":"ContainerStarted","Data":"3766a4508c68813eb4e6a699ecc370667f5af46a13bf4ef93364692bcdd83176"} Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.094918 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.101086 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85f29b4f-1e64-4443-90f2-02163b49bcd0","Type":"ContainerStarted","Data":"a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370"} Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.101142 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85f29b4f-1e64-4443-90f2-02163b49bcd0","Type":"ContainerStarted","Data":"3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372"} Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.101183 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.101197 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api-log" containerID="cri-o://3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372" gracePeriod=30 Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.101227 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api" containerID="cri-o://a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370" gracePeriod=30 Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.108106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerStarted","Data":"39d3607a1f0ebf25806ce9da071107a08a84206d70cb2f9d64281a123d69a63d"} Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.108332 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.142529 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" podStartSLOduration=3.142489233 podStartE2EDuration="3.142489233s" podCreationTimestamp="2026-03-18 16:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:19.12190988 +0000 UTC m=+1443.721097501" watchObservedRunningTime="2026-03-18 16:01:19.142489233 +0000 UTC m=+1443.741676854" Mar 18 16:01:19 crc 
kubenswrapper[4939]: I0318 16:01:19.163276 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.163256897 podStartE2EDuration="3.163256897s" podCreationTimestamp="2026-03-18 16:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:19.148327659 +0000 UTC m=+1443.747515280" watchObservedRunningTime="2026-03-18 16:01:19.163256897 +0000 UTC m=+1443.762444538" Mar 18 16:01:19 crc kubenswrapper[4939]: I0318 16:01:19.169952 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.874314097 podStartE2EDuration="8.16992769s" podCreationTimestamp="2026-03-18 16:01:11 +0000 UTC" firstStartedPulling="2026-03-18 16:01:12.500339547 +0000 UTC m=+1437.099527168" lastFinishedPulling="2026-03-18 16:01:17.79595314 +0000 UTC m=+1442.395140761" observedRunningTime="2026-03-18 16:01:19.169367296 +0000 UTC m=+1443.768554917" watchObservedRunningTime="2026-03-18 16:01:19.16992769 +0000 UTC m=+1443.769115311" Mar 18 16:01:20 crc kubenswrapper[4939]: I0318 16:01:20.118684 4939 generic.go:334] "Generic (PLEG): container finished" podID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerID="3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372" exitCode=143 Mar 18 16:01:20 crc kubenswrapper[4939]: I0318 16:01:20.119232 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85f29b4f-1e64-4443-90f2-02163b49bcd0","Type":"ContainerDied","Data":"3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372"} Mar 18 16:01:20 crc kubenswrapper[4939]: I0318 16:01:20.121460 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca452f0b-0ab2-4e07-b061-812392f27049","Type":"ContainerStarted","Data":"bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45"} Mar 18 16:01:20 crc kubenswrapper[4939]: I0318 16:01:20.121603 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca452f0b-0ab2-4e07-b061-812392f27049","Type":"ContainerStarted","Data":"83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47"} Mar 18 16:01:20 crc kubenswrapper[4939]: I0318 16:01:20.139987 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.098474042 podStartE2EDuration="4.139967688s" podCreationTimestamp="2026-03-18 16:01:16 +0000 UTC" firstStartedPulling="2026-03-18 16:01:17.189109389 +0000 UTC m=+1441.788297010" lastFinishedPulling="2026-03-18 16:01:18.230603035 +0000 UTC m=+1442.829790656" observedRunningTime="2026-03-18 16:01:20.137497528 +0000 UTC m=+1444.736685159" watchObservedRunningTime="2026-03-18 16:01:20.139967688 +0000 UTC m=+1444.739155309" Mar 18 16:01:20 crc kubenswrapper[4939]: I0318 16:01:20.432353 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:20 crc kubenswrapper[4939]: I0318 16:01:20.523258 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:21 crc kubenswrapper[4939]: I0318 16:01:21.616388 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 16:01:22 crc kubenswrapper[4939]: I0318 16:01:22.586301 4939 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:22 crc kubenswrapper[4939]: I0318 16:01:22.649312 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:01:22 crc kubenswrapper[4939]: I0318 16:01:22.764125 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-665d48dd94-86cx7"] Mar 18 16:01:22 crc kubenswrapper[4939]: I0318 16:01:22.764314 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-665d48dd94-86cx7" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api-log" containerID="cri-o://22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac" gracePeriod=30 Mar 18 16:01:22 crc kubenswrapper[4939]: I0318 16:01:22.764771 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-665d48dd94-86cx7" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api" containerID="cri-o://73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e" gracePeriod=30 Mar 18 16:01:23 crc kubenswrapper[4939]: I0318 16:01:23.156679 4939 generic.go:334] "Generic (PLEG): container finished" podID="e733f9e3-4be0-4265-9404-f88366733500" containerID="22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac" exitCode=143 Mar 18 16:01:23 crc kubenswrapper[4939]: I0318 16:01:23.156771 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665d48dd94-86cx7" event={"ID":"e733f9e3-4be0-4265-9404-f88366733500","Type":"ContainerDied","Data":"22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac"} Mar 18 16:01:23 crc kubenswrapper[4939]: I0318 16:01:23.687439 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:01:23 crc kubenswrapper[4939]: I0318 16:01:23.687531 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.228328 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-665d48dd94-86cx7" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:48492->10.217.0.166:9311: read: connection reset by peer" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.228961 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-665d48dd94-86cx7" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:48484->10.217.0.166:9311: read: connection reset by peer" Mar 18 16:01:26 crc kubenswrapper[4939]: E0318 16:01:26.313683 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode733f9e3_4be0_4265_9404_f88366733500.slice/crio-conmon-73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.722592 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.853004 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.853649 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data-custom\") pod \"e733f9e3-4be0-4265-9404-f88366733500\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.853723 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgjrr\" (UniqueName: \"kubernetes.io/projected/e733f9e3-4be0-4265-9404-f88366733500-kube-api-access-mgjrr\") pod \"e733f9e3-4be0-4265-9404-f88366733500\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.853751 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e733f9e3-4be0-4265-9404-f88366733500-logs\") pod \"e733f9e3-4be0-4265-9404-f88366733500\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.853842 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data\") pod \"e733f9e3-4be0-4265-9404-f88366733500\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.854261 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-combined-ca-bundle\") pod \"e733f9e3-4be0-4265-9404-f88366733500\" (UID: \"e733f9e3-4be0-4265-9404-f88366733500\") " Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.855816 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e733f9e3-4be0-4265-9404-f88366733500-logs" (OuterVolumeSpecName: "logs") pod "e733f9e3-4be0-4265-9404-f88366733500" (UID: "e733f9e3-4be0-4265-9404-f88366733500"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.864835 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e733f9e3-4be0-4265-9404-f88366733500" (UID: "e733f9e3-4be0-4265-9404-f88366733500"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.865170 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e733f9e3-4be0-4265-9404-f88366733500-kube-api-access-mgjrr" (OuterVolumeSpecName: "kube-api-access-mgjrr") pod "e733f9e3-4be0-4265-9404-f88366733500" (UID: "e733f9e3-4be0-4265-9404-f88366733500"). InnerVolumeSpecName "kube-api-access-mgjrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.901923 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e733f9e3-4be0-4265-9404-f88366733500" (UID: "e733f9e3-4be0-4265-9404-f88366733500"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.902198 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.917633 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.921962 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.925550 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data" (OuterVolumeSpecName: "config-data") pod "e733f9e3-4be0-4265-9404-f88366733500" (UID: "e733f9e3-4be0-4265-9404-f88366733500"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.931798 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.958571 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.958600 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.958611 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e733f9e3-4be0-4265-9404-f88366733500-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.958619 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgjrr\" (UniqueName: \"kubernetes.io/projected/e733f9e3-4be0-4265-9404-f88366733500-kube-api-access-mgjrr\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.958631 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e733f9e3-4be0-4265-9404-f88366733500-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.994000 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8shzv"] Mar 18 16:01:26 crc kubenswrapper[4939]: I0318 16:01:26.994343 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerName="dnsmasq-dns" containerID="cri-o://8a901e789f36af56195efb10612dc4cf8cb42d65c23e4bf1d33387c4f7fe29e2" gracePeriod=10 Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.198804 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: connect: connection refused" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.204057 4939 generic.go:334] "Generic (PLEG): container finished" podID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerID="8a901e789f36af56195efb10612dc4cf8cb42d65c23e4bf1d33387c4f7fe29e2" exitCode=0 Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.204120 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" event={"ID":"d234e54d-b414-46be-8668-a3eeb33c7f03","Type":"ContainerDied","Data":"8a901e789f36af56195efb10612dc4cf8cb42d65c23e4bf1d33387c4f7fe29e2"} Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.205779 4939 generic.go:334] "Generic (PLEG): container finished" podID="e733f9e3-4be0-4265-9404-f88366733500" containerID="73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e" exitCode=0 Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.206694 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-665d48dd94-86cx7" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.213500 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665d48dd94-86cx7" event={"ID":"e733f9e3-4be0-4265-9404-f88366733500","Type":"ContainerDied","Data":"73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e"} Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.213577 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665d48dd94-86cx7" event={"ID":"e733f9e3-4be0-4265-9404-f88366733500","Type":"ContainerDied","Data":"8cd0c7eb7e5448d55678f8b3b8c60a3d3a3f208888492eef1dcdc9b44dec3f84"} Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.213593 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-576956754b-kspq2"] Mar 18 16:01:27 crc kubenswrapper[4939]: E0318 16:01:27.214006 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.214022 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api" Mar 18 16:01:27 crc kubenswrapper[4939]: E0318 16:01:27.214054 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a706be-274f-404c-bbca-df92c297d83b" containerName="init" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.214062 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a706be-274f-404c-bbca-df92c297d83b" containerName="init" Mar 18 16:01:27 crc kubenswrapper[4939]: E0318 16:01:27.214073 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api-log" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.214082 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api-log" Mar 18 16:01:27 crc kubenswrapper[4939]: E0318 16:01:27.214101 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a706be-274f-404c-bbca-df92c297d83b" containerName="dnsmasq-dns" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.214109 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a706be-274f-404c-bbca-df92c297d83b" containerName="dnsmasq-dns" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.214303 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.214312 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e733f9e3-4be0-4265-9404-f88366733500" containerName="barbican-api-log" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.214337 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a706be-274f-404c-bbca-df92c297d83b" containerName="dnsmasq-dns" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.215402 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-576956754b-kspq2" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.216070 4939 scope.go:117] "RemoveContainer" containerID="73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.217956 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="cinder-scheduler" containerID="cri-o://83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47" gracePeriod=30 Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.218130 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="probe" containerID="cri-o://bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45" gracePeriod=30 Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.232040 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-576956754b-kspq2"] Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.289239 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-665d48dd94-86cx7"] Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.289654 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7dcffbdd64-d9zm4" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.290958 4939 scope.go:117] "RemoveContainer" containerID="22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.297393 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-665d48dd94-86cx7"] Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.366725 4939 scope.go:117] "RemoveContainer" containerID="73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e" Mar 18 16:01:27 crc kubenswrapper[4939]: E0318 16:01:27.371616 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e\": container with ID starting with 73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e not found: ID does not exist" containerID="73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.371664 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e"} err="failed to get container status \"73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e\": rpc error: code = NotFound desc = could not find container \"73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e\": container with ID starting with 73f7908bc294f870e2dee509a7327de28b328e9c7df45bb417af05179b6c297e not found: ID does not exist" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.371689 4939 scope.go:117] "RemoveContainer" containerID="22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac" Mar 18 16:01:27 crc kubenswrapper[4939]: E0318 16:01:27.375139 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.375166 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac"} err="failed to get container status \"22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac\": rpc error: code = NotFound desc = could not find container \"22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac\": container with ID starting with 22a3d31d8d79744fc95b09b160f2662f3544d51de960873cbb68c3eaf8b923ac not found: ID does not exist"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.376781 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-config-data\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.376822 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-internal-tls-certs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.376851 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-combined-ca-bundle\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.376898 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-public-tls-certs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.376928 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48zb\" (UniqueName: \"kubernetes.io/projected/df7cba1f-8d56-47c9-8016-3184a1374386-kube-api-access-n48zb\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.376958 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-scripts\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.376992 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7cba1f-8d56-47c9-8016-3184a1374386-logs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.439628 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b8f86c468-qlp5n"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.479478 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7cba1f-8d56-47c9-8016-3184a1374386-logs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.479666 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-config-data\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.479699 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-internal-tls-certs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.479732 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-combined-ca-bundle\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.479782 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-public-tls-certs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.479823 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n48zb\" (UniqueName: \"kubernetes.io/projected/df7cba1f-8d56-47c9-8016-3184a1374386-kube-api-access-n48zb\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.479862 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-scripts\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.484287 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7cba1f-8d56-47c9-8016-3184a1374386-logs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.490205 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-public-tls-certs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.501688 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-config-data\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.506440 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-combined-ca-bundle\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.507780 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-scripts\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.507943 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-internal-tls-certs\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.531810 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n48zb\" (UniqueName: \"kubernetes.io/projected/df7cba1f-8d56-47c9-8016-3184a1374386-kube-api-access-n48zb\") pod \"placement-576956754b-kspq2\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.537300 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-576956754b-kspq2"
Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.651891 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8shzv"
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.786341 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-nb\") pod \"d234e54d-b414-46be-8668-a3eeb33c7f03\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.786394 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-config\") pod \"d234e54d-b414-46be-8668-a3eeb33c7f03\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.786551 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkwm5\" (UniqueName: \"kubernetes.io/projected/d234e54d-b414-46be-8668-a3eeb33c7f03-kube-api-access-xkwm5\") pod \"d234e54d-b414-46be-8668-a3eeb33c7f03\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.786619 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-swift-storage-0\") pod \"d234e54d-b414-46be-8668-a3eeb33c7f03\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.786635 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-svc\") pod \"d234e54d-b414-46be-8668-a3eeb33c7f03\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.786660 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-sb\") pod \"d234e54d-b414-46be-8668-a3eeb33c7f03\" (UID: \"d234e54d-b414-46be-8668-a3eeb33c7f03\") " Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.796639 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d234e54d-b414-46be-8668-a3eeb33c7f03-kube-api-access-xkwm5" (OuterVolumeSpecName: "kube-api-access-xkwm5") pod "d234e54d-b414-46be-8668-a3eeb33c7f03" (UID: "d234e54d-b414-46be-8668-a3eeb33c7f03"). InnerVolumeSpecName "kube-api-access-xkwm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.883379 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d234e54d-b414-46be-8668-a3eeb33c7f03" (UID: "d234e54d-b414-46be-8668-a3eeb33c7f03"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.893597 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.893630 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkwm5\" (UniqueName: \"kubernetes.io/projected/d234e54d-b414-46be-8668-a3eeb33c7f03-kube-api-access-xkwm5\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.897904 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d234e54d-b414-46be-8668-a3eeb33c7f03" (UID: "d234e54d-b414-46be-8668-a3eeb33c7f03"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.899031 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-config" (OuterVolumeSpecName: "config") pod "d234e54d-b414-46be-8668-a3eeb33c7f03" (UID: "d234e54d-b414-46be-8668-a3eeb33c7f03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.905884 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d234e54d-b414-46be-8668-a3eeb33c7f03" (UID: "d234e54d-b414-46be-8668-a3eeb33c7f03"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.916875 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d234e54d-b414-46be-8668-a3eeb33c7f03" (UID: "d234e54d-b414-46be-8668-a3eeb33c7f03"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.996185 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.996217 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.996228 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:27 crc kubenswrapper[4939]: I0318 16:01:27.996238 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d234e54d-b414-46be-8668-a3eeb33c7f03-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:28 crc kubenswrapper[4939]: W0318 16:01:28.140521 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf7cba1f_8d56_47c9_8016_3184a1374386.slice/crio-bf4890f5149a26c46143ec38e6c1c16ded9f83634e51531317a6a653ef4fa6fa WatchSource:0}: Error finding container bf4890f5149a26c46143ec38e6c1c16ded9f83634e51531317a6a653ef4fa6fa: Status 404 returned error can't find the container with id bf4890f5149a26c46143ec38e6c1c16ded9f83634e51531317a6a653ef4fa6fa Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.145081 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e733f9e3-4be0-4265-9404-f88366733500" path="/var/lib/kubelet/pods/e733f9e3-4be0-4265-9404-f88366733500/volumes" Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.145829 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-576956754b-kspq2"] Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.220366 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" event={"ID":"d234e54d-b414-46be-8668-a3eeb33c7f03","Type":"ContainerDied","Data":"0ed0c36e8bd9a5464868000f1c1cdf5e76284a717c0c9165495ff444adc10da4"} Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.220422 4939 scope.go:117] "RemoveContainer" containerID="8a901e789f36af56195efb10612dc4cf8cb42d65c23e4bf1d33387c4f7fe29e2" Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.220567 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8shzv" Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.226357 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca452f0b-0ab2-4e07-b061-812392f27049" containerID="bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45" exitCode=0 Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.226428 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca452f0b-0ab2-4e07-b061-812392f27049","Type":"ContainerDied","Data":"bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45"} Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.233449 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-576956754b-kspq2" event={"ID":"df7cba1f-8d56-47c9-8016-3184a1374386","Type":"ContainerStarted","Data":"bf4890f5149a26c46143ec38e6c1c16ded9f83634e51531317a6a653ef4fa6fa"} Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.249714 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8shzv"] Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.252322 4939 scope.go:117] "RemoveContainer" containerID="623d50fc40a8298c4707b45313dcbc936441c2236b9dad676567394d2a7fe1ff" Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.258688 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8shzv"] Mar 18 16:01:28 crc kubenswrapper[4939]: I0318 16:01:28.832143 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.244195 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-576956754b-kspq2" event={"ID":"df7cba1f-8d56-47c9-8016-3184a1374386","Type":"ContainerStarted","Data":"ee0d7673467ab34937d02099e23c0b13b6599f673de277c77448b47b6c8d53d7"} Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.244533 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-576956754b-kspq2" event={"ID":"df7cba1f-8d56-47c9-8016-3184a1374386","Type":"ContainerStarted","Data":"c4b1ae5bcdd7929e16516d94f1d8e93e3c240e0dba2fbbe3ab7b4b2d344bbbb5"} Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.244613 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-576956754b-kspq2" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.275326 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-576956754b-kspq2" podStartSLOduration=2.275306796 podStartE2EDuration="2.275306796s" podCreationTimestamp="2026-03-18 16:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:29.269198428 +0000 UTC m=+1453.868386049" watchObservedRunningTime="2026-03-18 16:01:29.275306796 +0000 UTC m=+1453.874494417" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.295465 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.688836 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 16:01:29 crc kubenswrapper[4939]: E0318 16:01:29.689284 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerName="dnsmasq-dns" Mar 18 16:01:29 crc kubenswrapper[4939]: 
I0318 16:01:29.689306 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerName="dnsmasq-dns" Mar 18 16:01:29 crc kubenswrapper[4939]: E0318 16:01:29.689331 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerName="init" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.689341 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerName="init" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.689696 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" containerName="dnsmasq-dns" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.690415 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.694008 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.694680 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nbxq4" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.699979 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.711032 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.838950 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.839102 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.839250 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config-secret\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.839299 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzjm\" (UniqueName: \"kubernetes.io/projected/08510a7a-ad57-44a4-9089-7558c213284b-kube-api-access-mmzjm\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.940644 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config-secret\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: 
I0318 16:01:29.940706 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzjm\" (UniqueName: \"kubernetes.io/projected/08510a7a-ad57-44a4-9089-7558c213284b-kube-api-access-mmzjm\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.940809 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.940850 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.941713 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.945403 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config-secret\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.945494 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:29 crc kubenswrapper[4939]: I0318 16:01:29.955453 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzjm\" (UniqueName: \"kubernetes.io/projected/08510a7a-ad57-44a4-9089-7558c213284b-kube-api-access-mmzjm\") pod \"openstackclient\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " pod="openstack/openstackclient" Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.032966 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.146733 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d234e54d-b414-46be-8668-a3eeb33c7f03" path="/var/lib/kubelet/pods/d234e54d-b414-46be-8668-a3eeb33c7f03/volumes" Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.261127 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-576956754b-kspq2" Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.557609 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.572464 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d7467855-ml675" Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.646300 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8f86c468-qlp5n"] Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.646859 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b8f86c468-qlp5n" podUID="73e2003e-c000-4135-9c1e-556cae29d832" containerName="neutron-api" containerID="cri-o://75cb3253f0a799e23784270d1f152480e5a674daa02ecb8e6df7c31fc17d5e37" gracePeriod=30 Mar 18 16:01:30 crc kubenswrapper[4939]: I0318 16:01:30.647807 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b8f86c468-qlp5n" podUID="73e2003e-c000-4135-9c1e-556cae29d832" containerName="neutron-httpd" containerID="cri-o://b23c9f2c3b2d19c67b54836359307e59dd6daff6f0021c175f1c13875fd01dc3" gracePeriod=30 Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.128047 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265060 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-scripts\") pod \"ca452f0b-0ab2-4e07-b061-812392f27049\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265208 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca452f0b-0ab2-4e07-b061-812392f27049-etc-machine-id\") pod \"ca452f0b-0ab2-4e07-b061-812392f27049\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265241 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data\") pod \"ca452f0b-0ab2-4e07-b061-812392f27049\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265266 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data-custom\") pod \"ca452f0b-0ab2-4e07-b061-812392f27049\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75lvc\" (UniqueName: \"kubernetes.io/projected/ca452f0b-0ab2-4e07-b061-812392f27049-kube-api-access-75lvc\") pod \"ca452f0b-0ab2-4e07-b061-812392f27049\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265372 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-combined-ca-bundle\") pod \"ca452f0b-0ab2-4e07-b061-812392f27049\" (UID: \"ca452f0b-0ab2-4e07-b061-812392f27049\") " Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265371 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca452f0b-0ab2-4e07-b061-812392f27049-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ca452f0b-0ab2-4e07-b061-812392f27049" (UID: "ca452f0b-0ab2-4e07-b061-812392f27049"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.265876 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca452f0b-0ab2-4e07-b061-812392f27049-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.271818 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-scripts" (OuterVolumeSpecName: "scripts") pod "ca452f0b-0ab2-4e07-b061-812392f27049" (UID: "ca452f0b-0ab2-4e07-b061-812392f27049"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.273860 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca452f0b-0ab2-4e07-b061-812392f27049-kube-api-access-75lvc" (OuterVolumeSpecName: "kube-api-access-75lvc") pod "ca452f0b-0ab2-4e07-b061-812392f27049" (UID: "ca452f0b-0ab2-4e07-b061-812392f27049"). InnerVolumeSpecName "kube-api-access-75lvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.274729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca452f0b-0ab2-4e07-b061-812392f27049" (UID: "ca452f0b-0ab2-4e07-b061-812392f27049"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.288417 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca452f0b-0ab2-4e07-b061-812392f27049" containerID="83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47" exitCode=0 Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.288495 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca452f0b-0ab2-4e07-b061-812392f27049","Type":"ContainerDied","Data":"83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47"} Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.288536 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca452f0b-0ab2-4e07-b061-812392f27049","Type":"ContainerDied","Data":"21b455990614ceec28cfb1dbfe64cc69a51520c514b2a9ee363fdde4a42fadfc"} Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.288553 4939 scope.go:117] "RemoveContainer" containerID="bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.288679 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.299711 4939 generic.go:334] "Generic (PLEG): container finished" podID="73e2003e-c000-4135-9c1e-556cae29d832" containerID="b23c9f2c3b2d19c67b54836359307e59dd6daff6f0021c175f1c13875fd01dc3" exitCode=0 Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.299775 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f86c468-qlp5n" event={"ID":"73e2003e-c000-4135-9c1e-556cae29d832","Type":"ContainerDied","Data":"b23c9f2c3b2d19c67b54836359307e59dd6daff6f0021c175f1c13875fd01dc3"} Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.303128 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"08510a7a-ad57-44a4-9089-7558c213284b","Type":"ContainerStarted","Data":"a37060b1d32e3def07e9adceccadce09982b8c43a62ec6773665ad28956361bd"} Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.347625 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca452f0b-0ab2-4e07-b061-812392f27049" (UID: "ca452f0b-0ab2-4e07-b061-812392f27049"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.367015 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75lvc\" (UniqueName: \"kubernetes.io/projected/ca452f0b-0ab2-4e07-b061-812392f27049-kube-api-access-75lvc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.367038 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.367049 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.367057 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.382465 4939 scope.go:117] "RemoveContainer" containerID="83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.387805 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data" (OuterVolumeSpecName: "config-data") pod "ca452f0b-0ab2-4e07-b061-812392f27049" (UID: "ca452f0b-0ab2-4e07-b061-812392f27049"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.421921 4939 scope.go:117] "RemoveContainer" containerID="bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45" Mar 18 16:01:31 crc kubenswrapper[4939]: E0318 16:01:31.422448 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45\": container with ID starting with bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45 not found: ID does not exist" containerID="bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.422481 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45"} err="failed to get container status \"bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45\": rpc error: code = NotFound desc = could not find container \"bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45\": container with ID starting with bb84b83f7fd0e48123c9d7d0dc34bd06c6924b484af7075a85e9ca7c44991f45 not found: ID does not exist" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.422629 4939 scope.go:117] "RemoveContainer" containerID="83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47" Mar 18 16:01:31 crc kubenswrapper[4939]: E0318 16:01:31.423014 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47\": container with ID starting with 83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47 not found: ID does not exist" 
containerID="83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.423791 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47"} err="failed to get container status \"83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47\": rpc error: code = NotFound desc = could not find container \"83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47\": container with ID starting with 83df07bd38a6dda9fa0d28ca008ec06d44bc2200e00ec9ac1294a8649da66a47 not found: ID does not exist" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.469338 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca452f0b-0ab2-4e07-b061-812392f27049-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.626518 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.634184 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.657547 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:31 crc kubenswrapper[4939]: E0318 16:01:31.657976 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="probe" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.657993 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="probe" Mar 18 16:01:31 crc kubenswrapper[4939]: E0318 16:01:31.658010 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="cinder-scheduler" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.658018 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="cinder-scheduler" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.658164 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="probe" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.658185 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" containerName="cinder-scheduler" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.659111 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.663137 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.663730 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.786812 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f757e65c-c660-4614-bb43-38b9beb092e9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.786946 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.786998 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.787091 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-scripts\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.787143 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f75kh\" (UniqueName: \"kubernetes.io/projected/f757e65c-c660-4614-bb43-38b9beb092e9-kube-api-access-f75kh\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.787169 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.888866 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-scripts\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.888913 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f75kh\" (UniqueName: \"kubernetes.io/projected/f757e65c-c660-4614-bb43-38b9beb092e9-kube-api-access-f75kh\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.888952 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.889024 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f757e65c-c660-4614-bb43-38b9beb092e9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.889059 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.889221 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f757e65c-c660-4614-bb43-38b9beb092e9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.889605 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.896458 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.900093 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.901002 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-scripts\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.905164 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 16:01:31 crc kubenswrapper[4939]: I0318 16:01:31.910664 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f75kh\" (UniqueName: \"kubernetes.io/projected/f757e65c-c660-4614-bb43-38b9beb092e9-kube-api-access-f75kh\") pod \"cinder-scheduler-0\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " pod="openstack/cinder-scheduler-0" Mar 18 
16:01:32 crc kubenswrapper[4939]: I0318 16:01:31.999712 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.189374 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca452f0b-0ab2-4e07-b061-812392f27049" path="/var/lib/kubelet/pods/ca452f0b-0ab2-4e07-b061-812392f27049/volumes" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.366097 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dcffbdd64-d9zm4_7859058e-a736-4065-bb79-8be528d5a709/neutron-api/0.log" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.366388 4939 generic.go:334] "Generic (PLEG): container finished" podID="7859058e-a736-4065-bb79-8be528d5a709" containerID="1352f04be604c8cd3fdc9b3e4b2d58820675611e8951fd73ba5ae7fd614d71dd" exitCode=137 Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.366419 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dcffbdd64-d9zm4" event={"ID":"7859058e-a736-4065-bb79-8be528d5a709","Type":"ContainerDied","Data":"1352f04be604c8cd3fdc9b3e4b2d58820675611e8951fd73ba5ae7fd614d71dd"} Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.366323 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dcffbdd64-d9zm4_7859058e-a736-4065-bb79-8be528d5a709/neutron-api/0.log" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.366735 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.508174 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-config\") pod \"7859058e-a736-4065-bb79-8be528d5a709\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.508285 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-ovndb-tls-certs\") pod \"7859058e-a736-4065-bb79-8be528d5a709\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.508333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-httpd-config\") pod \"7859058e-a736-4065-bb79-8be528d5a709\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.508381 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4vx6\" (UniqueName: \"kubernetes.io/projected/7859058e-a736-4065-bb79-8be528d5a709-kube-api-access-f4vx6\") pod \"7859058e-a736-4065-bb79-8be528d5a709\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.508408 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-combined-ca-bundle\") pod \"7859058e-a736-4065-bb79-8be528d5a709\" (UID: \"7859058e-a736-4065-bb79-8be528d5a709\") " Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.517295 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7859058e-a736-4065-bb79-8be528d5a709" (UID: "7859058e-a736-4065-bb79-8be528d5a709"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.517322 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7859058e-a736-4065-bb79-8be528d5a709-kube-api-access-f4vx6" (OuterVolumeSpecName: "kube-api-access-f4vx6") pod "7859058e-a736-4065-bb79-8be528d5a709" (UID: "7859058e-a736-4065-bb79-8be528d5a709"). InnerVolumeSpecName "kube-api-access-f4vx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.568048 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7859058e-a736-4065-bb79-8be528d5a709" (UID: "7859058e-a736-4065-bb79-8be528d5a709"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.575222 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-config" (OuterVolumeSpecName: "config") pod "7859058e-a736-4065-bb79-8be528d5a709" (UID: "7859058e-a736-4065-bb79-8be528d5a709"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.583709 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7859058e-a736-4065-bb79-8be528d5a709" (UID: "7859058e-a736-4065-bb79-8be528d5a709"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.613399 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.613434 4939 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.613447 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.613459 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4vx6\" (UniqueName: \"kubernetes.io/projected/7859058e-a736-4065-bb79-8be528d5a709-kube-api-access-f4vx6\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.613469 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7859058e-a736-4065-bb79-8be528d5a709-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4939]: I0318 16:01:32.662012 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:01:32 crc kubenswrapper[4939]: W0318 16:01:32.669620 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf757e65c_c660_4614_bb43_38b9beb092e9.slice/crio-52fb719105824bb08514ff4a0a0b03b169d0157166d7cbf6420c05cfbd2025c6 WatchSource:0}: Error finding container 52fb719105824bb08514ff4a0a0b03b169d0157166d7cbf6420c05cfbd2025c6: Status 404 returned error can't find the container with id 52fb719105824bb08514ff4a0a0b03b169d0157166d7cbf6420c05cfbd2025c6 Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.403144 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f757e65c-c660-4614-bb43-38b9beb092e9","Type":"ContainerStarted","Data":"abca7abb8af59dd80a723a07cc0e9834ccc7d6891529372d20c592078a41f166"} Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.404795 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f757e65c-c660-4614-bb43-38b9beb092e9","Type":"ContainerStarted","Data":"52fb719105824bb08514ff4a0a0b03b169d0157166d7cbf6420c05cfbd2025c6"} Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.411802 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7dcffbdd64-d9zm4_7859058e-a736-4065-bb79-8be528d5a709/neutron-api/0.log" Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.411860 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dcffbdd64-d9zm4" event={"ID":"7859058e-a736-4065-bb79-8be528d5a709","Type":"ContainerDied","Data":"a5788169ea06823ef4783a0fc98462e542e152b3de83f941de58cbfcf2e5012d"} Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.411898 4939 scope.go:117] "RemoveContainer" containerID="bc597c857f5f728e5137606cedc139017a2b4a9cbbad872803bf70dda3d4ea6f" Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.412033 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dcffbdd64-d9zm4" Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.486039 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7dcffbdd64-d9zm4"] Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.486101 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7dcffbdd64-d9zm4"] Mar 18 16:01:33 crc kubenswrapper[4939]: I0318 16:01:33.519025 4939 scope.go:117] "RemoveContainer" containerID="1352f04be604c8cd3fdc9b3e4b2d58820675611e8951fd73ba5ae7fd614d71dd" Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.163859 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7859058e-a736-4065-bb79-8be528d5a709" path="/var/lib/kubelet/pods/7859058e-a736-4065-bb79-8be528d5a709/volumes" Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.425055 4939 generic.go:334] "Generic (PLEG): container finished" podID="73e2003e-c000-4135-9c1e-556cae29d832" containerID="75cb3253f0a799e23784270d1f152480e5a674daa02ecb8e6df7c31fc17d5e37" exitCode=0 Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.425144 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f86c468-qlp5n" event={"ID":"73e2003e-c000-4135-9c1e-556cae29d832","Type":"ContainerDied","Data":"75cb3253f0a799e23784270d1f152480e5a674daa02ecb8e6df7c31fc17d5e37"} Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.428960 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f757e65c-c660-4614-bb43-38b9beb092e9","Type":"ContainerStarted","Data":"c87e18d10fc916c7b05bb350ccbb835b08683349a3ae8b07119f660954350f76"} Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.452398 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.45238178 podStartE2EDuration="3.45238178s" podCreationTimestamp="2026-03-18 16:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:34.44739497 +0000 UTC m=+1459.046582591" watchObservedRunningTime="2026-03-18 16:01:34.45238178 +0000 UTC m=+1459.051569401" Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.948074 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b8f86c468-qlp5n" Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.979957 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-config\") pod \"73e2003e-c000-4135-9c1e-556cae29d832\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.980162 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-httpd-config\") pod \"73e2003e-c000-4135-9c1e-556cae29d832\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.980196 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-ovndb-tls-certs\") pod \"73e2003e-c000-4135-9c1e-556cae29d832\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.980310 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-combined-ca-bundle\") pod \"73e2003e-c000-4135-9c1e-556cae29d832\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.980377 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vngvt\" (UniqueName: \"kubernetes.io/projected/73e2003e-c000-4135-9c1e-556cae29d832-kube-api-access-vngvt\") pod \"73e2003e-c000-4135-9c1e-556cae29d832\" (UID: \"73e2003e-c000-4135-9c1e-556cae29d832\") " Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.988643 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "73e2003e-c000-4135-9c1e-556cae29d832" (UID: "73e2003e-c000-4135-9c1e-556cae29d832"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:34 crc kubenswrapper[4939]: I0318 16:01:34.993190 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e2003e-c000-4135-9c1e-556cae29d832-kube-api-access-vngvt" (OuterVolumeSpecName: "kube-api-access-vngvt") pod "73e2003e-c000-4135-9c1e-556cae29d832" (UID: "73e2003e-c000-4135-9c1e-556cae29d832"). InnerVolumeSpecName "kube-api-access-vngvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.050598 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-config" (OuterVolumeSpecName: "config") pod "73e2003e-c000-4135-9c1e-556cae29d832" (UID: "73e2003e-c000-4135-9c1e-556cae29d832"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.066718 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73e2003e-c000-4135-9c1e-556cae29d832" (UID: "73e2003e-c000-4135-9c1e-556cae29d832"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.082877 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.082904 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.082913 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.082924 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vngvt\" (UniqueName: \"kubernetes.io/projected/73e2003e-c000-4135-9c1e-556cae29d832-kube-api-access-vngvt\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.093117 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "73e2003e-c000-4135-9c1e-556cae29d832" (UID: "73e2003e-c000-4135-9c1e-556cae29d832"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.185192 4939 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73e2003e-c000-4135-9c1e-556cae29d832-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.452746 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b8f86c468-qlp5n" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.453829 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f86c468-qlp5n" event={"ID":"73e2003e-c000-4135-9c1e-556cae29d832","Type":"ContainerDied","Data":"9c838307b1924d5cb4ac8f001bd93aef896f2657e89d2dea39c4f605b31ef322"} Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.453882 4939 scope.go:117] "RemoveContainer" containerID="b23c9f2c3b2d19c67b54836359307e59dd6daff6f0021c175f1c13875fd01dc3" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.479288 4939 scope.go:117] "RemoveContainer" containerID="75cb3253f0a799e23784270d1f152480e5a674daa02ecb8e6df7c31fc17d5e37" Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.514952 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8f86c468-qlp5n"] Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.524171 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b8f86c468-qlp5n"] Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.667983 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.668320 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-central-agent" containerID="cri-o://2e3d2425aeb8f6c2bb47c43db4623498bf75e4e1e20f1c25eba67d5a53834933" gracePeriod=30 Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.668382 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="sg-core" containerID="cri-o://ae0415a6569a04e1aba064c17f686aa5e3a8470d986e2ac27e1fdff1e7132a29" gracePeriod=30 Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.668454 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-notification-agent" containerID="cri-o://ebaf37379702245950f544a2b4c16f11d1d7778ab4ef76c5cd3a1dff9069948b" gracePeriod=30 Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.668513 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="proxy-httpd" containerID="cri-o://39d3607a1f0ebf25806ce9da071107a08a84206d70cb2f9d64281a123d69a63d" gracePeriod=30 Mar 18 16:01:35 crc kubenswrapper[4939]: I0318 16:01:35.686866 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.178619 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73e2003e-c000-4135-9c1e-556cae29d832" path="/var/lib/kubelet/pods/73e2003e-c000-4135-9c1e-556cae29d832/volumes" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.409366 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7bb7666d55-9qg76"] Mar 18 16:01:36 crc kubenswrapper[4939]: E0318 16:01:36.411016 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e2003e-c000-4135-9c1e-556cae29d832" containerName="neutron-api" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411043 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e2003e-c000-4135-9c1e-556cae29d832" 
containerName="neutron-api" Mar 18 16:01:36 crc kubenswrapper[4939]: E0318 16:01:36.411064 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e2003e-c000-4135-9c1e-556cae29d832" containerName="neutron-httpd" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411073 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e2003e-c000-4135-9c1e-556cae29d832" containerName="neutron-httpd" Mar 18 16:01:36 crc kubenswrapper[4939]: E0318 16:01:36.411103 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-httpd" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411111 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-httpd" Mar 18 16:01:36 crc kubenswrapper[4939]: E0318 16:01:36.411127 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-api" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411134 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-api" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411329 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e2003e-c000-4135-9c1e-556cae29d832" containerName="neutron-httpd" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411361 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-httpd" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411377 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7859058e-a736-4065-bb79-8be528d5a709" containerName="neutron-api" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.411397 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e2003e-c000-4135-9c1e-556cae29d832" containerName="neutron-api" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.414003 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.418289 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.427653 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bb7666d55-9qg76"] Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.434423 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.436559 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.464121 4939 generic.go:334] "Generic (PLEG): container finished" podID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerID="39d3607a1f0ebf25806ce9da071107a08a84206d70cb2f9d64281a123d69a63d" exitCode=0 Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.464164 4939 generic.go:334] "Generic (PLEG): container finished" podID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerID="ae0415a6569a04e1aba064c17f686aa5e3a8470d986e2ac27e1fdff1e7132a29" exitCode=2 Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.464176 4939 generic.go:334] "Generic (PLEG): container finished" podID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerID="2e3d2425aeb8f6c2bb47c43db4623498bf75e4e1e20f1c25eba67d5a53834933" exitCode=0 Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.464224 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerDied","Data":"39d3607a1f0ebf25806ce9da071107a08a84206d70cb2f9d64281a123d69a63d"} Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.464256 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerDied","Data":"ae0415a6569a04e1aba064c17f686aa5e3a8470d986e2ac27e1fdff1e7132a29"} Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.464272 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerDied","Data":"2e3d2425aeb8f6c2bb47c43db4623498bf75e4e1e20f1c25eba67d5a53834933"} Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.511974 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-run-httpd\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.512065 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-kube-api-access-682gl\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.512127 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-config-data\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: 
\"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.512234 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-public-tls-certs\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.512272 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-etc-swift\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.512312 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-log-httpd\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.512336 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-combined-ca-bundle\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.512354 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-internal-tls-certs\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613533 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-public-tls-certs\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613594 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-etc-swift\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613640 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-log-httpd\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613655 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-combined-ca-bundle\") pod 
\"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613675 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-internal-tls-certs\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613743 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-run-httpd\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613764 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-kube-api-access-682gl\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.613792 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-config-data\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.614179 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-log-httpd\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.614739 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-run-httpd\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.621675 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-combined-ca-bundle\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.622010 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-config-data\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.623064 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-internal-tls-certs\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " 
pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.623709 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-public-tls-certs\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.624666 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-etc-swift\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.642008 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-kube-api-access-682gl\") pod \"swift-proxy-7bb7666d55-9qg76\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.754485 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.783254 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-s2smr"] Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.784301 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s2smr" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.795811 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-s2smr"] Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.818033 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03137eb0-6a57-4dc2-91aa-e7af80abbd22-operator-scripts\") pod \"nova-api-db-create-s2smr\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") " pod="openstack/nova-api-db-create-s2smr" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.818107 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qkwc\" (UniqueName: \"kubernetes.io/projected/03137eb0-6a57-4dc2-91aa-e7af80abbd22-kube-api-access-9qkwc\") pod \"nova-api-db-create-s2smr\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") " pod="openstack/nova-api-db-create-s2smr" Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.893349 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-bwc5l"] Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.895818 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.920717 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0eb0f89-9573-4e53-a22c-16b8cd80140a-operator-scripts\") pod \"nova-cell0-db-create-bwc5l\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") " pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.920756 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qkwc\" (UniqueName: \"kubernetes.io/projected/03137eb0-6a57-4dc2-91aa-e7af80abbd22-kube-api-access-9qkwc\") pod \"nova-api-db-create-s2smr\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") " pod="openstack/nova-api-db-create-s2smr"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.920779 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px87v\" (UniqueName: \"kubernetes.io/projected/f0eb0f89-9573-4e53-a22c-16b8cd80140a-kube-api-access-px87v\") pod \"nova-cell0-db-create-bwc5l\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") " pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.920899 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03137eb0-6a57-4dc2-91aa-e7af80abbd22-operator-scripts\") pod \"nova-api-db-create-s2smr\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") " pod="openstack/nova-api-db-create-s2smr"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.921700 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03137eb0-6a57-4dc2-91aa-e7af80abbd22-operator-scripts\") pod \"nova-api-db-create-s2smr\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") " pod="openstack/nova-api-db-create-s2smr"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.922556 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-409c-account-create-update-qhm94"]
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.923612 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.925454 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.940722 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bwc5l"]
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.948474 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-409c-account-create-update-qhm94"]
Mar 18 16:01:36 crc kubenswrapper[4939]: I0318 16:01:36.964975 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qkwc\" (UniqueName: \"kubernetes.io/projected/03137eb0-6a57-4dc2-91aa-e7af80abbd22-kube-api-access-9qkwc\") pod \"nova-api-db-create-s2smr\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") " pod="openstack/nova-api-db-create-s2smr"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.000681 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.021803 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d301c5c5-c1cf-4037-aafb-612fbbe133f7-operator-scripts\") pod \"nova-api-409c-account-create-update-qhm94\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") " pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.021945 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0eb0f89-9573-4e53-a22c-16b8cd80140a-operator-scripts\") pod \"nova-cell0-db-create-bwc5l\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") " pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.021973 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px87v\" (UniqueName: \"kubernetes.io/projected/f0eb0f89-9573-4e53-a22c-16b8cd80140a-kube-api-access-px87v\") pod \"nova-cell0-db-create-bwc5l\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") " pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.021997 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qgcm\" (UniqueName: \"kubernetes.io/projected/d301c5c5-c1cf-4037-aafb-612fbbe133f7-kube-api-access-5qgcm\") pod \"nova-api-409c-account-create-update-qhm94\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") " pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.023879 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0eb0f89-9573-4e53-a22c-16b8cd80140a-operator-scripts\") pod \"nova-cell0-db-create-bwc5l\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") " pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.028553 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bzpsk"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.029688 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.041216 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px87v\" (UniqueName: \"kubernetes.io/projected/f0eb0f89-9573-4e53-a22c-16b8cd80140a-kube-api-access-px87v\") pod \"nova-cell0-db-create-bwc5l\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") " pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.045547 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bzpsk"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.096146 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-b75kg"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.097590 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.100761 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.114574 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-b75kg"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.123163 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdgnt\" (UniqueName: \"kubernetes.io/projected/de888787-cfa6-46be-bb5b-a45e06eddb1b-kube-api-access-sdgnt\") pod \"nova-cell1-db-create-bzpsk\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") " pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.123226 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d301c5c5-c1cf-4037-aafb-612fbbe133f7-operator-scripts\") pod \"nova-api-409c-account-create-update-qhm94\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") " pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.123303 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a302eb7-0f61-497d-96df-59aacc8d463f-operator-scripts\") pod \"nova-cell0-8faa-account-create-update-b75kg\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") " pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.123376 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qgcm\" (UniqueName: \"kubernetes.io/projected/d301c5c5-c1cf-4037-aafb-612fbbe133f7-kube-api-access-5qgcm\") pod \"nova-api-409c-account-create-update-qhm94\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") " pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.123399 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzrq\" (UniqueName: \"kubernetes.io/projected/7a302eb7-0f61-497d-96df-59aacc8d463f-kube-api-access-tnzrq\") pod \"nova-cell0-8faa-account-create-update-b75kg\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") " pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.123421 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de888787-cfa6-46be-bb5b-a45e06eddb1b-operator-scripts\") pod \"nova-cell1-db-create-bzpsk\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") " pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.124130 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d301c5c5-c1cf-4037-aafb-612fbbe133f7-operator-scripts\") pod \"nova-api-409c-account-create-update-qhm94\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") " pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.140323 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qgcm\" (UniqueName: \"kubernetes.io/projected/d301c5c5-c1cf-4037-aafb-612fbbe133f7-kube-api-access-5qgcm\") pod \"nova-api-409c-account-create-update-qhm94\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") " pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.186513 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s2smr"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.222038 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.228332 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzrq\" (UniqueName: \"kubernetes.io/projected/7a302eb7-0f61-497d-96df-59aacc8d463f-kube-api-access-tnzrq\") pod \"nova-cell0-8faa-account-create-update-b75kg\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") " pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.228385 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de888787-cfa6-46be-bb5b-a45e06eddb1b-operator-scripts\") pod \"nova-cell1-db-create-bzpsk\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") " pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.228417 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdgnt\" (UniqueName: \"kubernetes.io/projected/de888787-cfa6-46be-bb5b-a45e06eddb1b-kube-api-access-sdgnt\") pod \"nova-cell1-db-create-bzpsk\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") " pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.228552 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a302eb7-0f61-497d-96df-59aacc8d463f-operator-scripts\") pod \"nova-cell0-8faa-account-create-update-b75kg\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") " pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.229239 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a302eb7-0f61-497d-96df-59aacc8d463f-operator-scripts\") pod \"nova-cell0-8faa-account-create-update-b75kg\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") " pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.231756 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de888787-cfa6-46be-bb5b-a45e06eddb1b-operator-scripts\") pod \"nova-cell1-db-create-bzpsk\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") " pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.252241 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzrq\" (UniqueName: \"kubernetes.io/projected/7a302eb7-0f61-497d-96df-59aacc8d463f-kube-api-access-tnzrq\") pod \"nova-cell0-8faa-account-create-update-b75kg\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") " pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.253291 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdgnt\" (UniqueName: \"kubernetes.io/projected/de888787-cfa6-46be-bb5b-a45e06eddb1b-kube-api-access-sdgnt\") pod \"nova-cell1-db-create-bzpsk\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") " pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.269529 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.294005 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-hcnbj"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.295462 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.297674 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.318602 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-hcnbj"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.335901 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d40f23c-fece-48e9-a70f-7b1309600baa-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-hcnbj\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") " pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.336198 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfcpn\" (UniqueName: \"kubernetes.io/projected/7d40f23c-fece-48e9-a70f-7b1309600baa-kube-api-access-kfcpn\") pod \"nova-cell1-f9fa-account-create-update-hcnbj\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") " pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.358439 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.420969 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.437918 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfcpn\" (UniqueName: \"kubernetes.io/projected/7d40f23c-fece-48e9-a70f-7b1309600baa-kube-api-access-kfcpn\") pod \"nova-cell1-f9fa-account-create-update-hcnbj\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") " pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.438390 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d40f23c-fece-48e9-a70f-7b1309600baa-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-hcnbj\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") " pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.439201 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d40f23c-fece-48e9-a70f-7b1309600baa-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-hcnbj\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") " pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.461981 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfcpn\" (UniqueName: \"kubernetes.io/projected/7d40f23c-fece-48e9-a70f-7b1309600baa-kube-api-access-kfcpn\") pod \"nova-cell1-f9fa-account-create-update-hcnbj\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") " pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.487223 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bb7666d55-9qg76"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.515876 4939 generic.go:334] "Generic (PLEG): container finished" podID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerID="ebaf37379702245950f544a2b4c16f11d1d7778ab4ef76c5cd3a1dff9069948b" exitCode=0
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.515948 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerDied","Data":"ebaf37379702245950f544a2b4c16f11d1d7778ab4ef76c5cd3a1dff9069948b"}
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.563040 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.620611 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.642083 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-combined-ca-bundle\") pod \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") "
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.642163 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-config-data\") pod \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") "
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.642192 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-run-httpd\") pod \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") "
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.642229 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzjn4\" (UniqueName: \"kubernetes.io/projected/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-kube-api-access-fzjn4\") pod \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") "
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.642281 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-log-httpd\") pod \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") "
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.644072 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-sg-core-conf-yaml\") pod \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") "
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.644125 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-scripts\") pod \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\" (UID: \"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69\") "
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.647275 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" (UID: "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.647899 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" (UID: "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.650362 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-scripts" (OuterVolumeSpecName: "scripts") pod "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" (UID: "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.653659 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-kube-api-access-fzjn4" (OuterVolumeSpecName: "kube-api-access-fzjn4") pod "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" (UID: "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69"). InnerVolumeSpecName "kube-api-access-fzjn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.746840 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.747154 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzjn4\" (UniqueName: \"kubernetes.io/projected/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-kube-api-access-fzjn4\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.747255 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.747334 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.749545 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" (UID: "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.813846 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" (UID: "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:37 crc kubenswrapper[4939]: W0318 16:01:37.832662 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0eb0f89_9573_4e53_a22c_16b8cd80140a.slice/crio-6a0d4d10c7b29294f32eecea517ff42a9211f0069ef58d4737251eb426a78b7b WatchSource:0}: Error finding container 6a0d4d10c7b29294f32eecea517ff42a9211f0069ef58d4737251eb426a78b7b: Status 404 returned error can't find the container with id 6a0d4d10c7b29294f32eecea517ff42a9211f0069ef58d4737251eb426a78b7b
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.836076 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bwc5l"]
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.849668 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.849699 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.892016 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-config-data" (OuterVolumeSpecName: "config-data") pod "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" (UID: "c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:37 crc kubenswrapper[4939]: I0318 16:01:37.951453 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.017868 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-s2smr"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.194734 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bzpsk"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.201129 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-b75kg"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.211024 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-409c-account-create-update-qhm94"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.374335 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-hcnbj"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.541001 4939 generic.go:334] "Generic (PLEG): container finished" podID="f0eb0f89-9573-4e53-a22c-16b8cd80140a" containerID="d1bbd88198e7c27fc2ae103dd4491725bc76a76487770c0862669d3280d0117a" exitCode=0
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.541071 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bwc5l" event={"ID":"f0eb0f89-9573-4e53-a22c-16b8cd80140a","Type":"ContainerDied","Data":"d1bbd88198e7c27fc2ae103dd4491725bc76a76487770c0862669d3280d0117a"}
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.541105 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bwc5l" event={"ID":"f0eb0f89-9573-4e53-a22c-16b8cd80140a","Type":"ContainerStarted","Data":"6a0d4d10c7b29294f32eecea517ff42a9211f0069ef58d4737251eb426a78b7b"}
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.563198 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69","Type":"ContainerDied","Data":"5a50f2dc74864b787341c8d901c66456f19a5a1a625cc1c7fd73ab333113cefe"}
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.563254 4939 scope.go:117] "RemoveContainer" containerID="39d3607a1f0ebf25806ce9da071107a08a84206d70cb2f9d64281a123d69a63d"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.563379 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.578196 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bb7666d55-9qg76" event={"ID":"8de1bfe9-c6f0-46c0-bd41-318b139b0f41","Type":"ContainerStarted","Data":"c683cbe216d5c9119b57b38f335e3411920f22a528bb4f3e011449cbc759d2ac"}
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.578245 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bb7666d55-9qg76" event={"ID":"8de1bfe9-c6f0-46c0-bd41-318b139b0f41","Type":"ContainerStarted","Data":"0be59d410e3072fb69049312b96d052f189a9925b1b54535e62738cf2f5cc173"}
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.596716 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.615498 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634155 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:01:38 crc kubenswrapper[4939]: E0318 16:01:38.634646 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-central-agent"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634674 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-central-agent"
Mar 18 16:01:38 crc kubenswrapper[4939]: E0318 16:01:38.634690 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="proxy-httpd"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634699 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="proxy-httpd"
Mar 18 16:01:38 crc kubenswrapper[4939]: E0318 16:01:38.634725 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="sg-core"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634732 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="sg-core"
Mar 18 16:01:38 crc kubenswrapper[4939]: E0318 16:01:38.634751 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-notification-agent"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634756 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-notification-agent"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634913 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="proxy-httpd"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634925 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-central-agent"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634944 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="sg-core"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.634953 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" containerName="ceilometer-notification-agent"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.636886 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.639873 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.640035 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.643966 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.662039 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.662233 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.662697 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-config-data\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.662808 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.662991 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.663055 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-scripts\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.663139 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbmzc\" (UniqueName: \"kubernetes.io/projected/bcc3d87c-de81-4462-a286-5142f5434632-kube-api-access-tbmzc\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765095 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765137 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765161 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-config-data\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765179 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765267 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765283 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-scripts\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765301 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbmzc\" (UniqueName: \"kubernetes.io/projected/bcc3d87c-de81-4462-a286-5142f5434632-kube-api-access-tbmzc\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.765821 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.766217 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.773442 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-scripts\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.775002 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.786278 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.790678 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-config-data\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.794275 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbmzc\" (UniqueName: \"kubernetes.io/projected/bcc3d87c-de81-4462-a286-5142f5434632-kube-api-access-tbmzc\") pod \"ceilometer-0\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " pod="openstack/ceilometer-0"
Mar 18 16:01:38 crc kubenswrapper[4939]: I0318 16:01:38.985673 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:01:40 crc kubenswrapper[4939]: I0318 16:01:40.142806 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69" path="/var/lib/kubelet/pods/c915f86b-8ed2-4d9c-8185-4f9a2a5f5f69/volumes"
Mar 18 16:01:42 crc kubenswrapper[4939]: I0318 16:01:42.223602 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 18 16:01:44 crc kubenswrapper[4939]: W0318 16:01:44.105645 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd301c5c5_c1cf_4037_aafb_612fbbe133f7.slice/crio-41b5124ce4803db91c7ef248e9cc19b27260e38ea645226aaf859d728a931ab4 WatchSource:0}: Error finding container 41b5124ce4803db91c7ef248e9cc19b27260e38ea645226aaf859d728a931ab4: Status 404 returned error can't find the container with id 41b5124ce4803db91c7ef248e9cc19b27260e38ea645226aaf859d728a931ab4
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.110975 4939 scope.go:117] "RemoveContainer" containerID="ae0415a6569a04e1aba064c17f686aa5e3a8470d986e2ac27e1fdff1e7132a29"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.192135 4939 scope.go:117] "RemoveContainer" containerID="ebaf37379702245950f544a2b4c16f11d1d7778ab4ef76c5cd3a1dff9069948b"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.238288 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.282612 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0eb0f89-9573-4e53-a22c-16b8cd80140a-operator-scripts\") pod \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") "
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.282834 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px87v\" (UniqueName: \"kubernetes.io/projected/f0eb0f89-9573-4e53-a22c-16b8cd80140a-kube-api-access-px87v\") pod \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\" (UID: \"f0eb0f89-9573-4e53-a22c-16b8cd80140a\") "
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.283820 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0eb0f89-9573-4e53-a22c-16b8cd80140a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0eb0f89-9573-4e53-a22c-16b8cd80140a" (UID: "f0eb0f89-9573-4e53-a22c-16b8cd80140a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.289571 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0eb0f89-9573-4e53-a22c-16b8cd80140a-kube-api-access-px87v" (OuterVolumeSpecName: "kube-api-access-px87v") pod "f0eb0f89-9573-4e53-a22c-16b8cd80140a" (UID: "f0eb0f89-9573-4e53-a22c-16b8cd80140a"). InnerVolumeSpecName "kube-api-access-px87v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.325213 4939 scope.go:117] "RemoveContainer" containerID="2e3d2425aeb8f6c2bb47c43db4623498bf75e4e1e20f1c25eba67d5a53834933"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.384890 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0eb0f89-9573-4e53-a22c-16b8cd80140a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.384915 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px87v\" (UniqueName: \"kubernetes.io/projected/f0eb0f89-9573-4e53-a22c-16b8cd80140a-kube-api-access-px87v\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.650104 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bb7666d55-9qg76" event={"ID":"8de1bfe9-c6f0-46c0-bd41-318b139b0f41","Type":"ContainerStarted","Data":"8c27c54d024af7e070c51b4a2e852614526988b034a22bcdb0519cc69a2109e2"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.651707 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bb7666d55-9qg76"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.651732 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bb7666d55-9qg76"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.655046 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-409c-account-create-update-qhm94" event={"ID":"d301c5c5-c1cf-4037-aafb-612fbbe133f7","Type":"ContainerStarted","Data":"41b5124ce4803db91c7ef248e9cc19b27260e38ea645226aaf859d728a931ab4"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.665437 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-s2smr" event={"ID":"03137eb0-6a57-4dc2-91aa-e7af80abbd22","Type":"ContainerStarted","Data":"34deb1738ad27aacb1bf83653f4b32681d6750d1b187c50b02bf3fa5d66457ed"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.670678 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8faa-account-create-update-b75kg" event={"ID":"7a302eb7-0f61-497d-96df-59aacc8d463f","Type":"ContainerStarted","Data":"e011b5f7939fb9a5fbc3d780c7d87c51184687eb173cab950fa247dbeb26a92f"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.679934 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7bb7666d55-9qg76" podStartSLOduration=8.679916735 podStartE2EDuration="8.679916735s" podCreationTimestamp="2026-03-18 16:01:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:44.678216422 +0000 UTC m=+1469.277404083" watchObservedRunningTime="2026-03-18 16:01:44.679916735 +0000 UTC m=+1469.279104356"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.680054 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj" event={"ID":"7d40f23c-fece-48e9-a70f-7b1309600baa","Type":"ContainerStarted","Data":"8bed3f68519633648f8f64bd3c401cab3bdd461acca5d5a74ae800a8d05d90ad"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.688116 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7bb7666d55-9qg76" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.691459 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bwc5l" event={"ID":"f0eb0f89-9573-4e53-a22c-16b8cd80140a","Type":"ContainerDied","Data":"6a0d4d10c7b29294f32eecea517ff42a9211f0069ef58d4737251eb426a78b7b"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.691628 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a0d4d10c7b29294f32eecea517ff42a9211f0069ef58d4737251eb426a78b7b"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.691494 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bwc5l"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.695103 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bzpsk" event={"ID":"de888787-cfa6-46be-bb5b-a45e06eddb1b","Type":"ContainerStarted","Data":"54f847d820c4484409bda5c3b12fbb96a6bda9d2d952f1964c605f5a84ef467a"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.695160 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bzpsk" event={"ID":"de888787-cfa6-46be-bb5b-a45e06eddb1b","Type":"ContainerStarted","Data":"6d9565a39948f1bc2c157942dfeb4a7770ce4037503f9932904f1c1c2015f1c4"}
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.723300 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-bzpsk" podStartSLOduration=8.723276799 podStartE2EDuration="8.723276799s" podCreationTimestamp="2026-03-18 16:01:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:44.707884127 +0000 UTC m=+1469.307071748" watchObservedRunningTime="2026-03-18 16:01:44.723276799 +0000 UTC m=+1469.322464420"
Mar 18 16:01:44 crc kubenswrapper[4939]: I0318 16:01:44.753150 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.707806 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerStarted","Data":"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.708084 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerStarted","Data":"4d05ad71751b6ecc6e9fe02073e66d453d558acbbec6ff5132aff4f9d997aef5"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.711257 4939 generic.go:334] "Generic (PLEG): container finished" podID="d301c5c5-c1cf-4037-aafb-612fbbe133f7" containerID="02c119c7d1b12600535f978f2ef210454ef30c1c70e312e01eea669577780dc3" exitCode=0
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.711390 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-409c-account-create-update-qhm94" event={"ID":"d301c5c5-c1cf-4037-aafb-612fbbe133f7","Type":"ContainerDied","Data":"02c119c7d1b12600535f978f2ef210454ef30c1c70e312e01eea669577780dc3"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.713954 4939 generic.go:334] "Generic (PLEG): container finished" podID="03137eb0-6a57-4dc2-91aa-e7af80abbd22" containerID="fb26664c79478bc866dc4d04f4eafcb037e12fdf6ea1349c4fae4d7c7ec04ee6" exitCode=0
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.714100 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-s2smr" event={"ID":"03137eb0-6a57-4dc2-91aa-e7af80abbd22","Type":"ContainerDied","Data":"fb26664c79478bc866dc4d04f4eafcb037e12fdf6ea1349c4fae4d7c7ec04ee6"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.715919 4939 generic.go:334] "Generic (PLEG): container finished" podID="7a302eb7-0f61-497d-96df-59aacc8d463f" containerID="84f3f6c9cdd2155af79eaa3c0c011995f8be1962da95118acf91db1cd01c6852" exitCode=0
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.715989 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8faa-account-create-update-b75kg" event={"ID":"7a302eb7-0f61-497d-96df-59aacc8d463f","Type":"ContainerDied","Data":"84f3f6c9cdd2155af79eaa3c0c011995f8be1962da95118acf91db1cd01c6852"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.718258 4939 generic.go:334] "Generic (PLEG): container finished" podID="7d40f23c-fece-48e9-a70f-7b1309600baa" containerID="209907d68aeaadea75ff4baa4dcf0cf4485c21079c579735529d0faeecba1e43" exitCode=0
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.718400 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj" event={"ID":"7d40f23c-fece-48e9-a70f-7b1309600baa","Type":"ContainerDied","Data":"209907d68aeaadea75ff4baa4dcf0cf4485c21079c579735529d0faeecba1e43"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.720373 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"08510a7a-ad57-44a4-9089-7558c213284b","Type":"ContainerStarted","Data":"3e958f60a62d00010cc8b06118e35abed1735c3a72b9d800113fbbaacdb1ee62"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.722038 4939 generic.go:334] "Generic (PLEG): container finished" podID="de888787-cfa6-46be-bb5b-a45e06eddb1b" containerID="54f847d820c4484409bda5c3b12fbb96a6bda9d2d952f1964c605f5a84ef467a" exitCode=0
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.722085 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bzpsk" event={"ID":"de888787-cfa6-46be-bb5b-a45e06eddb1b","Type":"ContainerDied","Data":"54f847d820c4484409bda5c3b12fbb96a6bda9d2d952f1964c605f5a84ef467a"}
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.762005 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.042347668 podStartE2EDuration="16.761984681s" podCreationTimestamp="2026-03-18 16:01:29 +0000 UTC" firstStartedPulling="2026-03-18 16:01:30.567250137 +0000 UTC m=+1455.166437758" lastFinishedPulling="2026-03-18 16:01:44.28688715 +0000 UTC m=+1468.886074771" observedRunningTime="2026-03-18 16:01:45.759791663 +0000 UTC m=+1470.358979284" watchObservedRunningTime="2026-03-18 16:01:45.761984681 +0000 UTC m=+1470.361172292"
Mar 18 16:01:45 crc kubenswrapper[4939]: I0318 16:01:45.877043 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7bb7666d55-9qg76"
Mar 18 16:01:46 crc kubenswrapper[4939]: I0318 16:01:46.751768 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerStarted","Data":"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025"}
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.254025 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.346609 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdgnt\" (UniqueName: \"kubernetes.io/projected/de888787-cfa6-46be-bb5b-a45e06eddb1b-kube-api-access-sdgnt\") pod \"de888787-cfa6-46be-bb5b-a45e06eddb1b\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.346736 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de888787-cfa6-46be-bb5b-a45e06eddb1b-operator-scripts\") pod \"de888787-cfa6-46be-bb5b-a45e06eddb1b\" (UID: \"de888787-cfa6-46be-bb5b-a45e06eddb1b\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.347431 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de888787-cfa6-46be-bb5b-a45e06eddb1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de888787-cfa6-46be-bb5b-a45e06eddb1b" (UID: "de888787-cfa6-46be-bb5b-a45e06eddb1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.348067 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de888787-cfa6-46be-bb5b-a45e06eddb1b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.355331 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de888787-cfa6-46be-bb5b-a45e06eddb1b-kube-api-access-sdgnt" (OuterVolumeSpecName: "kube-api-access-sdgnt") pod "de888787-cfa6-46be-bb5b-a45e06eddb1b" (UID: "de888787-cfa6-46be-bb5b-a45e06eddb1b"). InnerVolumeSpecName "kube-api-access-sdgnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.452803 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdgnt\" (UniqueName: \"kubernetes.io/projected/de888787-cfa6-46be-bb5b-a45e06eddb1b-kube-api-access-sdgnt\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.658668 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.683713 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.724020 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.746180 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s2smr"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.757713 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a302eb7-0f61-497d-96df-59aacc8d463f-operator-scripts\") pod \"7a302eb7-0f61-497d-96df-59aacc8d463f\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.757789 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d301c5c5-c1cf-4037-aafb-612fbbe133f7-operator-scripts\") pod \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.757885 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnzrq\" (UniqueName: \"kubernetes.io/projected/7a302eb7-0f61-497d-96df-59aacc8d463f-kube-api-access-tnzrq\") pod \"7a302eb7-0f61-497d-96df-59aacc8d463f\" (UID: \"7a302eb7-0f61-497d-96df-59aacc8d463f\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.757937 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfcpn\" (UniqueName: \"kubernetes.io/projected/7d40f23c-fece-48e9-a70f-7b1309600baa-kube-api-access-kfcpn\") pod \"7d40f23c-fece-48e9-a70f-7b1309600baa\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.758008 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d40f23c-fece-48e9-a70f-7b1309600baa-operator-scripts\") pod \"7d40f23c-fece-48e9-a70f-7b1309600baa\" (UID: \"7d40f23c-fece-48e9-a70f-7b1309600baa\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.758063 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qgcm\" (UniqueName: \"kubernetes.io/projected/d301c5c5-c1cf-4037-aafb-612fbbe133f7-kube-api-access-5qgcm\") pod \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\" (UID: \"d301c5c5-c1cf-4037-aafb-612fbbe133f7\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.758666 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d301c5c5-c1cf-4037-aafb-612fbbe133f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d301c5c5-c1cf-4037-aafb-612fbbe133f7" (UID: "d301c5c5-c1cf-4037-aafb-612fbbe133f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.759666 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a302eb7-0f61-497d-96df-59aacc8d463f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a302eb7-0f61-497d-96df-59aacc8d463f" (UID: "7a302eb7-0f61-497d-96df-59aacc8d463f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.760869 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d40f23c-fece-48e9-a70f-7b1309600baa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d40f23c-fece-48e9-a70f-7b1309600baa" (UID: "7d40f23c-fece-48e9-a70f-7b1309600baa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.765922 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a302eb7-0f61-497d-96df-59aacc8d463f-kube-api-access-tnzrq" (OuterVolumeSpecName: "kube-api-access-tnzrq") pod "7a302eb7-0f61-497d-96df-59aacc8d463f" (UID: "7a302eb7-0f61-497d-96df-59aacc8d463f"). InnerVolumeSpecName "kube-api-access-tnzrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.765970 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d40f23c-fece-48e9-a70f-7b1309600baa-kube-api-access-kfcpn" (OuterVolumeSpecName: "kube-api-access-kfcpn") pod "7d40f23c-fece-48e9-a70f-7b1309600baa" (UID: "7d40f23c-fece-48e9-a70f-7b1309600baa"). InnerVolumeSpecName "kube-api-access-kfcpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.767990 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d301c5c5-c1cf-4037-aafb-612fbbe133f7-kube-api-access-5qgcm" (OuterVolumeSpecName: "kube-api-access-5qgcm") pod "d301c5c5-c1cf-4037-aafb-612fbbe133f7" (UID: "d301c5c5-c1cf-4037-aafb-612fbbe133f7"). InnerVolumeSpecName "kube-api-access-5qgcm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.785135 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bzpsk"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.785094 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bzpsk" event={"ID":"de888787-cfa6-46be-bb5b-a45e06eddb1b","Type":"ContainerDied","Data":"6d9565a39948f1bc2c157942dfeb4a7770ce4037503f9932904f1c1c2015f1c4"}
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.785253 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9565a39948f1bc2c157942dfeb4a7770ce4037503f9932904f1c1c2015f1c4"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.786995 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-409c-account-create-update-qhm94" event={"ID":"d301c5c5-c1cf-4037-aafb-612fbbe133f7","Type":"ContainerDied","Data":"41b5124ce4803db91c7ef248e9cc19b27260e38ea645226aaf859d728a931ab4"}
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.787022 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-409c-account-create-update-qhm94"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.787034 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41b5124ce4803db91c7ef248e9cc19b27260e38ea645226aaf859d728a931ab4"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.788441 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-s2smr" event={"ID":"03137eb0-6a57-4dc2-91aa-e7af80abbd22","Type":"ContainerDied","Data":"34deb1738ad27aacb1bf83653f4b32681d6750d1b187c50b02bf3fa5d66457ed"}
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.788476 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34deb1738ad27aacb1bf83653f4b32681d6750d1b187c50b02bf3fa5d66457ed"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.788540 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s2smr"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.789831 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8faa-account-create-update-b75kg" event={"ID":"7a302eb7-0f61-497d-96df-59aacc8d463f","Type":"ContainerDied","Data":"e011b5f7939fb9a5fbc3d780c7d87c51184687eb173cab950fa247dbeb26a92f"}
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.789857 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e011b5f7939fb9a5fbc3d780c7d87c51184687eb173cab950fa247dbeb26a92f"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.789878 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-b75kg"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.792535 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj" event={"ID":"7d40f23c-fece-48e9-a70f-7b1309600baa","Type":"ContainerDied","Data":"8bed3f68519633648f8f64bd3c401cab3bdd461acca5d5a74ae800a8d05d90ad"}
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.792595 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bed3f68519633648f8f64bd3c401cab3bdd461acca5d5a74ae800a8d05d90ad"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.792809 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-hcnbj"
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.861735 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03137eb0-6a57-4dc2-91aa-e7af80abbd22-operator-scripts\") pod \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.862460 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03137eb0-6a57-4dc2-91aa-e7af80abbd22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03137eb0-6a57-4dc2-91aa-e7af80abbd22" (UID: "03137eb0-6a57-4dc2-91aa-e7af80abbd22"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.862782 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qkwc\" (UniqueName: \"kubernetes.io/projected/03137eb0-6a57-4dc2-91aa-e7af80abbd22-kube-api-access-9qkwc\") pod \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\" (UID: \"03137eb0-6a57-4dc2-91aa-e7af80abbd22\") "
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.865945 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnzrq\" (UniqueName: \"kubernetes.io/projected/7a302eb7-0f61-497d-96df-59aacc8d463f-kube-api-access-tnzrq\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.866054 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03137eb0-6a57-4dc2-91aa-e7af80abbd22-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.866188 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfcpn\" (UniqueName: \"kubernetes.io/projected/7d40f23c-fece-48e9-a70f-7b1309600baa-kube-api-access-kfcpn\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.866211 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d40f23c-fece-48e9-a70f-7b1309600baa-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.866222 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qgcm\" (UniqueName: \"kubernetes.io/projected/d301c5c5-c1cf-4037-aafb-612fbbe133f7-kube-api-access-5qgcm\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.866233 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a302eb7-0f61-497d-96df-59aacc8d463f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.866243 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d301c5c5-c1cf-4037-aafb-612fbbe133f7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.866450 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03137eb0-6a57-4dc2-91aa-e7af80abbd22-kube-api-access-9qkwc" (OuterVolumeSpecName: "kube-api-access-9qkwc") pod "03137eb0-6a57-4dc2-91aa-e7af80abbd22" (UID: "03137eb0-6a57-4dc2-91aa-e7af80abbd22"). InnerVolumeSpecName "kube-api-access-9qkwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:47 crc kubenswrapper[4939]: I0318 16:01:47.968329 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qkwc\" (UniqueName: \"kubernetes.io/projected/03137eb0-6a57-4dc2-91aa-e7af80abbd22-kube-api-access-9qkwc\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:48 crc kubenswrapper[4939]: I0318 16:01:48.803388 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerStarted","Data":"2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be"}
Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.656857 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.713405 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-scripts\") pod \"85f29b4f-1e64-4443-90f2-02163b49bcd0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.713550 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpknf\" (UniqueName: \"kubernetes.io/projected/85f29b4f-1e64-4443-90f2-02163b49bcd0-kube-api-access-cpknf\") pod \"85f29b4f-1e64-4443-90f2-02163b49bcd0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.713603 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f29b4f-1e64-4443-90f2-02163b49bcd0-logs\") pod \"85f29b4f-1e64-4443-90f2-02163b49bcd0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.713662 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data\") pod \"85f29b4f-1e64-4443-90f2-02163b49bcd0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.713706 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data-custom\") pod \"85f29b4f-1e64-4443-90f2-02163b49bcd0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.713810 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-combined-ca-bundle\") pod \"85f29b4f-1e64-4443-90f2-02163b49bcd0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.713889 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85f29b4f-1e64-4443-90f2-02163b49bcd0-etc-machine-id\") pod \"85f29b4f-1e64-4443-90f2-02163b49bcd0\" (UID: \"85f29b4f-1e64-4443-90f2-02163b49bcd0\") " Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.714442 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85f29b4f-1e64-4443-90f2-02163b49bcd0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "85f29b4f-1e64-4443-90f2-02163b49bcd0" (UID: "85f29b4f-1e64-4443-90f2-02163b49bcd0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.714792 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f29b4f-1e64-4443-90f2-02163b49bcd0-logs" (OuterVolumeSpecName: "logs") pod "85f29b4f-1e64-4443-90f2-02163b49bcd0" (UID: "85f29b4f-1e64-4443-90f2-02163b49bcd0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.721996 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f29b4f-1e64-4443-90f2-02163b49bcd0-kube-api-access-cpknf" (OuterVolumeSpecName: "kube-api-access-cpknf") pod "85f29b4f-1e64-4443-90f2-02163b49bcd0" (UID: "85f29b4f-1e64-4443-90f2-02163b49bcd0"). InnerVolumeSpecName "kube-api-access-cpknf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.723654 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "85f29b4f-1e64-4443-90f2-02163b49bcd0" (UID: "85f29b4f-1e64-4443-90f2-02163b49bcd0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.746082 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-scripts" (OuterVolumeSpecName: "scripts") pod "85f29b4f-1e64-4443-90f2-02163b49bcd0" (UID: "85f29b4f-1e64-4443-90f2-02163b49bcd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.747805 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85f29b4f-1e64-4443-90f2-02163b49bcd0" (UID: "85f29b4f-1e64-4443-90f2-02163b49bcd0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.797634 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data" (OuterVolumeSpecName: "config-data") pod "85f29b4f-1e64-4443-90f2-02163b49bcd0" (UID: "85f29b4f-1e64-4443-90f2-02163b49bcd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815296 4939 generic.go:334] "Generic (PLEG): container finished" podID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerID="a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370" exitCode=137 Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815335 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85f29b4f-1e64-4443-90f2-02163b49bcd0","Type":"ContainerDied","Data":"a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370"} Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815359 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85f29b4f-1e64-4443-90f2-02163b49bcd0","Type":"ContainerDied","Data":"eeedb0814b51f0fcda714ea70ac9069f9037b4d9365c59e0d0e3ce823d4631c7"} Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815374 4939 scope.go:117] "RemoveContainer" containerID="a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815473 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815546 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815565 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpknf\" (UniqueName: \"kubernetes.io/projected/85f29b4f-1e64-4443-90f2-02163b49bcd0-kube-api-access-cpknf\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815577 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f29b4f-1e64-4443-90f2-02163b49bcd0-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.815656 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.816324 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.816355 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f29b4f-1e64-4443-90f2-02163b49bcd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.816364 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85f29b4f-1e64-4443-90f2-02163b49bcd0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.879948 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.885203 4939 scope.go:117] "RemoveContainer" containerID="3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.894553 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.919803 4939 scope.go:117] "RemoveContainer" containerID="a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.920352 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370\": container with ID starting with a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370 not found: ID does not exist" containerID="a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.920388 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370"} err="failed to get container status \"a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370\": rpc error: code = NotFound desc = could not find container \"a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370\": container with ID starting with 
a274b31d5f19249bfdf0f699cfc962322adedb73ba32e30b899707873bd41370 not found: ID does not exist" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.920414 4939 scope.go:117] "RemoveContainer" containerID="3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.920806 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372\": container with ID starting with 3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372 not found: ID does not exist" containerID="3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.920848 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372"} err="failed to get container status \"3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372\": rpc error: code = NotFound desc = could not find container \"3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372\": container with ID starting with 3e6354013ff04c33c29a4c992365ac27148c32982ff1ffac6660fb0635223372 not found: ID does not exist" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927359 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.927809 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927831 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.927846 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0eb0f89-9573-4e53-a22c-16b8cd80140a" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927852 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0eb0f89-9573-4e53-a22c-16b8cd80140a" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.927868 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d40f23c-fece-48e9-a70f-7b1309600baa" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927876 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d40f23c-fece-48e9-a70f-7b1309600baa" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.927889 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03137eb0-6a57-4dc2-91aa-e7af80abbd22" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927895 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="03137eb0-6a57-4dc2-91aa-e7af80abbd22" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.927909 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api-log" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927915 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api-log" Mar 18 16:01:49 crc kubenswrapper[4939]: 
E0318 16:01:49.927929 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a302eb7-0f61-497d-96df-59aacc8d463f" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927934 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a302eb7-0f61-497d-96df-59aacc8d463f" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.927945 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de888787-cfa6-46be-bb5b-a45e06eddb1b" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927951 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="de888787-cfa6-46be-bb5b-a45e06eddb1b" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: E0318 16:01:49.927967 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d301c5c5-c1cf-4037-aafb-612fbbe133f7" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.927972 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d301c5c5-c1cf-4037-aafb-612fbbe133f7" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928168 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d40f23c-fece-48e9-a70f-7b1309600baa" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928185 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928195 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0eb0f89-9573-4e53-a22c-16b8cd80140a" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928201 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a302eb7-0f61-497d-96df-59aacc8d463f" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928210 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="03137eb0-6a57-4dc2-91aa-e7af80abbd22" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928224 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d301c5c5-c1cf-4037-aafb-612fbbe133f7" containerName="mariadb-account-create-update" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928232 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" containerName="cinder-api-log" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.928239 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="de888787-cfa6-46be-bb5b-a45e06eddb1b" containerName="mariadb-database-create" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.929334 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.931367 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.933085 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.933266 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 16:01:49 crc kubenswrapper[4939]: I0318 16:01:49.949424 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021482 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89khq\" (UniqueName: \"kubernetes.io/projected/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-kube-api-access-89khq\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021549 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021587 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021633 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021654 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021689 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data-custom\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021718 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021735 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.021764 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-logs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.123412 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.123787 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.123838 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data-custom\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.123889 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.123915 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.123955 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-logs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.123980 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89khq\" (UniqueName: \"kubernetes.io/projected/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-kube-api-access-89khq\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.124016 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.124062 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.128154 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.130742 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-logs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.139139 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.147640 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.148632 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.150059 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data-custom\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.150601 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.151566 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f29b4f-1e64-4443-90f2-02163b49bcd0" path="/var/lib/kubelet/pods/85f29b4f-1e64-4443-90f2-02163b49bcd0/volumes" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.155039 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89khq\" (UniqueName: \"kubernetes.io/projected/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-kube-api-access-89khq\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.157006 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts\") pod \"cinder-api-0\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " pod="openstack/cinder-api-0" 
Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.253113 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.731533 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.747822 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.828329 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerStarted","Data":"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f"} Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.829372 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.831440 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2","Type":"ContainerStarted","Data":"198a66caefc306f6c725fa20dadae0d7c24284e28fa124b4656b6d622bddba95"} Mar 18 16:01:50 crc kubenswrapper[4939]: I0318 16:01:50.858119 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.374813943 podStartE2EDuration="12.858102333s" podCreationTimestamp="2026-03-18 16:01:38 +0000 UTC" firstStartedPulling="2026-03-18 16:01:44.761744644 +0000 UTC m=+1469.360932265" lastFinishedPulling="2026-03-18 16:01:50.245033034 +0000 UTC m=+1474.844220655" observedRunningTime="2026-03-18 16:01:50.849847058 +0000 UTC m=+1475.449034709" watchObservedRunningTime="2026-03-18 16:01:50.858102333 +0000 UTC m=+1475.457289944" Mar 18 16:01:51 crc kubenswrapper[4939]: I0318 16:01:51.774284 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:01:51 crc kubenswrapper[4939]: I0318 16:01:51.884851 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-central-agent" containerID="cri-o://39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7" gracePeriod=30 Mar 18 16:01:51 crc kubenswrapper[4939]: I0318 16:01:51.885043 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2","Type":"ContainerStarted","Data":"aba281f036d2b5e4f340ee24dde5ab36bf1f763f7ca8377787fe8cdc8c7f40da"} Mar 18 16:01:51 crc kubenswrapper[4939]: I0318 16:01:51.885857 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="proxy-httpd" containerID="cri-o://0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f" gracePeriod=30 Mar 18 16:01:51 crc kubenswrapper[4939]: I0318 16:01:51.885931 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-notification-agent" containerID="cri-o://63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025" gracePeriod=30 Mar 18 16:01:51 crc kubenswrapper[4939]: I0318 16:01:51.885950 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="sg-core" containerID="cri-o://2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be" gracePeriod=30 Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.637323 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-98ww2"] Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.642157 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.644282 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.644579 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.644615 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f6bpf" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.648320 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-98ww2"] Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.680933 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-config-data\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.681239 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-scripts\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.681319 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xr89\" (UniqueName: \"kubernetes.io/projected/23191f5f-fe02-4b74-ab9c-95b03d308980-kube-api-access-6xr89\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.681431 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.741395 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.782319 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-config-data\") pod \"bcc3d87c-de81-4462-a286-5142f5434632\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.782391 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-log-httpd\") pod \"bcc3d87c-de81-4462-a286-5142f5434632\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.782480 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-combined-ca-bundle\") pod \"bcc3d87c-de81-4462-a286-5142f5434632\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.782626 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-scripts\") pod \"bcc3d87c-de81-4462-a286-5142f5434632\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.782755 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-run-httpd\") pod \"bcc3d87c-de81-4462-a286-5142f5434632\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.782778 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbmzc\" (UniqueName: \"kubernetes.io/projected/bcc3d87c-de81-4462-a286-5142f5434632-kube-api-access-tbmzc\") pod \"bcc3d87c-de81-4462-a286-5142f5434632\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.782825 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-sg-core-conf-yaml\") pod \"bcc3d87c-de81-4462-a286-5142f5434632\" (UID: \"bcc3d87c-de81-4462-a286-5142f5434632\") " Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.783027 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-config-data\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.783102 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-scripts\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.783121 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xr89\" (UniqueName: \"kubernetes.io/projected/23191f5f-fe02-4b74-ab9c-95b03d308980-kube-api-access-6xr89\") 
pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.783151 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.783922 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bcc3d87c-de81-4462-a286-5142f5434632" (UID: "bcc3d87c-de81-4462-a286-5142f5434632"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.784176 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bcc3d87c-de81-4462-a286-5142f5434632" (UID: "bcc3d87c-de81-4462-a286-5142f5434632"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.788558 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-scripts" (OuterVolumeSpecName: "scripts") pod "bcc3d87c-de81-4462-a286-5142f5434632" (UID: "bcc3d87c-de81-4462-a286-5142f5434632"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.789786 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc3d87c-de81-4462-a286-5142f5434632-kube-api-access-tbmzc" (OuterVolumeSpecName: "kube-api-access-tbmzc") pod "bcc3d87c-de81-4462-a286-5142f5434632" (UID: "bcc3d87c-de81-4462-a286-5142f5434632"). InnerVolumeSpecName "kube-api-access-tbmzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.790230 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.791471 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-scripts\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.799530 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-config-data\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.804388 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xr89\" (UniqueName: \"kubernetes.io/projected/23191f5f-fe02-4b74-ab9c-95b03d308980-kube-api-access-6xr89\") pod \"nova-cell0-conductor-db-sync-98ww2\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") " pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.829673 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bcc3d87c-de81-4462-a286-5142f5434632" (UID: "bcc3d87c-de81-4462-a286-5142f5434632"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.880374 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcc3d87c-de81-4462-a286-5142f5434632" (UID: "bcc3d87c-de81-4462-a286-5142f5434632"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.888697 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.888819 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbmzc\" (UniqueName: \"kubernetes.io/projected/bcc3d87c-de81-4462-a286-5142f5434632-kube-api-access-tbmzc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.888878 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.888930 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc3d87c-de81-4462-a286-5142f5434632-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.888999 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.889539 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898390 4939 generic.go:334] "Generic (PLEG): container finished" podID="bcc3d87c-de81-4462-a286-5142f5434632" containerID="0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f" exitCode=0 Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898428 4939 generic.go:334] "Generic (PLEG): container finished" podID="bcc3d87c-de81-4462-a286-5142f5434632" containerID="2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be" exitCode=2 Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898435 4939 generic.go:334] "Generic (PLEG): container finished" podID="bcc3d87c-de81-4462-a286-5142f5434632" containerID="63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025" exitCode=0 Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898442 4939 generic.go:334] "Generic (PLEG): container finished" podID="bcc3d87c-de81-4462-a286-5142f5434632" containerID="39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7" exitCode=0 Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898477 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerDied","Data":"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f"} Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898521 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerDied","Data":"2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be"} Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898533 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerDied","Data":"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025"} Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898542 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerDied","Data":"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7"} Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898551 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc3d87c-de81-4462-a286-5142f5434632","Type":"ContainerDied","Data":"4d05ad71751b6ecc6e9fe02073e66d453d558acbbec6ff5132aff4f9d997aef5"} Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898565 4939 scope.go:117] "RemoveContainer" containerID="0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.898479 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.905803 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2","Type":"ContainerStarted","Data":"ce5cff00a735d1a1ef2cc4c115a9ac8265f9778ce6a2da6ac270a33f9cc7daf5"} Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.906079 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.909266 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-config-data" (OuterVolumeSpecName: "config-data") pod "bcc3d87c-de81-4462-a286-5142f5434632" (UID: "bcc3d87c-de81-4462-a286-5142f5434632"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.930670 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.93065006 podStartE2EDuration="3.93065006s" podCreationTimestamp="2026-03-18 16:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:52.926158705 +0000 UTC m=+1477.525346336" watchObservedRunningTime="2026-03-18 16:01:52.93065006 +0000 UTC m=+1477.529837681" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.960088 4939 scope.go:117] "RemoveContainer" containerID="2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.986510 4939 scope.go:117] "RemoveContainer" containerID="63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025" Mar 18 16:01:52 crc kubenswrapper[4939]: I0318 16:01:52.992356 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc3d87c-de81-4462-a286-5142f5434632-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.007599 4939 scope.go:117] "RemoveContainer" containerID="39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.026643 4939 scope.go:117] "RemoveContainer" containerID="0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f" Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.027127 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f\": container with ID starting with 0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f not found: ID does not exist" containerID="0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.027182 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f"} err="failed to get container status \"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f\": rpc error: code = NotFound desc = could not find container \"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f\": container with ID starting with 0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f not found: ID does not exist" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.027207 4939 scope.go:117] "RemoveContainer" containerID="2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be" Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.027734 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be\": container with ID starting with 2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be not found: ID does not exist" containerID="2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.027765 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be"} err="failed to get container status 
\"2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be\": rpc error: code = NotFound desc = could not find container \"2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be\": container with ID starting with 2b4692ba568248c952b4f43581a026e023fb6a0d295dc9366b7b20177402b4be not found: ID does not exist" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.027788 4939 scope.go:117] "RemoveContainer" containerID="63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025" Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.028064 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025\": container with ID starting with 63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025 not found: ID does not exist" containerID="63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.028110 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025"} err="failed to get container status \"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025\": rpc error: code = NotFound desc = could not find container \"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025\": container with ID starting with 63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025 not found: ID does not exist" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.028139 4939 scope.go:117] "RemoveContainer" containerID="39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7" Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.028474 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7\": container with ID starting with 39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7 not found: ID does not exist" containerID="39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.028519 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7"} err="failed to get container status \"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7\": rpc error: code = NotFound desc = could not find container \"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7\": container with ID starting with 39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7 not found: ID does not exist" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.028538 4939 scope.go:117] "RemoveContainer" containerID="0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.028838 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f"} err="failed to get container status \"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f\": rpc error: code = NotFound desc = could not find container \"0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f\": container with ID starting with 0bcfc4dd93361e5fb88eaa202dbaec4345623f76828956a8550cbe8358d45c8f not found: 
ID does not exist" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.031760 4939 scope.go:117] "RemoveContainer"
containerID="63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.031994 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025"} err="failed to get container status \"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025\": rpc error: code = NotFound desc = could not find container \"63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025\": container with ID starting with 63b775372d7f7787d550b7e34f691273afb91873416896d1155424f23055f025 not found: ID does not exist" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.032099 4939 scope.go:117] "RemoveContainer" containerID="39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.032424 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7"} err="failed to get container status \"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7\": rpc error: code = NotFound desc = could not find container \"39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7\": container with ID starting with 39bc9c0c3c801b438210e1e5b8b0ec5af973b7cd5ad7f801240abb97ed3a02c7 not found: ID does not exist" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.045104 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-98ww2" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.253410 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.266838 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.272886 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.273319 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-central-agent" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.273408 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-central-agent" Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.273463 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-notification-agent" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.273526 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-notification-agent" Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.273608 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="sg-core" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.273661 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="sg-core" Mar 18 16:01:53 crc kubenswrapper[4939]: E0318 16:01:53.273714 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="proxy-httpd" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 
16:01:53.273761 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="proxy-httpd" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.273977 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="proxy-httpd" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.274052 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="sg-core" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.274118 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-central-agent" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.274206 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc3d87c-de81-4462-a286-5142f5434632" containerName="ceilometer-notification-agent" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.278296 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.282318 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.282584 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.287586 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.427530 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-scripts\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.427618 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.427667 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-run-httpd\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.427720 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.427758 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-config-data\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.427782 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chnzp\" (UniqueName: \"kubernetes.io/projected/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-kube-api-access-chnzp\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.427799 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-log-httpd\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.529167 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-run-httpd\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.529324 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.529415 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-config-data\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.529523 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chnzp\" (UniqueName: \"kubernetes.io/projected/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-kube-api-access-chnzp\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.529612 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-log-httpd\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.529701 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-scripts\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.529790 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.532104 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-98ww2"] Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.532425 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-log-httpd\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.532937 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-run-httpd\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.536274 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-scripts\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.539774 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.540973 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.541307 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-config-data\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.557271 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chnzp\" (UniqueName: \"kubernetes.io/projected/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-kube-api-access-chnzp\") pod \"ceilometer-0\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") " pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.607230 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.687578 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.687626 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.687663 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.688498 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fff46d36c1bddb51dfc726a35d19e477c47083cb08f0289fbf97c6d0f0baa61"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.688566 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://4fff46d36c1bddb51dfc726a35d19e477c47083cb08f0289fbf97c6d0f0baa61" gracePeriod=600 Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.920053 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-98ww2" event={"ID":"23191f5f-fe02-4b74-ab9c-95b03d308980","Type":"ContainerStarted","Data":"d1c170413f861b0654cf4b49f0809c974118e1b0dda7282f2e6c6c35961ac542"} Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.924909 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="4fff46d36c1bddb51dfc726a35d19e477c47083cb08f0289fbf97c6d0f0baa61" exitCode=0 Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.924975 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"4fff46d36c1bddb51dfc726a35d19e477c47083cb08f0289fbf97c6d0f0baa61"} Mar 18 16:01:53 crc kubenswrapper[4939]: I0318 16:01:53.925036 4939 scope.go:117] "RemoveContainer" containerID="9b2c0c563781371d0ba976cd9bb94c7c42e1c32c9519396b89c4802c0c2c6efa" Mar 18 16:01:54 crc kubenswrapper[4939]: I0318 16:01:54.104751 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:54 crc kubenswrapper[4939]: W0318 16:01:54.113183 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c9933e1_34e7_493d_8e6c_7c2f317b2bea.slice/crio-6e563db61c121029ef8f19546e91bfe526af8604349a5faa75cf5713478252d0 WatchSource:0}: Error finding container 6e563db61c121029ef8f19546e91bfe526af8604349a5faa75cf5713478252d0: Status 404 returned error can't find the container with id 
6e563db61c121029ef8f19546e91bfe526af8604349a5faa75cf5713478252d0 Mar 18 16:01:54 crc kubenswrapper[4939]: I0318 16:01:54.146527 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc3d87c-de81-4462-a286-5142f5434632" path="/var/lib/kubelet/pods/bcc3d87c-de81-4462-a286-5142f5434632/volumes" Mar 18 16:01:54 crc kubenswrapper[4939]: I0318 16:01:54.938654 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd"} Mar 18 16:01:54 crc kubenswrapper[4939]: I0318 16:01:54.940313 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerStarted","Data":"bf6b73a54624d929f8384d8d28586b7e110ae2e005d3b0e33aec226c2e704a1c"} Mar 18 16:01:54 crc kubenswrapper[4939]: I0318 16:01:54.940336 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerStarted","Data":"6e563db61c121029ef8f19546e91bfe526af8604349a5faa75cf5713478252d0"} Mar 18 16:01:55 crc kubenswrapper[4939]: I0318 16:01:55.656300 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:55 crc kubenswrapper[4939]: I0318 16:01:55.950252 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerStarted","Data":"f0b8449d93a90b893e475b1ef20eb751d717cae07a9a1c9aaa7076486a4a8094"} Mar 18 16:01:56 crc kubenswrapper[4939]: I0318 16:01:56.966596 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerStarted","Data":"0e2146cc8977d8698c353eb075623269c19f9a2eca99422f79fc7689b45b5d51"} Mar 18 16:01:58 crc kubenswrapper[4939]: I0318 16:01:58.603362 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-576956754b-kspq2" Mar 18 16:01:58 crc kubenswrapper[4939]: I0318 16:01:58.687681 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-576956754b-kspq2" Mar 18 16:01:58 crc kubenswrapper[4939]: I0318 16:01:58.743451 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d598cf76-pjrv7"] Mar 18 16:01:58 crc kubenswrapper[4939]: I0318 16:01:58.743726 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d598cf76-pjrv7" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-log" containerID="cri-o://158535dbdc0da716361bdda232739bce8feebc1b0c21fc44eafb03d99278d41a" gracePeriod=30 Mar 18 16:01:58 crc kubenswrapper[4939]: I0318 16:01:58.744090 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d598cf76-pjrv7" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-api" containerID="cri-o://db9387db45857f14d1bc75444606939dc75c93442d1661ce4ad987885fad4deb" gracePeriod=30 Mar 18 16:01:58 crc kubenswrapper[4939]: I0318 16:01:58.995691 4939 generic.go:334] "Generic (PLEG): container finished" podID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerID="158535dbdc0da716361bdda232739bce8feebc1b0c21fc44eafb03d99278d41a" exitCode=143 Mar 18 16:01:58 crc kubenswrapper[4939]: I0318 16:01:58.995768 
4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d598cf76-pjrv7" event={"ID":"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd","Type":"ContainerDied","Data":"158535dbdc0da716361bdda232739bce8feebc1b0c21fc44eafb03d99278d41a"} Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.152072 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564162-cz5hj"] Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.153324 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-cz5hj"] Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.153395 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-cz5hj" Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.179312 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.179380 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.179496 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.281764 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smc5z\" (UniqueName: \"kubernetes.io/projected/f4a582e2-2e9d-4499-8c53-04b7d5de9704-kube-api-access-smc5z\") pod \"auto-csr-approver-29564162-cz5hj\" (UID: \"f4a582e2-2e9d-4499-8c53-04b7d5de9704\") " pod="openshift-infra/auto-csr-approver-29564162-cz5hj" Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.383724 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smc5z\" (UniqueName: \"kubernetes.io/projected/f4a582e2-2e9d-4499-8c53-04b7d5de9704-kube-api-access-smc5z\") pod \"auto-csr-approver-29564162-cz5hj\" (UID: \"f4a582e2-2e9d-4499-8c53-04b7d5de9704\") " pod="openshift-infra/auto-csr-approver-29564162-cz5hj" Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.424364 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smc5z\" (UniqueName: \"kubernetes.io/projected/f4a582e2-2e9d-4499-8c53-04b7d5de9704-kube-api-access-smc5z\") pod \"auto-csr-approver-29564162-cz5hj\" (UID: \"f4a582e2-2e9d-4499-8c53-04b7d5de9704\") " pod="openshift-infra/auto-csr-approver-29564162-cz5hj" Mar 18 16:02:00 crc kubenswrapper[4939]: I0318 16:02:00.498006 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-cz5hj" Mar 18 16:02:02 crc kubenswrapper[4939]: I0318 16:02:02.551286 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 16:02:03 crc kubenswrapper[4939]: I0318 16:02:03.058418 4939 generic.go:334] "Generic (PLEG): container finished" podID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerID="db9387db45857f14d1bc75444606939dc75c93442d1661ce4ad987885fad4deb" exitCode=0 Mar 18 16:02:03 crc kubenswrapper[4939]: I0318 16:02:03.058469 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d598cf76-pjrv7" event={"ID":"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd","Type":"ContainerDied","Data":"db9387db45857f14d1bc75444606939dc75c93442d1661ce4ad987885fad4deb"} Mar 18 16:02:05 crc kubenswrapper[4939]: I0318 16:02:05.960659 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.084178 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d598cf76-pjrv7" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.084177 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d598cf76-pjrv7" event={"ID":"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd","Type":"ContainerDied","Data":"9d864febb00fc16986b1d08e31b5411fba9a9590343d630fe7c8a2c9fe30c1f2"} Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.084277 4939 scope.go:117] "RemoveContainer" containerID="db9387db45857f14d1bc75444606939dc75c93442d1661ce4ad987885fad4deb" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.085979 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-98ww2" event={"ID":"23191f5f-fe02-4b74-ab9c-95b03d308980","Type":"ContainerStarted","Data":"ffa557caefa00ed451e073426ac3014e675a404548733cc936435b5abb184d29"} Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.094973 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-central-agent" containerID="cri-o://bf6b73a54624d929f8384d8d28586b7e110ae2e005d3b0e33aec226c2e704a1c" gracePeriod=30 Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.095039 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="sg-core" containerID="cri-o://0e2146cc8977d8698c353eb075623269c19f9a2eca99422f79fc7689b45b5d51" gracePeriod=30 Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.095058 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="proxy-httpd" containerID="cri-o://be626b66aa177ad50c1500b513cdf6b36327483af4d9965e5ceff333d24a7a17" gracePeriod=30 Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.094953 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerStarted","Data":"be626b66aa177ad50c1500b513cdf6b36327483af4d9965e5ceff333d24a7a17"} Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.095168 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.095068 
4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-notification-agent" containerID="cri-o://f0b8449d93a90b893e475b1ef20eb751d717cae07a9a1c9aaa7076486a4a8094" gracePeriod=30 Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.103554 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-98ww2" podStartSLOduration=1.8441777 podStartE2EDuration="14.103536127s" podCreationTimestamp="2026-03-18 16:01:52 +0000 UTC" firstStartedPulling="2026-03-18 16:01:53.542125837 +0000 UTC m=+1478.141313458" lastFinishedPulling="2026-03-18 16:02:05.801484264 +0000 UTC m=+1490.400671885" observedRunningTime="2026-03-18 16:02:06.102345094 +0000 UTC m=+1490.701532715" watchObservedRunningTime="2026-03-18 16:02:06.103536127 +0000 UTC m=+1490.702723738" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.109107 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2rmw\" (UniqueName: \"kubernetes.io/projected/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-kube-api-access-p2rmw\") pod \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.109170 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-config-data\") pod \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.109221 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-logs\") pod \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.109296 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-scripts\") pod \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.109315 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-internal-tls-certs\") pod \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.109417 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-public-tls-certs\") pod \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.109459 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-combined-ca-bundle\") pod \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\" (UID: \"f43ffb4c-72ee-45b3-958a-5e6a0f079fcd\") " Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.110227 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-logs" (OuterVolumeSpecName: "logs") pod "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" (UID: "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.115694 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-kube-api-access-p2rmw" (OuterVolumeSpecName: "kube-api-access-p2rmw") pod "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" (UID: "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd"). InnerVolumeSpecName "kube-api-access-p2rmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.117643 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-scripts" (OuterVolumeSpecName: "scripts") pod "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" (UID: "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.117688 4939 scope.go:117] "RemoveContainer" containerID="158535dbdc0da716361bdda232739bce8feebc1b0c21fc44eafb03d99278d41a" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.155548 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4731670559999999 podStartE2EDuration="13.155525718s" podCreationTimestamp="2026-03-18 16:01:53 +0000 UTC" firstStartedPulling="2026-03-18 16:01:54.1165352 +0000 UTC m=+1478.715722821" lastFinishedPulling="2026-03-18 16:02:05.798893862 +0000 UTC m=+1490.398081483" observedRunningTime="2026-03-18 16:02:06.121976892 +0000 UTC m=+1490.721164513" watchObservedRunningTime="2026-03-18 16:02:06.155525718 +0000 UTC m=+1490.754713329" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.173670 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-cz5hj"] Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.179693 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-config-data" (OuterVolumeSpecName: "config-data") pod "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" (UID: "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.184599 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" (UID: "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.217855 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.217899 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2rmw\" (UniqueName: \"kubernetes.io/projected/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-kube-api-access-p2rmw\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.217952 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.217997 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.218012 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.250811 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" (UID: "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.261179 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" (UID: "f43ffb4c-72ee-45b3-958a-5e6a0f079fcd"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.320007 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.320253 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.478414 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d598cf76-pjrv7"] Mar 18 16:02:06 crc kubenswrapper[4939]: I0318 16:02:06.486602 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5d598cf76-pjrv7"] Mar 18 16:02:07 crc kubenswrapper[4939]: I0318 16:02:07.110026 4939 generic.go:334] "Generic (PLEG): container finished" podID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerID="0e2146cc8977d8698c353eb075623269c19f9a2eca99422f79fc7689b45b5d51" exitCode=2 Mar 18 16:02:07 crc kubenswrapper[4939]: I0318 16:02:07.113311 4939 generic.go:334] "Generic (PLEG): container finished" podID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerID="f0b8449d93a90b893e475b1ef20eb751d717cae07a9a1c9aaa7076486a4a8094" exitCode=0 Mar 18 16:02:07 crc kubenswrapper[4939]: I0318 16:02:07.113482 4939 generic.go:334] "Generic (PLEG): container finished" podID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerID="bf6b73a54624d929f8384d8d28586b7e110ae2e005d3b0e33aec226c2e704a1c" exitCode=0 Mar 18 16:02:07 crc kubenswrapper[4939]: I0318 16:02:07.110546 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerDied","Data":"0e2146cc8977d8698c353eb075623269c19f9a2eca99422f79fc7689b45b5d51"} Mar 18 16:02:07 crc kubenswrapper[4939]: I0318 16:02:07.113872 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerDied","Data":"f0b8449d93a90b893e475b1ef20eb751d717cae07a9a1c9aaa7076486a4a8094"} Mar 18 16:02:07 crc kubenswrapper[4939]: I0318 16:02:07.114018 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerDied","Data":"bf6b73a54624d929f8384d8d28586b7e110ae2e005d3b0e33aec226c2e704a1c"} Mar 18 16:02:07 crc kubenswrapper[4939]: I0318 16:02:07.116040 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-cz5hj" event={"ID":"f4a582e2-2e9d-4499-8c53-04b7d5de9704","Type":"ContainerStarted","Data":"d7af1a711b683f38b4b8556877b3576853f1f753d3443e477e9308f5c48d0aaf"} Mar 18 16:02:08 crc kubenswrapper[4939]: I0318 16:02:08.144726 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" path="/var/lib/kubelet/pods/f43ffb4c-72ee-45b3-958a-5e6a0f079fcd/volumes" Mar 18 16:02:09 crc kubenswrapper[4939]: I0318 16:02:09.135436 4939 generic.go:334] "Generic (PLEG): container finished" podID="f4a582e2-2e9d-4499-8c53-04b7d5de9704" containerID="807f71a2590b251e7c5b9dfc7934283a4894cf19375dc95bfe6ab5ab6f25a1f4" exitCode=0 Mar 18 16:02:09 crc kubenswrapper[4939]: I0318 16:02:09.135496 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564162-cz5hj" event={"ID":"f4a582e2-2e9d-4499-8c53-04b7d5de9704","Type":"ContainerDied","Data":"807f71a2590b251e7c5b9dfc7934283a4894cf19375dc95bfe6ab5ab6f25a1f4"} Mar 18 16:02:10 crc kubenswrapper[4939]: I0318 16:02:10.508738 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-cz5hj" Mar 18 16:02:10 crc kubenswrapper[4939]: I0318 16:02:10.598340 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smc5z\" (UniqueName: \"kubernetes.io/projected/f4a582e2-2e9d-4499-8c53-04b7d5de9704-kube-api-access-smc5z\") pod \"f4a582e2-2e9d-4499-8c53-04b7d5de9704\" (UID: \"f4a582e2-2e9d-4499-8c53-04b7d5de9704\") " Mar 18 16:02:10 crc kubenswrapper[4939]: I0318 16:02:10.605244 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a582e2-2e9d-4499-8c53-04b7d5de9704-kube-api-access-smc5z" (OuterVolumeSpecName: "kube-api-access-smc5z") pod "f4a582e2-2e9d-4499-8c53-04b7d5de9704" (UID: "f4a582e2-2e9d-4499-8c53-04b7d5de9704"). InnerVolumeSpecName "kube-api-access-smc5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:10 crc kubenswrapper[4939]: I0318 16:02:10.700554 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smc5z\" (UniqueName: \"kubernetes.io/projected/f4a582e2-2e9d-4499-8c53-04b7d5de9704-kube-api-access-smc5z\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:11 crc kubenswrapper[4939]: I0318 16:02:11.155084 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-cz5hj" event={"ID":"f4a582e2-2e9d-4499-8c53-04b7d5de9704","Type":"ContainerDied","Data":"d7af1a711b683f38b4b8556877b3576853f1f753d3443e477e9308f5c48d0aaf"} Mar 18 16:02:11 crc kubenswrapper[4939]: I0318 16:02:11.155404 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7af1a711b683f38b4b8556877b3576853f1f753d3443e477e9308f5c48d0aaf" Mar 18 16:02:11 crc kubenswrapper[4939]: I0318 16:02:11.155152 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-cz5hj"
Mar 18 16:02:11 crc kubenswrapper[4939]: I0318 16:02:11.577253 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8zqqn"]
Mar 18 16:02:11 crc kubenswrapper[4939]: I0318 16:02:11.586112 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-8zqqn"]
Mar 18 16:02:12 crc kubenswrapper[4939]: I0318 16:02:12.144632 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="489dd963-1de2-4977-9d76-d9635ee4dc24" path="/var/lib/kubelet/pods/489dd963-1de2-4977-9d76-d9635ee4dc24/volumes"
Mar 18 16:02:15 crc kubenswrapper[4939]: I0318 16:02:15.593692 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 16:02:15 crc kubenswrapper[4939]: I0318 16:02:15.594732 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-log" containerID="cri-o://3a58bac812ff61d927ca864d82d746ab03ccf821635ee32714d7731a006b6367" gracePeriod=30
Mar 18 16:02:15 crc kubenswrapper[4939]: I0318 16:02:15.595247 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-httpd" containerID="cri-o://6a1240c4afc05c1dc8d7f58258c3a1656a859c201e52d3ef0f5ba0aaf40645bd" gracePeriod=30
Mar 18 16:02:16 crc kubenswrapper[4939]: I0318 16:02:16.207554 4939 generic.go:334] "Generic (PLEG): container finished" podID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerID="3a58bac812ff61d927ca864d82d746ab03ccf821635ee32714d7731a006b6367" exitCode=143
Mar 18 16:02:16 crc kubenswrapper[4939]: I0318 16:02:16.207642 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b37a35fa-ce56-4934-88eb-fd24cc5aec4f","Type":"ContainerDied","Data":"3a58bac812ff61d927ca864d82d746ab03ccf821635ee32714d7731a006b6367"}
Mar 18 16:02:16 crc kubenswrapper[4939]: I0318 16:02:16.683934 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 16:02:16 crc kubenswrapper[4939]: I0318 16:02:16.684162 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-log" containerID="cri-o://0c7fe9c4155b4831bd03228350c672b9c68571f96276cee2f3bba62cba25d888" gracePeriod=30
Mar 18 16:02:16 crc kubenswrapper[4939]: I0318 16:02:16.684225 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-httpd" containerID="cri-o://63e00c85d4fcd04f40420b32206738e70a2b26b69d681fd823a84fb57bf3b4bd" gracePeriod=30
Mar 18 16:02:17 crc kubenswrapper[4939]: I0318 16:02:17.218030 4939 generic.go:334] "Generic (PLEG): container finished" podID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerID="0c7fe9c4155b4831bd03228350c672b9c68571f96276cee2f3bba62cba25d888" exitCode=143
Mar 18 16:02:17 crc kubenswrapper[4939]: I0318 16:02:17.218111 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5bb7097f-d7d3-499e-b412-55f5946b5be4","Type":"ContainerDied","Data":"0c7fe9c4155b4831bd03228350c672b9c68571f96276cee2f3bba62cba25d888"}
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.236896 4939 generic.go:334] "Generic (PLEG): container finished" podID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerID="6a1240c4afc05c1dc8d7f58258c3a1656a859c201e52d3ef0f5ba0aaf40645bd" exitCode=0
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.236992 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b37a35fa-ce56-4934-88eb-fd24cc5aec4f","Type":"ContainerDied","Data":"6a1240c4afc05c1dc8d7f58258c3a1656a859c201e52d3ef0f5ba0aaf40645bd"}
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.237191 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b37a35fa-ce56-4934-88eb-fd24cc5aec4f","Type":"ContainerDied","Data":"4bf53d636b6076656173675261a0a77c047c311b5a77976f2375168d137764c2"}
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.237205 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf53d636b6076656173675261a0a77c047c311b5a77976f2375168d137764c2"
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.261342 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364165 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-public-tls-certs\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364263 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wsmv\" (UniqueName: \"kubernetes.io/projected/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-kube-api-access-7wsmv\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364352 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-httpd-run\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364374 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-config-data\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364396 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364469 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-scripts\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364495 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-logs\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.364563 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-combined-ca-bundle\") pod \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\" (UID: \"b37a35fa-ce56-4934-88eb-fd24cc5aec4f\") "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.365459 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.365573 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-logs" (OuterVolumeSpecName: "logs") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.372675 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-kube-api-access-7wsmv" (OuterVolumeSpecName: "kube-api-access-7wsmv") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "kube-api-access-7wsmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.374368 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-scripts" (OuterVolumeSpecName: "scripts") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.376074 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.427326 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.434343 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.436350 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-config-data" (OuterVolumeSpecName: "config-data") pod "b37a35fa-ce56-4934-88eb-fd24cc5aec4f" (UID: "b37a35fa-ce56-4934-88eb-fd24cc5aec4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466427 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466490 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-logs\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466511 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466523 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466539 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wsmv\" (UniqueName: \"kubernetes.io/projected/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-kube-api-access-7wsmv\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466549 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466557 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37a35fa-ce56-4934-88eb-fd24cc5aec4f-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.466581 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.484413 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.568298 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.829962 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:48588->10.217.0.155:9292: read: connection reset by peer"
Mar 18 16:02:19 crc kubenswrapper[4939]: I0318 16:02:19.830172 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:48592->10.217.0.155:9292: read: connection reset by peer"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.248045 4939 generic.go:334] "Generic (PLEG): container finished" podID="23191f5f-fe02-4b74-ab9c-95b03d308980" containerID="ffa557caefa00ed451e073426ac3014e675a404548733cc936435b5abb184d29" exitCode=0
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.248188 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-98ww2" event={"ID":"23191f5f-fe02-4b74-ab9c-95b03d308980","Type":"ContainerDied","Data":"ffa557caefa00ed451e073426ac3014e675a404548733cc936435b5abb184d29"}
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.251955 4939 generic.go:334] "Generic (PLEG): container finished" podID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerID="63e00c85d4fcd04f40420b32206738e70a2b26b69d681fd823a84fb57bf3b4bd" exitCode=0
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.252005 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5bb7097f-d7d3-499e-b412-55f5946b5be4","Type":"ContainerDied","Data":"63e00c85d4fcd04f40420b32206738e70a2b26b69d681fd823a84fb57bf3b4bd"}
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.252037 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.338795 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.351488 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.359126 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.376927 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 16:02:20 crc kubenswrapper[4939]: E0318 16:02:20.377422 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377447 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: E0318 16:02:20.377478 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-httpd"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377491 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-httpd"
Mar 18 16:02:20 crc kubenswrapper[4939]: E0318 16:02:20.377600 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-httpd"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377612 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-httpd"
Mar 18 16:02:20 crc kubenswrapper[4939]: E0318 16:02:20.377622 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a582e2-2e9d-4499-8c53-04b7d5de9704" containerName="oc"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377628 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a582e2-2e9d-4499-8c53-04b7d5de9704" containerName="oc"
Mar 18 16:02:20 crc kubenswrapper[4939]: E0318 16:02:20.377643 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377651 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: E0318 16:02:20.377672 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377680 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: E0318 16:02:20.377689 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-api"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377696 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-api"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377919 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-api"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377939 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-httpd"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377949 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-httpd"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377976 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43ffb4c-72ee-45b3-958a-5e6a0f079fcd" containerName="placement-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.377990 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" containerName="glance-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.378000 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a582e2-2e9d-4499-8c53-04b7d5de9704" containerName="oc"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.378019 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" containerName="glance-log"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.379222 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382335 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382455 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-scripts\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382586 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-logs\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382681 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-combined-ca-bundle\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382724 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzgj\" (UniqueName: \"kubernetes.io/projected/5bb7097f-d7d3-499e-b412-55f5946b5be4-kube-api-access-cmzgj\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382806 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-config-data\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382904 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-internal-tls-certs\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.382978 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-httpd-run\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.383010 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5bb7097f-d7d3-499e-b412-55f5946b5be4\" (UID: \"5bb7097f-d7d3-499e-b412-55f5946b5be4\") "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.383266 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-logs" (OuterVolumeSpecName: "logs") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.383743 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.386242 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.405295 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb7097f-d7d3-499e-b412-55f5946b5be4-kube-api-access-cmzgj" (OuterVolumeSpecName: "kube-api-access-cmzgj") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "kube-api-access-cmzgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.405575 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-scripts" (OuterVolumeSpecName: "scripts") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.411144 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.419626 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.453118 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.472587 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.477964 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-config-data" (OuterVolumeSpecName: "config-data") pod "5bb7097f-d7d3-499e-b412-55f5946b5be4" (UID: "5bb7097f-d7d3-499e-b412-55f5946b5be4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.484954 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-config-data\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485027 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485058 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485095 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdp57\" (UniqueName: \"kubernetes.io/projected/50baf265-a6d8-445d-aed6-853781644d9e-kube-api-access-rdp57\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485154 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485185 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-scripts\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485201 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485234 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-logs\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485282 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485293 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmzgj\" (UniqueName: \"kubernetes.io/projected/5bb7097f-d7d3-499e-b412-55f5946b5be4-kube-api-access-cmzgj\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485305 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485312 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485321 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485339 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485351 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb7097f-d7d3-499e-b412-55f5946b5be4-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.485359 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb7097f-d7d3-499e-b412-55f5946b5be4-logs\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.508070 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587460 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-logs\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587641 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-config-data\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587700 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587739 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587792 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdp57\" (UniqueName: \"kubernetes.io/projected/50baf265-a6d8-445d-aed6-853781644d9e-kube-api-access-rdp57\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587843 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587884 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-scripts\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587909 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.587986 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.588020 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-logs\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.588126 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.588290 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.592951 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-config-data\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.595634 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.598048 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.598812 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-scripts\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.604543 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdp57\" (UniqueName: \"kubernetes.io/projected/50baf265-a6d8-445d-aed6-853781644d9e-kube-api-access-rdp57\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.629718 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " pod="openstack/glance-default-external-api-0"
Mar 18 16:02:20 crc kubenswrapper[4939]: I0318 16:02:20.750591 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.263975 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.264103 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5bb7097f-d7d3-499e-b412-55f5946b5be4","Type":"ContainerDied","Data":"8e7914c48b54971396969a48048593f83a308cae048bd61f7abddccdb68cb989"}
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.264343 4939 scope.go:117] "RemoveContainer" containerID="63e00c85d4fcd04f40420b32206738e70a2b26b69d681fd823a84fb57bf3b4bd"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.309716 4939 scope.go:117] "RemoveContainer" containerID="0c7fe9c4155b4831bd03228350c672b9c68571f96276cee2f3bba62cba25d888"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.310622 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.329042 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.354990 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.370627 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.372580 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.374420 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.375713 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.392272 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.408680 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.408741 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.408799 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8df8\" (UniqueName: \"kubernetes.io/projected/cb92b15e-a854-4505-97e2-37e4a7b821b4-kube-api-access-f8df8\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.408825 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.409208 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.409262 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.409303 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.409323 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.510856 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511221 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511282 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8df8\" (UniqueName: \"kubernetes.io/projected/cb92b15e-a854-4505-97e2-37e4a7b821b4-kube-api-access-f8df8\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511308 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511374 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511408 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511441 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511460 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.511456 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.512109 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.512697 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.523051 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.523095 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.531306 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.536067 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.541404 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8df8\" (UniqueName: \"kubernetes.io/projected/cb92b15e-a854-4505-97e2-37e4a7b821b4-kube-api-access-f8df8\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.569829 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.639193 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-98ww2"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.709344 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.717779 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xr89\" (UniqueName: \"kubernetes.io/projected/23191f5f-fe02-4b74-ab9c-95b03d308980-kube-api-access-6xr89\") pod \"23191f5f-fe02-4b74-ab9c-95b03d308980\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") "
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.718370 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-scripts\") pod \"23191f5f-fe02-4b74-ab9c-95b03d308980\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") "
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.718825 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-config-data\") pod \"23191f5f-fe02-4b74-ab9c-95b03d308980\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") "
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.718852 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-combined-ca-bundle\") pod \"23191f5f-fe02-4b74-ab9c-95b03d308980\" (UID: \"23191f5f-fe02-4b74-ab9c-95b03d308980\") "
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.723244 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-scripts" (OuterVolumeSpecName: "scripts") pod "23191f5f-fe02-4b74-ab9c-95b03d308980" (UID: "23191f5f-fe02-4b74-ab9c-95b03d308980"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.723518 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23191f5f-fe02-4b74-ab9c-95b03d308980-kube-api-access-6xr89" (OuterVolumeSpecName: "kube-api-access-6xr89") pod "23191f5f-fe02-4b74-ab9c-95b03d308980" (UID: "23191f5f-fe02-4b74-ab9c-95b03d308980"). InnerVolumeSpecName "kube-api-access-6xr89". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.747689 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23191f5f-fe02-4b74-ab9c-95b03d308980" (UID: "23191f5f-fe02-4b74-ab9c-95b03d308980"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.750839 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-config-data" (OuterVolumeSpecName: "config-data") pod "23191f5f-fe02-4b74-ab9c-95b03d308980" (UID: "23191f5f-fe02-4b74-ab9c-95b03d308980"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.819966 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.820328 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.820342 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xr89\" (UniqueName: \"kubernetes.io/projected/23191f5f-fe02-4b74-ab9c-95b03d308980-kube-api-access-6xr89\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:21 crc kubenswrapper[4939]: I0318 16:02:21.820353 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23191f5f-fe02-4b74-ab9c-95b03d308980-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.153649 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bb7097f-d7d3-499e-b412-55f5946b5be4" path="/var/lib/kubelet/pods/5bb7097f-d7d3-499e-b412-55f5946b5be4/volumes"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.155264 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37a35fa-ce56-4934-88eb-fd24cc5aec4f" path="/var/lib/kubelet/pods/b37a35fa-ce56-4934-88eb-fd24cc5aec4f/volumes"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.241278 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.290736 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb92b15e-a854-4505-97e2-37e4a7b821b4","Type":"ContainerStarted","Data":"db4849851f296a4419721c8e278fdc257d8f44daa292632908cb4dce6239c3c3"}
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.293737 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-98ww2" event={"ID":"23191f5f-fe02-4b74-ab9c-95b03d308980","Type":"ContainerDied","Data":"d1c170413f861b0654cf4b49f0809c974118e1b0dda7282f2e6c6c35961ac542"}
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.293785 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c170413f861b0654cf4b49f0809c974118e1b0dda7282f2e6c6c35961ac542"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.293886 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-98ww2"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.303272 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50baf265-a6d8-445d-aed6-853781644d9e","Type":"ContainerStarted","Data":"d1edf6d434d6afaad7a360d12012a796e03fa6b5275394e5c67c30d4f75cf0c3"}
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.303316 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50baf265-a6d8-445d-aed6-853781644d9e","Type":"ContainerStarted","Data":"8f0d5db536a4d73516c5e6363d7d361d8a0c0340543756582492aeebc2aa0eb4"}
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.363044 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 16:02:22 crc kubenswrapper[4939]: E0318 16:02:22.363488 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23191f5f-fe02-4b74-ab9c-95b03d308980" containerName="nova-cell0-conductor-db-sync"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.363518 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="23191f5f-fe02-4b74-ab9c-95b03d308980" containerName="nova-cell0-conductor-db-sync"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.363736 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="23191f5f-fe02-4b74-ab9c-95b03d308980" containerName="nova-cell0-conductor-db-sync"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.364617 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.367592 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.370894 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f6bpf"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.385583 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.531929 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.531990 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.532179 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqpm\" (UniqueName: \"kubernetes.io/projected/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-kube-api-access-4lqpm\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.633597 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqpm\" (UniqueName: \"kubernetes.io/projected/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-kube-api-access-4lqpm\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.633907 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.633928 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.645408 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.646152 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.661950 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqpm\" (UniqueName: \"kubernetes.io/projected/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-kube-api-access-4lqpm\") pod \"nova-cell0-conductor-0\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:22 crc kubenswrapper[4939]: I0318 16:02:22.692335 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:23 crc kubenswrapper[4939]: I0318 16:02:23.217174 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 16:02:23 crc kubenswrapper[4939]: I0318 16:02:23.314961 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50baf265-a6d8-445d-aed6-853781644d9e","Type":"ContainerStarted","Data":"fe8d4154643530aaa76761b4600cb2e025a90e8cdab5159d3a196d0d9c425f73"}
Mar 18 16:02:23 crc kubenswrapper[4939]: I0318 16:02:23.318737 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09","Type":"ContainerStarted","Data":"2061dbed6e2b7aee40aa92639815309420d823bc42cfca582e1ecebebd707861"}
Mar 18 16:02:23 crc kubenswrapper[4939]: I0318 16:02:23.320069 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb92b15e-a854-4505-97e2-37e4a7b821b4","Type":"ContainerStarted","Data":"6e356ce75b4202de8978925650d9357e6b5b88d4f5afbb33cbd95353ff874608"}
Mar 18 16:02:23 crc kubenswrapper[4939]: I0318 16:02:23.352029 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.352002437 podStartE2EDuration="3.352002437s" podCreationTimestamp="2026-03-18 16:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:23.341122741 +0000 UTC m=+1507.940310372" watchObservedRunningTime="2026-03-18 16:02:23.352002437 +0000 UTC m=+1507.951190068"
Mar 18 16:02:23 crc kubenswrapper[4939]: I0318 16:02:23.613241 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:02:24 crc kubenswrapper[4939]: I0318 16:02:24.331787 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09","Type":"ContainerStarted","Data":"7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8"}
Mar 18 16:02:24 crc kubenswrapper[4939]: I0318 16:02:24.335688 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb92b15e-a854-4505-97e2-37e4a7b821b4","Type":"ContainerStarted","Data":"f026dc909ac22d1539817ec6b0e34f51819c81cb961582e5bb0f18556dfcbf46"}
Mar 18 16:02:24 crc kubenswrapper[4939]: I0318 16:02:24.335854 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:24 crc kubenswrapper[4939]: I0318 16:02:24.361967 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.361941979 podStartE2EDuration="2.361941979s" podCreationTimestamp="2026-03-18 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:24.34896576 +0000 UTC m=+1508.948153381" watchObservedRunningTime="2026-03-18 16:02:24.361941979 +0000 UTC m=+1508.961129610"
Mar 18 16:02:24 crc kubenswrapper[4939]: I0318 16:02:24.377229 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.37721184 podStartE2EDuration="3.37721184s" podCreationTimestamp="2026-03-18 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:24.365448532 +0000 UTC m=+1508.964636173" watchObservedRunningTime="2026-03-18 16:02:24.37721184 +0000 UTC m=+1508.976399461"
Mar 18 16:02:30 crc kubenswrapper[4939]: I0318 16:02:30.751595 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:30 crc kubenswrapper[4939]: I0318 16:02:30.752107 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:30 crc kubenswrapper[4939]: I0318 16:02:30.786052 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:30 crc kubenswrapper[4939]: I0318 16:02:30.805140 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:31 crc kubenswrapper[4939]: I0318 16:02:31.397957 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:31 crc kubenswrapper[4939]: I0318 16:02:31.398189 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 16:02:31 crc kubenswrapper[4939]: I0318 16:02:31.710753 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:31 crc kubenswrapper[4939]: I0318 16:02:31.710823 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:31 crc kubenswrapper[4939]: I0318 16:02:31.756936 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:31 crc kubenswrapper[4939]: I0318 16:02:31.799317 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:32 crc kubenswrapper[4939]: I0318 16:02:32.407249 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:32 crc kubenswrapper[4939]: I0318 16:02:32.407588 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 18 16:02:32 crc kubenswrapper[4939]: I0318 16:02:32.729873 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.371610 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-d6b5l"]
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.373640 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.376385 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.376481 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.382012 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-d6b5l"]
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.417849 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.417873 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.457364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.457947 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-scripts\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.458085 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-config-data\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.458128 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzv2\" (UniqueName: \"kubernetes.io/projected/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-kube-api-access-pzzv2\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.561568 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzzv2\" (UniqueName: \"kubernetes.io/projected/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-kube-api-access-pzzv2\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.561668 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.561840 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\"
(UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-scripts\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.561879 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-config-data\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.569744 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.571385 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.572807 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.580987 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.585516 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-config-data\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.591032 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-scripts\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.603137 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzv2\" (UniqueName: \"kubernetes.io/projected/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-kube-api-access-pzzv2\") pod \"nova-cell0-cell-mapping-d6b5l\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") " pod="openstack/nova-cell0-cell-mapping-d6b5l" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.615797 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.666551 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjz4r\" (UniqueName: \"kubernetes.io/projected/66a0f417-8f6f-4fb2-93a1-1587c57dc814-kube-api-access-xjz4r\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.666613 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-config-data\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 
16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.666671 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.707961 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d6b5l" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.717462 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.718917 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.723954 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.767017 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.767978 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjz4r\" (UniqueName: \"kubernetes.io/projected/66a0f417-8f6f-4fb2-93a1-1587c57dc814-kube-api-access-xjz4r\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.768014 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.768043 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-config-data\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.768077 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pkp\" (UniqueName: \"kubernetes.io/projected/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-kube-api-access-h6pkp\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.768112 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.768131 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-config-data\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.768173 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-logs\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.784172 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-config-data\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.784777 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.788083 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjz4r\" (UniqueName: \"kubernetes.io/projected/66a0f417-8f6f-4fb2-93a1-1587c57dc814-kube-api-access-xjz4r\") pod \"nova-scheduler-0\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.791738 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.793356 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.798066 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.816113 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.855462 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.856846 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.864215 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.867020 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.872926 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.872978 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.873027 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6pkp\" (UniqueName: \"kubernetes.io/projected/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-kube-api-access-h6pkp\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.873090 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-config-data\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.873121 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-logs\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.873145 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-logs\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.873203 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-config-data\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.873229 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5c2n\" (UniqueName: \"kubernetes.io/projected/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-kube-api-access-h5c2n\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.874513 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-logs\") pod 
\"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.877006 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-config-data\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.884068 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-w2vld"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.888899 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.897957 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pkp\" (UniqueName: \"kubernetes.io/projected/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-kube-api-access-h6pkp\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.908301 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") " pod="openstack/nova-metadata-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.916742 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-w2vld"] Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.953788 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.953872 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.979541 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.979589 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.979617 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjwvg\" (UniqueName: \"kubernetes.io/projected/f0645b84-f786-4e86-b405-64bcf3e5479a-kube-api-access-cjwvg\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.979643 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.979851 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-config-data\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.979873 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5c2n\" (UniqueName: \"kubernetes.io/projected/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-kube-api-access-h5c2n\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.981668 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5nm8\" (UniqueName: \"kubernetes.io/projected/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-kube-api-access-j5nm8\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.981692 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.981717 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.981793 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.981821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-config\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.981888 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-logs\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.981909 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-svc\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: 
\"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.986061 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-logs\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.996131 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-config-data\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:33 crc kubenswrapper[4939]: I0318 16:02:33.996212 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.015692 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5c2n\" (UniqueName: \"kubernetes.io/projected/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-kube-api-access-h5c2n\") pod \"nova-api-0\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " pod="openstack/nova-api-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.030179 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085544 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-svc\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085611 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085643 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085672 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjwvg\" (UniqueName: \"kubernetes.io/projected/f0645b84-f786-4e86-b405-64bcf3e5479a-kube-api-access-cjwvg\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085698 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " 
pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085730 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5nm8\" (UniqueName: \"kubernetes.io/projected/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-kube-api-access-j5nm8\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085751 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085802 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.085829 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-config\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.086772 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-config\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.087340 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-svc\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.088234 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.088831 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.089209 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.093147 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.100942 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.106026 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5nm8\" (UniqueName: \"kubernetes.io/projected/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-kube-api-access-j5nm8\") pod \"nova-cell1-novncproxy-0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.128740 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjwvg\" (UniqueName: \"kubernetes.io/projected/f0645b84-f786-4e86-b405-64bcf3e5479a-kube-api-access-cjwvg\") pod \"dnsmasq-dns-757b4f8459-w2vld\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.142344 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.173800 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.238245 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.267480 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.461891 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-d6b5l"] Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.542976 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.559958 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.618023 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l7cqx"] Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.643412 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.647972 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.648714 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.653521 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l7cqx"] Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.706084 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-scripts\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.706191 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpwxz\" (UniqueName: \"kubernetes.io/projected/dd94833e-ea91-4e35-9d05-e0cdf969b281-kube-api-access-cpwxz\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.706213 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.706239 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-config-data\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.807576 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-scripts\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.807694 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpwxz\" (UniqueName: \"kubernetes.io/projected/dd94833e-ea91-4e35-9d05-e0cdf969b281-kube-api-access-cpwxz\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.807715 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.807741 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-config-data\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.815254 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.820333 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-config-data\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.824257 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-scripts\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.831225 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpwxz\" (UniqueName: \"kubernetes.io/projected/dd94833e-ea91-4e35-9d05-e0cdf969b281-kube-api-access-cpwxz\") pod \"nova-cell1-conductor-db-sync-l7cqx\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") " pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.915896 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.940116 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-w2vld"] Mar 18 16:02:34 crc kubenswrapper[4939]: I0318 16:02:34.951609 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.066320 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.066453 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.068968 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.086157 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l7cqx" Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.105160 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.463466 4939 generic.go:334] "Generic (PLEG): container finished" podID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerID="91175d6317c87e70c88d60795a34e70c160236d3230b7d173af7ff154643bd5c" exitCode=0 Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.463546 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" event={"ID":"f0645b84-f786-4e86-b405-64bcf3e5479a","Type":"ContainerDied","Data":"91175d6317c87e70c88d60795a34e70c160236d3230b7d173af7ff154643bd5c"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.463581 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" event={"ID":"f0645b84-f786-4e86-b405-64bcf3e5479a","Type":"ContainerStarted","Data":"9d383ffd4dd412801486852e50c48657c5f1bd4aef259f60b8e45fdc06fae973"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.465551 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"816c45f6-05f7-4cc3-818a-7af3cd52c0f0","Type":"ContainerStarted","Data":"29127ebb4dc9c8ac248c02f8abce1ceffde00b079e38c843e2665325f2ac23a7"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.467763 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87","Type":"ContainerStarted","Data":"9f472f9ca43379717421a698477d03341305c43b8cb74478b1b3568339d8e728"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.469156 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e","Type":"ContainerStarted","Data":"69870fa7646a1d64c227a1d3dae38906f3cfd5567d35c81eb72fa36d117ddcf4"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.472281 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d6b5l" event={"ID":"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b","Type":"ContainerStarted","Data":"3a6be7d31af70d20f3d1ed115c27e44245c15f8fcf4f832e8ac84984eb9faac3"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.472309 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d6b5l" event={"ID":"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b","Type":"ContainerStarted","Data":"513e16be2f4f990f00b21b088a3d2173cd31fedfa71f6f932c9040c0d07ae5b1"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.479499 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a0f417-8f6f-4fb2-93a1-1587c57dc814","Type":"ContainerStarted","Data":"70840d4200ca619b05fba536f3b81d726db1e2dbbd6e2359d0a63257bf89435d"} Mar 18 16:02:35 crc kubenswrapper[4939]: I0318 16:02:35.596877 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-d6b5l" podStartSLOduration=2.5968608570000002 podStartE2EDuration="2.596860857s" podCreationTimestamp="2026-03-18 16:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:35.509735365 +0000 UTC m=+1520.108922986" watchObservedRunningTime="2026-03-18 16:02:35.596860857 +0000 UTC m=+1520.196048478" Mar 18 16:02:35 crc 
kubenswrapper[4939]: I0318 16:02:35.602926 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l7cqx"] Mar 18 16:02:36 crc kubenswrapper[4939]: E0318 16:02:36.473575 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c9933e1_34e7_493d_8e6c_7c2f317b2bea.slice/crio-conmon-be626b66aa177ad50c1500b513cdf6b36327483af4d9965e5ceff333d24a7a17.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c9933e1_34e7_493d_8e6c_7c2f317b2bea.slice/crio-be626b66aa177ad50c1500b513cdf6b36327483af4d9965e5ceff333d24a7a17.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:02:36 crc kubenswrapper[4939]: I0318 16:02:36.514305 4939 generic.go:334] "Generic (PLEG): container finished" podID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerID="be626b66aa177ad50c1500b513cdf6b36327483af4d9965e5ceff333d24a7a17" exitCode=137 Mar 18 16:02:36 crc kubenswrapper[4939]: I0318 16:02:36.514531 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerDied","Data":"be626b66aa177ad50c1500b513cdf6b36327483af4d9965e5ceff333d24a7a17"} Mar 18 16:02:36 crc kubenswrapper[4939]: I0318 16:02:36.521763 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" event={"ID":"f0645b84-f786-4e86-b405-64bcf3e5479a","Type":"ContainerStarted","Data":"1eb5f49cac096d75dec04516429155d1100467413b34819ed66d63ec5fa1847e"} Mar 18 16:02:36 crc kubenswrapper[4939]: I0318 16:02:36.524047 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:02:36 crc kubenswrapper[4939]: I0318 16:02:36.537939 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l7cqx" event={"ID":"dd94833e-ea91-4e35-9d05-e0cdf969b281","Type":"ContainerStarted","Data":"71000f98adb8392199a87db579844540c2369c38e83f05ef7e6212b75f62b671"} Mar 18 16:02:36 crc kubenswrapper[4939]: I0318 16:02:36.537999 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l7cqx" event={"ID":"dd94833e-ea91-4e35-9d05-e0cdf969b281","Type":"ContainerStarted","Data":"15e757c8e75e3bd780b77018b66b35503006dccfc2ad6fa78341ae4928bab834"} Mar 18 16:02:36 crc kubenswrapper[4939]: I0318 16:02:36.543619 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" podStartSLOduration=3.543597256 podStartE2EDuration="3.543597256s" podCreationTimestamp="2026-03-18 16:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:36.540491397 +0000 UTC m=+1521.139679018" watchObservedRunningTime="2026-03-18 16:02:36.543597256 +0000 UTC m=+1521.142784877" Mar 18 16:02:37 crc kubenswrapper[4939]: I0318 16:02:37.747251 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-l7cqx" podStartSLOduration=3.747227073 podStartE2EDuration="3.747227073s" podCreationTimestamp="2026-03-18 16:02:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:36.559994147 +0000 UTC 
m=+1521.159181768" watchObservedRunningTime="2026-03-18 16:02:37.747227073 +0000 UTC m=+1522.346414694"
Mar 18 16:02:37 crc kubenswrapper[4939]: I0318 16:02:37.749076 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 16:02:37 crc kubenswrapper[4939]: I0318 16:02:37.763448 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.282960 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.389771 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-combined-ca-bundle\") pod \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") "
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.390170 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-sg-core-conf-yaml\") pod \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") "
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.390314 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-log-httpd\") pod \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") "
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.390348 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-scripts\") pod \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") "
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.390410 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chnzp\" (UniqueName: \"kubernetes.io/projected/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-kube-api-access-chnzp\") pod \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") "
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.390460 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-run-httpd\") pod \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") "
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.390526 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-config-data\") pod \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\" (UID: \"9c9933e1-34e7-493d-8e6c-7c2f317b2bea\") "
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.391407 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9c9933e1-34e7-493d-8e6c-7c2f317b2bea" (UID: "9c9933e1-34e7-493d-8e6c-7c2f317b2bea"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.391741 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9c9933e1-34e7-493d-8e6c-7c2f317b2bea" (UID: "9c9933e1-34e7-493d-8e6c-7c2f317b2bea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.391987 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.392007 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.400478 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-kube-api-access-chnzp" (OuterVolumeSpecName: "kube-api-access-chnzp") pod "9c9933e1-34e7-493d-8e6c-7c2f317b2bea" (UID: "9c9933e1-34e7-493d-8e6c-7c2f317b2bea"). InnerVolumeSpecName "kube-api-access-chnzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.401151 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-scripts" (OuterVolumeSpecName: "scripts") pod "9c9933e1-34e7-493d-8e6c-7c2f317b2bea" (UID: "9c9933e1-34e7-493d-8e6c-7c2f317b2bea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.475122 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9c9933e1-34e7-493d-8e6c-7c2f317b2bea" (UID: "9c9933e1-34e7-493d-8e6c-7c2f317b2bea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.493444 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.493477 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.493487 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chnzp\" (UniqueName: \"kubernetes.io/projected/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-kube-api-access-chnzp\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.540640 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c9933e1-34e7-493d-8e6c-7c2f317b2bea" (UID: "9c9933e1-34e7-493d-8e6c-7c2f317b2bea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.567556 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a0f417-8f6f-4fb2-93a1-1587c57dc814","Type":"ContainerStarted","Data":"042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a"}
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.588056 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.249494562 podStartE2EDuration="5.58803556s" podCreationTimestamp="2026-03-18 16:02:33 +0000 UTC" firstStartedPulling="2026-03-18 16:02:34.559481266 +0000 UTC m=+1519.158668887" lastFinishedPulling="2026-03-18 16:02:37.898022264 +0000 UTC m=+1522.497209885" observedRunningTime="2026-03-18 16:02:38.586791564 +0000 UTC m=+1523.185979185" watchObservedRunningTime="2026-03-18 16:02:38.58803556 +0000 UTC m=+1523.187223181"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.590795 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-config-data" (OuterVolumeSpecName: "config-data") pod "9c9933e1-34e7-493d-8e6c-7c2f317b2bea" (UID: "9c9933e1-34e7-493d-8e6c-7c2f317b2bea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.599341 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.599368 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9933e1-34e7-493d-8e6c-7c2f317b2bea-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.608078 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c9933e1-34e7-493d-8e6c-7c2f317b2bea","Type":"ContainerDied","Data":"6e563db61c121029ef8f19546e91bfe526af8604349a5faa75cf5713478252d0"}
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.608273 4939 scope.go:117] "RemoveContainer" containerID="be626b66aa177ad50c1500b513cdf6b36327483af4d9965e5ceff333d24a7a17"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.608523 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.626904 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"816c45f6-05f7-4cc3-818a-7af3cd52c0f0","Type":"ContainerStarted","Data":"c8b177f35e38719eb0605090d30ecb234f6c72120376163fb7c98a38321a3d5c"}
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.627113 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="816c45f6-05f7-4cc3-818a-7af3cd52c0f0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c8b177f35e38719eb0605090d30ecb234f6c72120376163fb7c98a38321a3d5c" gracePeriod=30
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.633882 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87","Type":"ContainerStarted","Data":"dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd"}
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.637364 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e","Type":"ContainerStarted","Data":"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"}
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.658379 4939 scope.go:117] "RemoveContainer" containerID="0e2146cc8977d8698c353eb075623269c19f9a2eca99422f79fc7689b45b5d51"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.660247 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.630056892 podStartE2EDuration="5.660231914s" podCreationTimestamp="2026-03-18 16:02:33 +0000 UTC" firstStartedPulling="2026-03-18 16:02:34.945370338 +0000 UTC m=+1519.544557959" lastFinishedPulling="2026-03-18 16:02:37.97554536 +0000 UTC m=+1522.574732981" observedRunningTime="2026-03-18 16:02:38.657586308 +0000 UTC m=+1523.256773949" watchObservedRunningTime="2026-03-18 16:02:38.660231914 +0000 UTC m=+1523.259419535"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.687971 4939 scope.go:117] "RemoveContainer" containerID="f0b8449d93a90b893e475b1ef20eb751d717cae07a9a1c9aaa7076486a4a8094"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.688094 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.706406 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.722943 4939 scope.go:117] "RemoveContainer" containerID="bf6b73a54624d929f8384d8d28586b7e110ae2e005d3b0e33aec226c2e704a1c"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.744452 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:02:38 crc kubenswrapper[4939]: E0318 16:02:38.744856 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-central-agent"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.744874 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-central-agent"
Mar 18 16:02:38 crc kubenswrapper[4939]: E0318 16:02:38.744887 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="sg-core"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.744892 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="sg-core"
Mar 18 16:02:38 crc kubenswrapper[4939]: E0318 16:02:38.744901 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-notification-agent"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.744906 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-notification-agent"
Mar 18 16:02:38 crc kubenswrapper[4939]: E0318 16:02:38.744941 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="proxy-httpd"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.744947 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="proxy-httpd"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.745114 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-notification-agent"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.745129 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="proxy-httpd"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.745149 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="sg-core"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.745156 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" containerName="ceilometer-central-agent"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.748561 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.753010 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.754473 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.756497 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.803406 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-run-httpd\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.803453 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-config-data\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.803477 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn9hm\" (UniqueName: \"kubernetes.io/projected/0a123209-990a-4fcc-a66a-967ab7007653-kube-api-access-bn9hm\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.803493 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.803562 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.803580 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-scripts\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.803605 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-log-httpd\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.905613 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-run-httpd\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.905682 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-config-data\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.905712 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn9hm\" (UniqueName: \"kubernetes.io/projected/0a123209-990a-4fcc-a66a-967ab7007653-kube-api-access-bn9hm\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.905736 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.905831 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.905857 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-scripts\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.905890 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-log-httpd\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.906979 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-log-httpd\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.907052 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-run-httpd\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.912663 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-config-data\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.914208 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-scripts\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.915232 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.916422 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:38 crc kubenswrapper[4939]: I0318 16:02:38.925288 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn9hm\" (UniqueName: \"kubernetes.io/projected/0a123209-990a-4fcc-a66a-967ab7007653-kube-api-access-bn9hm\") pod \"ceilometer-0\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " pod="openstack/ceilometer-0"
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.030860 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.092075 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.238714 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.597934 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:02:39 crc kubenswrapper[4939]: W0318 16:02:39.597987 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a123209_990a_4fcc_a66a_967ab7007653.slice/crio-95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67 WatchSource:0}: Error finding container 95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67: Status 404 returned error can't find the container with id 95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.647528 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerStarted","Data":"95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67"}
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.649324 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87","Type":"ContainerStarted","Data":"0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd"}
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.651106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e","Type":"ContainerStarted","Data":"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"}
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.651309 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-log" containerID="cri-o://59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8" gracePeriod=30
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.651411 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-metadata" containerID="cri-o://782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90" gracePeriod=30
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.674630 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.825902344 podStartE2EDuration="6.674608104s" podCreationTimestamp="2026-03-18 16:02:33 +0000 UTC" firstStartedPulling="2026-03-18 16:02:35.133974945 +0000 UTC m=+1519.733162566" lastFinishedPulling="2026-03-18 16:02:37.982680695 +0000 UTC m=+1522.581868326" observedRunningTime="2026-03-18 16:02:39.667196131 +0000 UTC m=+1524.266383762" watchObservedRunningTime="2026-03-18 16:02:39.674608104 +0000 UTC m=+1524.273795745"
Mar 18 16:02:39 crc kubenswrapper[4939]: I0318 16:02:39.692296 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.608942633 podStartE2EDuration="6.692271651s" podCreationTimestamp="2026-03-18 16:02:33 +0000 UTC" firstStartedPulling="2026-03-18 16:02:34.912442873 +0000 UTC m=+1519.511630494" lastFinishedPulling="2026-03-18 16:02:37.995771891 +0000 UTC m=+1522.594959512" observedRunningTime="2026-03-18 16:02:39.687721051 +0000 UTC m=+1524.286908672" watchObservedRunningTime="2026-03-18 16:02:39.692271651 +0000 UTC m=+1524.291459272"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.146849 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c9933e1-34e7-493d-8e6c-7c2f317b2bea" path="/var/lib/kubelet/pods/9c9933e1-34e7-493d-8e6c-7c2f317b2bea/volumes"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.235270 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.340357 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-config-data\") pod \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") "
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.340550 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-logs\") pod \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") "
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.340611 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-combined-ca-bundle\") pod \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") "
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.340649 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6pkp\" (UniqueName: \"kubernetes.io/projected/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-kube-api-access-h6pkp\") pod \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\" (UID: \"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e\") "
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.340863 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-logs" (OuterVolumeSpecName: "logs") pod "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" (UID: "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.341339 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-logs\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.348718 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-kube-api-access-h6pkp" (OuterVolumeSpecName: "kube-api-access-h6pkp") pod "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" (UID: "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e"). InnerVolumeSpecName "kube-api-access-h6pkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.384995 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-config-data" (OuterVolumeSpecName: "config-data") pod "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" (UID: "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.395985 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" (UID: "744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.443700 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.443945 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.444012 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6pkp\" (UniqueName: \"kubernetes.io/projected/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e-kube-api-access-h6pkp\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.663053 4939 generic.go:334] "Generic (PLEG): container finished" podID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerID="782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90" exitCode=0
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.663383 4939 generic.go:334] "Generic (PLEG): container finished" podID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerID="59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8" exitCode=143
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.663107 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.663141 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e","Type":"ContainerDied","Data":"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"}
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.663487 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e","Type":"ContainerDied","Data":"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"}
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.663560 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e","Type":"ContainerDied","Data":"69870fa7646a1d64c227a1d3dae38906f3cfd5567d35c81eb72fa36d117ddcf4"}
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.663598 4939 scope.go:117] "RemoveContainer" containerID="782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.666336 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerStarted","Data":"285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940"}
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.685062 4939 scope.go:117] "RemoveContainer" containerID="59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.705190 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.711739 4939 scope.go:117] "RemoveContainer" containerID="782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"
Mar 18 16:02:40 crc kubenswrapper[4939]: E0318 16:02:40.712165 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90\": container with ID starting with 782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90 not found: ID does not exist" containerID="782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.712196 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"} err="failed to get container status \"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90\": rpc error: code = NotFound desc = could not find container \"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90\": container with ID starting with 782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90 not found: ID does not exist"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.712220 4939 scope.go:117] "RemoveContainer" containerID="59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"
Mar 18 16:02:40 crc kubenswrapper[4939]: E0318 16:02:40.715102 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8\": container with ID starting with 59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8 not found: ID does not exist" containerID="59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.715141 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"} err="failed to get container status \"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8\": rpc error: code = NotFound desc = could not find container \"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8\": container with ID starting with 59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8 not found: ID does not exist"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.715175 4939 scope.go:117] "RemoveContainer" containerID="782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.715470 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90"} err="failed to get container status \"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90\": rpc error: code = NotFound desc = could not find container \"782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90\": container with ID starting with 782ae8a87b720683df37d88fe8690a6d913d721bcaca8f9ca97264af83753f90 not found: ID does not exist"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.715492 4939 scope.go:117] "RemoveContainer" containerID="59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.717776 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8"} err="failed to get container status \"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8\": rpc error: code = NotFound desc = could not find container \"59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8\": container with ID starting with 59f7b8b1983a78ceed0442c9ad28694ad89ca0707d275ddb51102e5f1ca8f5f8 not found: ID does not exist"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.723203 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.734045 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:02:40 crc kubenswrapper[4939]: E0318 16:02:40.734544 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-metadata"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.734565 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-metadata"
Mar 18 16:02:40 crc kubenswrapper[4939]: E0318 16:02:40.734586 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-log"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.734593 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-log"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.734771 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-metadata"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.734798 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" containerName="nova-metadata-log"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.735988 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.737978 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.742389 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.744235 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.854212 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-config-data\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.854294 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.854327 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.854751 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef2eabfa-9280-4760-9f8d-4137db4aa6de-logs\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.854821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vj2t\" (UniqueName: \"kubernetes.io/projected/ef2eabfa-9280-4760-9f8d-4137db4aa6de-kube-api-access-6vj2t\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.956613 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-config-data\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.956658 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.956677 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.956720 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef2eabfa-9280-4760-9f8d-4137db4aa6de-logs\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.956734 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vj2t\" (UniqueName: \"kubernetes.io/projected/ef2eabfa-9280-4760-9f8d-4137db4aa6de-kube-api-access-6vj2t\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.959797 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef2eabfa-9280-4760-9f8d-4137db4aa6de-logs\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.960038 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.960668 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-config-data\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.970225 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:40 crc kubenswrapper[4939]: I0318 16:02:40.977642 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vj2t\" (UniqueName: \"kubernetes.io/projected/ef2eabfa-9280-4760-9f8d-4137db4aa6de-kube-api-access-6vj2t\") pod \"nova-metadata-0\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " pod="openstack/nova-metadata-0"
Mar 18 16:02:41 crc kubenswrapper[4939]: I0318 16:02:41.062130 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 16:02:41 crc kubenswrapper[4939]: I0318 16:02:41.600268 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:02:41 crc kubenswrapper[4939]: W0318 16:02:41.608243 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef2eabfa_9280_4760_9f8d_4137db4aa6de.slice/crio-0790c1fd5ce4d636000375da65aed66dd8364ae624da9f45e122e89e3552f6ae WatchSource:0}: Error finding container 0790c1fd5ce4d636000375da65aed66dd8364ae624da9f45e122e89e3552f6ae: Status 404 returned error can't find the container with id 0790c1fd5ce4d636000375da65aed66dd8364ae624da9f45e122e89e3552f6ae
Mar 18 16:02:41 crc kubenswrapper[4939]: I0318 16:02:41.676680 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef2eabfa-9280-4760-9f8d-4137db4aa6de","Type":"ContainerStarted","Data":"0790c1fd5ce4d636000375da65aed66dd8364ae624da9f45e122e89e3552f6ae"}
Mar 18 16:02:41 crc kubenswrapper[4939]: I0318 16:02:41.678802 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerStarted","Data":"9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca"}
Mar 18 16:02:42 crc kubenswrapper[4939]: I0318 16:02:42.153384 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e" path="/var/lib/kubelet/pods/744895f6-fa8e-40b8-a6a1-9dd7ec9cf74e/volumes"
Mar 18 16:02:42 crc kubenswrapper[4939]: I0318 16:02:42.690948 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef2eabfa-9280-4760-9f8d-4137db4aa6de","Type":"ContainerStarted","Data":"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1"}
Mar 18 16:02:42 crc kubenswrapper[4939]: I0318 16:02:42.691001 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef2eabfa-9280-4760-9f8d-4137db4aa6de","Type":"ContainerStarted","Data":"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07"}
Mar 18 16:02:42 crc kubenswrapper[4939]: I0318 16:02:42.709278 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerStarted","Data":"e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1"}
Mar 18 16:02:42 crc kubenswrapper[4939]: I0318 16:02:42.728536 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.728498167 podStartE2EDuration="2.728498167s" podCreationTimestamp="2026-03-18 16:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:42.726562781 +0000 UTC m=+1527.325750452" watchObservedRunningTime="2026-03-18 16:02:42.728498167 +0000 UTC m=+1527.327685788"
Mar 18 16:02:43 crc kubenswrapper[4939]: I0318 16:02:43.721037 4939 generic.go:334] "Generic (PLEG): container finished" podID="dd94833e-ea91-4e35-9d05-e0cdf969b281" containerID="71000f98adb8392199a87db579844540c2369c38e83f05ef7e6212b75f62b671" exitCode=0
Mar 18 16:02:43 crc kubenswrapper[4939]: I0318 16:02:43.721208 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l7cqx" event={"ID":"dd94833e-ea91-4e35-9d05-e0cdf969b281","Type":"ContainerDied","Data":"71000f98adb8392199a87db579844540c2369c38e83f05ef7e6212b75f62b671"}
Mar 18 16:02:43 crc kubenswrapper[4939]: I0318 16:02:43.723112 4939 generic.go:334] "Generic (PLEG): container finished" podID="5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" containerID="3a6be7d31af70d20f3d1ed115c27e44245c15f8fcf4f832e8ac84984eb9faac3" exitCode=0
Mar 18 16:02:43 crc kubenswrapper[4939]: I0318 16:02:43.723552 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d6b5l" event={"ID":"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b","Type":"ContainerDied","Data":"3a6be7d31af70d20f3d1ed115c27e44245c15f8fcf4f832e8ac84984eb9faac3"}
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.031464 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.058022 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.174757 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.175098 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.269752 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-w2vld"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.341539 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lcsjd"]
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.341829 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" podUID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerName="dnsmasq-dns" containerID="cri-o://3766a4508c68813eb4e6a699ecc370667f5af46a13bf4ef93364692bcdd83176" gracePeriod=10
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.756998 4939 generic.go:334] "Generic (PLEG): container finished" podID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerID="3766a4508c68813eb4e6a699ecc370667f5af46a13bf4ef93364692bcdd83176" exitCode=0
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.757269 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" event={"ID":"d7a2fd0c-9437-43fb-af24-c971bb657d12","Type":"ContainerDied","Data":"3766a4508c68813eb4e6a699ecc370667f5af46a13bf4ef93364692bcdd83176"}
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.761807 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerStarted","Data":"41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b"}
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.762272 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.824784 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.313222645 podStartE2EDuration="6.824757369s" podCreationTimestamp="2026-03-18 16:02:38 +0000 UTC" firstStartedPulling="2026-03-18 16:02:39.600323991 +0000 UTC m=+1524.199511612" lastFinishedPulling="2026-03-18 16:02:44.111858715 +0000 UTC m=+1528.711046336" observedRunningTime="2026-03-18 16:02:44.778948104 +0000 UTC m=+1529.378135725" watchObservedRunningTime="2026-03-18 16:02:44.824757369 +0000 UTC m=+1529.423944990"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.853442 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 18 16:02:44 crc kubenswrapper[4939]: I0318 16:02:44.941456 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.060423 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-config\") pod \"d7a2fd0c-9437-43fb-af24-c971bb657d12\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.060762 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-swift-storage-0\") pod \"d7a2fd0c-9437-43fb-af24-c971bb657d12\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.061712 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-svc\") pod \"d7a2fd0c-9437-43fb-af24-c971bb657d12\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.061845 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw5w5\" (UniqueName: \"kubernetes.io/projected/d7a2fd0c-9437-43fb-af24-c971bb657d12-kube-api-access-jw5w5\") pod \"d7a2fd0c-9437-43fb-af24-c971bb657d12\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.061920 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-sb\") pod \"d7a2fd0c-9437-43fb-af24-c971bb657d12\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.062080 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-nb\") pod \"d7a2fd0c-9437-43fb-af24-c971bb657d12\" (UID: \"d7a2fd0c-9437-43fb-af24-c971bb657d12\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.075058 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a2fd0c-9437-43fb-af24-c971bb657d12-kube-api-access-jw5w5" (OuterVolumeSpecName: "kube-api-access-jw5w5") pod "d7a2fd0c-9437-43fb-af24-c971bb657d12" (UID: "d7a2fd0c-9437-43fb-af24-c971bb657d12"). InnerVolumeSpecName "kube-api-access-jw5w5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.115216 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7a2fd0c-9437-43fb-af24-c971bb657d12" (UID: "d7a2fd0c-9437-43fb-af24-c971bb657d12"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.148006 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7a2fd0c-9437-43fb-af24-c971bb657d12" (UID: "d7a2fd0c-9437-43fb-af24-c971bb657d12"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.163209 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7a2fd0c-9437-43fb-af24-c971bb657d12" (UID: "d7a2fd0c-9437-43fb-af24-c971bb657d12"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.166808 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.166839 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw5w5\" (UniqueName: \"kubernetes.io/projected/d7a2fd0c-9437-43fb-af24-c971bb657d12-kube-api-access-jw5w5\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.166852 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.166864 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.184663 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7a2fd0c-9437-43fb-af24-c971bb657d12" (UID: "d7a2fd0c-9437-43fb-af24-c971bb657d12"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.188421 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-config" (OuterVolumeSpecName: "config") pod "d7a2fd0c-9437-43fb-af24-c971bb657d12" (UID: "d7a2fd0c-9437-43fb-af24-c971bb657d12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.258667 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.258712 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.268604 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.268867 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a2fd0c-9437-43fb-af24-c971bb657d12-config\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.339138 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.346320 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l7cqx"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473053 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-scripts\") pod \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473117 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-config-data\") pod \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473139 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-scripts\") pod \"dd94833e-ea91-4e35-9d05-e0cdf969b281\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473214 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzzv2\" (UniqueName: \"kubernetes.io/projected/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-kube-api-access-pzzv2\") pod \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473237 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-combined-ca-bundle\") pod \"dd94833e-ea91-4e35-9d05-e0cdf969b281\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473285 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-config-data\") pod \"dd94833e-ea91-4e35-9d05-e0cdf969b281\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473362 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-combined-ca-bundle\") pod \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\" (UID: \"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.473411 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpwxz\" (UniqueName: \"kubernetes.io/projected/dd94833e-ea91-4e35-9d05-e0cdf969b281-kube-api-access-cpwxz\") pod \"dd94833e-ea91-4e35-9d05-e0cdf969b281\" (UID: \"dd94833e-ea91-4e35-9d05-e0cdf969b281\") "
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.477662 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-scripts" (OuterVolumeSpecName: "scripts") pod "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" (UID: "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.481934 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-kube-api-access-pzzv2" (OuterVolumeSpecName: "kube-api-access-pzzv2") pod "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" (UID: "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b"). InnerVolumeSpecName "kube-api-access-pzzv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.482749 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd94833e-ea91-4e35-9d05-e0cdf969b281-kube-api-access-cpwxz" (OuterVolumeSpecName: "kube-api-access-cpwxz") pod "dd94833e-ea91-4e35-9d05-e0cdf969b281" (UID: "dd94833e-ea91-4e35-9d05-e0cdf969b281"). InnerVolumeSpecName "kube-api-access-cpwxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.493464 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-scripts" (OuterVolumeSpecName: "scripts") pod "dd94833e-ea91-4e35-9d05-e0cdf969b281" (UID: "dd94833e-ea91-4e35-9d05-e0cdf969b281"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.517679 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" (UID: "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.518803 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd94833e-ea91-4e35-9d05-e0cdf969b281" (UID: "dd94833e-ea91-4e35-9d05-e0cdf969b281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.525298 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-config-data" (OuterVolumeSpecName: "config-data") pod "dd94833e-ea91-4e35-9d05-e0cdf969b281" (UID: "dd94833e-ea91-4e35-9d05-e0cdf969b281"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.547900 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-config-data" (OuterVolumeSpecName: "config-data") pod "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" (UID: "5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576207 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576239 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpwxz\" (UniqueName: \"kubernetes.io/projected/dd94833e-ea91-4e35-9d05-e0cdf969b281-kube-api-access-cpwxz\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576250 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576259 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576267 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576274 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzzv2\" (UniqueName: \"kubernetes.io/projected/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b-kube-api-access-pzzv2\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576283 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.576292 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd94833e-ea91-4e35-9d05-e0cdf969b281-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.775900 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d6b5l" event={"ID":"5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b","Type":"ContainerDied","Data":"513e16be2f4f990f00b21b088a3d2173cd31fedfa71f6f932c9040c0d07ae5b1"}
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.775957 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="513e16be2f4f990f00b21b088a3d2173cd31fedfa71f6f932c9040c0d07ae5b1"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.775911 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d6b5l"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.778410 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l7cqx" event={"ID":"dd94833e-ea91-4e35-9d05-e0cdf969b281","Type":"ContainerDied","Data":"15e757c8e75e3bd780b77018b66b35503006dccfc2ad6fa78341ae4928bab834"}
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.778443 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e757c8e75e3bd780b77018b66b35503006dccfc2ad6fa78341ae4928bab834"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.778410 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l7cqx"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.782599 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.782622 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lcsjd" event={"ID":"d7a2fd0c-9437-43fb-af24-c971bb657d12","Type":"ContainerDied","Data":"0797f479740a3dd2104d7e04289d3a3dcbc3b21d25648a91602438446f25aa18"}
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.782694 4939 scope.go:117] "RemoveContainer" containerID="3766a4508c68813eb4e6a699ecc370667f5af46a13bf4ef93364692bcdd83176"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.825707 4939 scope.go:117] "RemoveContainer" containerID="6951caea9ceff1764b6a6267761009eeb4fe1ee82aff191f3f729bab18efb226"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.852839 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lcsjd"]
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.873599 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lcsjd"]
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.886574 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 16:02:45 crc kubenswrapper[4939]: E0318 16:02:45.886953 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerName="dnsmasq-dns"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.886970 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerName="dnsmasq-dns"
Mar 18 16:02:45 crc kubenswrapper[4939]: E0318 16:02:45.886984 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd94833e-ea91-4e35-9d05-e0cdf969b281" containerName="nova-cell1-conductor-db-sync"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.886990 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd94833e-ea91-4e35-9d05-e0cdf969b281" containerName="nova-cell1-conductor-db-sync"
Mar 18 16:02:45 crc kubenswrapper[4939]: E0318 16:02:45.887003 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerName="init"
Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.887010 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerName="init"
Mar 18 16:02:45 crc
kubenswrapper[4939]: E0318 16:02:45.887043 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" containerName="nova-manage" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.887049 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" containerName="nova-manage" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.887203 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" containerName="nova-manage" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.887220 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a2fd0c-9437-43fb-af24-c971bb657d12" containerName="dnsmasq-dns" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.887231 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd94833e-ea91-4e35-9d05-e0cdf969b281" containerName="nova-cell1-conductor-db-sync" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.887863 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.891979 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.901070 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.952691 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.953257 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-log" containerID="cri-o://dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd" gracePeriod=30 Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.953603 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-api" containerID="cri-o://0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd" gracePeriod=30 Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.984922 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.992003 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.992175 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:45 crc kubenswrapper[4939]: I0318 16:02:45.992286 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjb69\" (UniqueName: 
\"kubernetes.io/projected/1c2e6985-9642-41e2-8b6f-174c96e86281-kube-api-access-bjb69\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.015369 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.015800 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-log" containerID="cri-o://7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07" gracePeriod=30 Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.015877 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-metadata" containerID="cri-o://1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1" gracePeriod=30 Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.093865 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.093929 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjb69\" (UniqueName: \"kubernetes.io/projected/1c2e6985-9642-41e2-8b6f-174c96e86281-kube-api-access-bjb69\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.094052 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.099964 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.100137 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.110717 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjb69\" (UniqueName: \"kubernetes.io/projected/1c2e6985-9642-41e2-8b6f-174c96e86281-kube-api-access-bjb69\") pod \"nova-cell1-conductor-0\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.144354 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a2fd0c-9437-43fb-af24-c971bb657d12" 
path="/var/lib/kubelet/pods/d7a2fd0c-9437-43fb-af24-c971bb657d12/volumes" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.216638 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.679478 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.794994 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerID="dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd" exitCode=143 Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.795064 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87","Type":"ContainerDied","Data":"dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd"} Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.798224 4939 generic.go:334] "Generic (PLEG): container finished" podID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerID="1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1" exitCode=0 Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.798256 4939 generic.go:334] "Generic (PLEG): container finished" podID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerID="7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07" exitCode=143 Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.798301 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef2eabfa-9280-4760-9f8d-4137db4aa6de","Type":"ContainerDied","Data":"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1"} Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.798332 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef2eabfa-9280-4760-9f8d-4137db4aa6de","Type":"ContainerDied","Data":"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07"} Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.798345 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef2eabfa-9280-4760-9f8d-4137db4aa6de","Type":"ContainerDied","Data":"0790c1fd5ce4d636000375da65aed66dd8364ae624da9f45e122e89e3552f6ae"} Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.798363 4939 scope.go:117] "RemoveContainer" containerID="1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.798532 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.800785 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="66a0f417-8f6f-4fb2-93a1-1587c57dc814" containerName="nova-scheduler-scheduler" containerID="cri-o://042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" gracePeriod=30 Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.820277 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-combined-ca-bundle\") pod \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.820358 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-nova-metadata-tls-certs\") pod \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.820403 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vj2t\" (UniqueName: \"kubernetes.io/projected/ef2eabfa-9280-4760-9f8d-4137db4aa6de-kube-api-access-6vj2t\") pod \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.820434 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef2eabfa-9280-4760-9f8d-4137db4aa6de-logs\") pod \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.820551 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-config-data\") pod \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\" (UID: \"ef2eabfa-9280-4760-9f8d-4137db4aa6de\") " Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.821188 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef2eabfa-9280-4760-9f8d-4137db4aa6de-logs" (OuterVolumeSpecName: "logs") pod "ef2eabfa-9280-4760-9f8d-4137db4aa6de" (UID: "ef2eabfa-9280-4760-9f8d-4137db4aa6de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.828090 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2eabfa-9280-4760-9f8d-4137db4aa6de-kube-api-access-6vj2t" (OuterVolumeSpecName: "kube-api-access-6vj2t") pod "ef2eabfa-9280-4760-9f8d-4137db4aa6de" (UID: "ef2eabfa-9280-4760-9f8d-4137db4aa6de"). InnerVolumeSpecName "kube-api-access-6vj2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.828947 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.840564 4939 scope.go:117] "RemoveContainer" containerID="7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.856410 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef2eabfa-9280-4760-9f8d-4137db4aa6de" (UID: "ef2eabfa-9280-4760-9f8d-4137db4aa6de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.883445 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-config-data" (OuterVolumeSpecName: "config-data") pod "ef2eabfa-9280-4760-9f8d-4137db4aa6de" (UID: "ef2eabfa-9280-4760-9f8d-4137db4aa6de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.884123 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ef2eabfa-9280-4760-9f8d-4137db4aa6de" (UID: "ef2eabfa-9280-4760-9f8d-4137db4aa6de"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.904403 4939 scope.go:117] "RemoveContainer" containerID="1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1" Mar 18 16:02:46 crc kubenswrapper[4939]: E0318 16:02:46.904914 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1\": container with ID starting with 1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1 not found: ID does not exist" containerID="1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.904954 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1"} err="failed to get container status \"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1\": rpc error: code = NotFound desc = could not find container \"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1\": container with ID starting with 1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1 not found: ID does not exist" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.904979 4939 scope.go:117] "RemoveContainer" containerID="7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07" Mar 18 16:02:46 crc kubenswrapper[4939]: E0318 16:02:46.905378 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07\": container with ID starting with 7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07 not found: ID does not exist" 
containerID="7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.905409 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07"} err="failed to get container status \"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07\": rpc error: code = NotFound desc = could not find container \"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07\": container with ID starting with 7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07 not found: ID does not exist" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.905428 4939 scope.go:117] "RemoveContainer" containerID="1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.905662 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1"} err="failed to get container status \"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1\": rpc error: code = NotFound desc = could not find container \"1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1\": container with ID starting with 1f32c058f6ac9adefbbdaad5ee1804ddccbb6e0ea33ee34b874d74444ecaaed1 not found: ID does not exist" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.905690 4939 scope.go:117] "RemoveContainer" containerID="7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.905943 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07"} err="failed to get container status \"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07\": rpc error: code = NotFound desc = could not find container \"7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07\": container with ID starting with 7d865c87dd2279f6e574ab17ffacd3915eeee5794ffca98c1d8e96b7f199bc07 not found: ID does not exist" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.922583 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.922626 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.922640 4939 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2eabfa-9280-4760-9f8d-4137db4aa6de-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.922651 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vj2t\" (UniqueName: \"kubernetes.io/projected/ef2eabfa-9280-4760-9f8d-4137db4aa6de-kube-api-access-6vj2t\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:46 crc kubenswrapper[4939]: I0318 16:02:46.922661 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef2eabfa-9280-4760-9f8d-4137db4aa6de-logs\") on node \"crc\" DevicePath 
\"\"" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.135570 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.159755 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.204315 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:47 crc kubenswrapper[4939]: E0318 16:02:47.204891 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-metadata" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.204919 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-metadata" Mar 18 16:02:47 crc kubenswrapper[4939]: E0318 16:02:47.204969 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-log" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.204980 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-log" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.205204 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-log" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.205228 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" containerName="nova-metadata-metadata" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.206470 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.210811 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.211102 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.234446 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.331784 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2621205b-00d3-4e6b-a4fb-f9c87f14b438-logs\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.332732 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.332873 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.333198 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-config-data\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.333296 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlttg\" (UniqueName: \"kubernetes.io/projected/2621205b-00d3-4e6b-a4fb-f9c87f14b438-kube-api-access-jlttg\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.436994 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2621205b-00d3-4e6b-a4fb-f9c87f14b438-logs\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.437459 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2621205b-00d3-4e6b-a4fb-f9c87f14b438-logs\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.437727 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc 
kubenswrapper[4939]: I0318 16:02:47.437867 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.438062 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-config-data\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.438228 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlttg\" (UniqueName: \"kubernetes.io/projected/2621205b-00d3-4e6b-a4fb-f9c87f14b438-kube-api-access-jlttg\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.441948 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.442473 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-config-data\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.451158 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.458054 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlttg\" (UniqueName: \"kubernetes.io/projected/2621205b-00d3-4e6b-a4fb-f9c87f14b438-kube-api-access-jlttg\") pod \"nova-metadata-0\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " pod="openstack/nova-metadata-0" Mar 18 16:02:47 crc kubenswrapper[4939]: I0318 16:02:47.545139 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:02:48 crc kubenswrapper[4939]: I0318 16:02:47.811611 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c2e6985-9642-41e2-8b6f-174c96e86281","Type":"ContainerStarted","Data":"1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a"} Mar 18 16:02:48 crc kubenswrapper[4939]: I0318 16:02:47.812124 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:48 crc kubenswrapper[4939]: I0318 16:02:47.812160 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c2e6985-9642-41e2-8b6f-174c96e86281","Type":"ContainerStarted","Data":"16459dd59964fcc95cb07e9c30bfc0b5edaaa956338ec60ee1fd29f25e3033b3"} Mar 18 16:02:48 crc kubenswrapper[4939]: I0318 16:02:47.839007 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.838981323 podStartE2EDuration="2.838981323s" podCreationTimestamp="2026-03-18 16:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:47.826391921 +0000 UTC m=+1532.425579542" watchObservedRunningTime="2026-03-18 16:02:47.838981323 +0000 UTC m=+1532.438168944" Mar 18 16:02:48 crc kubenswrapper[4939]: I0318 16:02:48.149617 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef2eabfa-9280-4760-9f8d-4137db4aa6de" path="/var/lib/kubelet/pods/ef2eabfa-9280-4760-9f8d-4137db4aa6de/volumes" Mar 18 16:02:48 crc kubenswrapper[4939]: I0318 16:02:48.908299 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:02:49 crc kubenswrapper[4939]: E0318 16:02:49.033130 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:02:49 crc kubenswrapper[4939]: E0318 16:02:49.034713 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:02:49 crc kubenswrapper[4939]: E0318 16:02:49.036156 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:02:49 crc kubenswrapper[4939]: E0318 16:02:49.036236 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="66a0f417-8f6f-4fb2-93a1-1587c57dc814" containerName="nova-scheduler-scheduler" Mar 18 16:02:49 crc kubenswrapper[4939]: I0318 16:02:49.835938 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"2621205b-00d3-4e6b-a4fb-f9c87f14b438","Type":"ContainerStarted","Data":"71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3"} Mar 18 16:02:49 crc kubenswrapper[4939]: I0318 16:02:49.836319 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2621205b-00d3-4e6b-a4fb-f9c87f14b438","Type":"ContainerStarted","Data":"d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746"} Mar 18 16:02:49 crc kubenswrapper[4939]: I0318 16:02:49.836343 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2621205b-00d3-4e6b-a4fb-f9c87f14b438","Type":"ContainerStarted","Data":"0134adece0a074802cb77e324312d3e76e25d3f6c26aafac79b0adad9dd2bde7"} Mar 18 16:02:49 crc kubenswrapper[4939]: I0318 16:02:49.870755 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.870736851 podStartE2EDuration="2.870736851s" podCreationTimestamp="2026-03-18 16:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:49.858872511 +0000 UTC m=+1534.458060132" watchObservedRunningTime="2026-03-18 16:02:49.870736851 +0000 UTC m=+1534.469924492" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.376152 4939 scope.go:117] "RemoveContainer" containerID="6c0358b9cc3e41dcf12f335fc25d33f18016832b8c3a3c4c40ffe7889b013fed" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.446681 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.594365 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-combined-ca-bundle\") pod \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.594411 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-config-data\") pod \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.594464 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjz4r\" (UniqueName: \"kubernetes.io/projected/66a0f417-8f6f-4fb2-93a1-1587c57dc814-kube-api-access-xjz4r\") pod \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\" (UID: \"66a0f417-8f6f-4fb2-93a1-1587c57dc814\") " Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.600241 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a0f417-8f6f-4fb2-93a1-1587c57dc814-kube-api-access-xjz4r" (OuterVolumeSpecName: "kube-api-access-xjz4r") pod "66a0f417-8f6f-4fb2-93a1-1587c57dc814" (UID: "66a0f417-8f6f-4fb2-93a1-1587c57dc814"). InnerVolumeSpecName "kube-api-access-xjz4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.626852 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-config-data" (OuterVolumeSpecName: "config-data") pod "66a0f417-8f6f-4fb2-93a1-1587c57dc814" (UID: "66a0f417-8f6f-4fb2-93a1-1587c57dc814"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.628904 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66a0f417-8f6f-4fb2-93a1-1587c57dc814" (UID: "66a0f417-8f6f-4fb2-93a1-1587c57dc814"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.697143 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.697185 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a0f417-8f6f-4fb2-93a1-1587c57dc814-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.697198 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjz4r\" (UniqueName: \"kubernetes.io/projected/66a0f417-8f6f-4fb2-93a1-1587c57dc814-kube-api-access-xjz4r\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.845953 4939 generic.go:334] "Generic (PLEG): container finished" podID="66a0f417-8f6f-4fb2-93a1-1587c57dc814" containerID="042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" exitCode=0 Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.846005 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.846070 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a0f417-8f6f-4fb2-93a1-1587c57dc814","Type":"ContainerDied","Data":"042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a"} Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.846144 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a0f417-8f6f-4fb2-93a1-1587c57dc814","Type":"ContainerDied","Data":"70840d4200ca619b05fba536f3b81d726db1e2dbbd6e2359d0a63257bf89435d"} Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.846183 4939 scope.go:117] "RemoveContainer" containerID="042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.872200 4939 scope.go:117] "RemoveContainer" containerID="042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" Mar 18 16:02:50 crc kubenswrapper[4939]: E0318 16:02:50.872731 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a\": container with ID starting with 042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a not found: ID does not exist" containerID="042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.872839 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a"} err="failed to get container status \"042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a\": rpc error: 
code = NotFound desc = could not find container \"042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a\": container with ID starting with 042a9bcf5a785280463419b5dcce46ad5771660795087977e7d206f3a6fdfc9a not found: ID does not exist" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.879153 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.892660 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.900164 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:50 crc kubenswrapper[4939]: E0318 16:02:50.900540 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a0f417-8f6f-4fb2-93a1-1587c57dc814" containerName="nova-scheduler-scheduler" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.900554 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a0f417-8f6f-4fb2-93a1-1587c57dc814" containerName="nova-scheduler-scheduler" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.900724 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a0f417-8f6f-4fb2-93a1-1587c57dc814" containerName="nova-scheduler-scheduler" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.901291 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.903226 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 16:02:50 crc kubenswrapper[4939]: I0318 16:02:50.911653 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.003189 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.003230 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-config-data\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.003335 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jdc\" (UniqueName: \"kubernetes.io/projected/1f552e3e-ab72-491d-a1e3-e44e86a523be-kube-api-access-w2jdc\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.104555 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.104676 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-config-data\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.104866 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jdc\" (UniqueName: \"kubernetes.io/projected/1f552e3e-ab72-491d-a1e3-e44e86a523be-kube-api-access-w2jdc\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.109974 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-config-data\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.110314 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.127831 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jdc\" (UniqueName: \"kubernetes.io/projected/1f552e3e-ab72-491d-a1e3-e44e86a523be-kube-api-access-w2jdc\") pod \"nova-scheduler-0\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.225772 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.723190 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:02:51 crc kubenswrapper[4939]: W0318 16:02:51.737770 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f552e3e_ab72_491d_a1e3_e44e86a523be.slice/crio-30d7e758bc8d1892b57f27014f7c02764b92e5e6f90afe5ff29eed8586800f8a WatchSource:0}: Error finding container 30d7e758bc8d1892b57f27014f7c02764b92e5e6f90afe5ff29eed8586800f8a: Status 404 returned error can't find the container with id 30d7e758bc8d1892b57f27014f7c02764b92e5e6f90afe5ff29eed8586800f8a Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.759826 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.817083 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5c2n\" (UniqueName: \"kubernetes.io/projected/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-kube-api-access-h5c2n\") pod \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.817346 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-logs\") pod \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.817373 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-combined-ca-bundle\") pod \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.817397 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-config-data\") pod \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\" (UID: \"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87\") " Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.818117 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-logs" (OuterVolumeSpecName: "logs") pod "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" (UID: "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.822641 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-kube-api-access-h5c2n" (OuterVolumeSpecName: "kube-api-access-h5c2n") pod "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" (UID: "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87"). InnerVolumeSpecName "kube-api-access-h5c2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.844154 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" (UID: "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.858672 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerID="0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd" exitCode=0 Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.858734 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87","Type":"ContainerDied","Data":"0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd"} Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.858760 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca9843a3-eed7-4c1f-b2fd-aae3e214fe87","Type":"ContainerDied","Data":"9f472f9ca43379717421a698477d03341305c43b8cb74478b1b3568339d8e728"} Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.858774 4939 scope.go:117] "RemoveContainer" containerID="0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.858869 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-config-data" (OuterVolumeSpecName: "config-data") pod "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" (UID: "ca9843a3-eed7-4c1f-b2fd-aae3e214fe87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.858893 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.868225 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1f552e3e-ab72-491d-a1e3-e44e86a523be","Type":"ContainerStarted","Data":"30d7e758bc8d1892b57f27014f7c02764b92e5e6f90afe5ff29eed8586800f8a"} Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.905343 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.909495 4939 scope.go:117] "RemoveContainer" containerID="dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.919235 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.919261 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.919272 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.919282 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5c2n\" (UniqueName: \"kubernetes.io/projected/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87-kube-api-access-h5c2n\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.934555 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:51 crc kubenswrapper[4939]: 
I0318 16:02:51.940701 4939 scope.go:117] "RemoveContainer" containerID="0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd" Mar 18 16:02:51 crc kubenswrapper[4939]: E0318 16:02:51.941329 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd\": container with ID starting with 0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd not found: ID does not exist" containerID="0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.941360 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd"} err="failed to get container status \"0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd\": rpc error: code = NotFound desc = could not find container \"0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd\": container with ID starting with 0ee7a1d664bd2fdce5c52edc98a0f3b4aa563445e4e5ebf37d9fee6b09d1cddd not found: ID does not exist" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.941382 4939 scope.go:117] "RemoveContainer" containerID="dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd" Mar 18 16:02:51 crc kubenswrapper[4939]: E0318 16:02:51.941558 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd\": container with ID starting with dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd not found: ID does not exist" containerID="dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.941574 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd"} err="failed to get container status \"dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd\": rpc error: code = NotFound desc = could not find container \"dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd\": container with ID starting with dba2b2246c718a24b048d61d81e893f474bfa38906243eb88862b162553b6ecd not found: ID does not exist" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.958239 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:51 crc kubenswrapper[4939]: E0318 16:02:51.958811 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-api" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.958826 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-api" Mar 18 16:02:51 crc kubenswrapper[4939]: E0318 16:02:51.958841 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-log" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.958846 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-log" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.959100 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" 
containerName="nova-api-api" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.959119 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" containerName="nova-api-log" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.960105 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.969835 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:02:51 crc kubenswrapper[4939]: I0318 16:02:51.978038 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.020034 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-config-data\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.020132 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b8c6876-ac7a-4462-be8c-05fc63eb086e-logs\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.020363 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.020498 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmmsf\" (UniqueName: \"kubernetes.io/projected/4b8c6876-ac7a-4462-be8c-05fc63eb086e-kube-api-access-cmmsf\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.122064 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b8c6876-ac7a-4462-be8c-05fc63eb086e-logs\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.122194 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.122243 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmmsf\" (UniqueName: \"kubernetes.io/projected/4b8c6876-ac7a-4462-be8c-05fc63eb086e-kube-api-access-cmmsf\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.122283 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-config-data\") pod \"nova-api-0\" (UID: 
\"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.123263 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b8c6876-ac7a-4462-be8c-05fc63eb086e-logs\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.128383 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.136417 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-config-data\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.148364 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmmsf\" (UniqueName: \"kubernetes.io/projected/4b8c6876-ac7a-4462-be8c-05fc63eb086e-kube-api-access-cmmsf\") pod \"nova-api-0\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.153580 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a0f417-8f6f-4fb2-93a1-1587c57dc814" path="/var/lib/kubelet/pods/66a0f417-8f6f-4fb2-93a1-1587c57dc814/volumes" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.154307 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9843a3-eed7-4c1f-b2fd-aae3e214fe87" path="/var/lib/kubelet/pods/ca9843a3-eed7-4c1f-b2fd-aae3e214fe87/volumes" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.282188 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.759049 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:02:52 crc kubenswrapper[4939]: W0318 16:02:52.769074 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b8c6876_ac7a_4462_be8c_05fc63eb086e.slice/crio-12c84544d9258a19903c378505e384e0aa2b90a3ce855b929262bc350e329a32 WatchSource:0}: Error finding container 12c84544d9258a19903c378505e384e0aa2b90a3ce855b929262bc350e329a32: Status 404 returned error can't find the container with id 12c84544d9258a19903c378505e384e0aa2b90a3ce855b929262bc350e329a32 Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.897143 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1f552e3e-ab72-491d-a1e3-e44e86a523be","Type":"ContainerStarted","Data":"bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417"} Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.898654 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b8c6876-ac7a-4462-be8c-05fc63eb086e","Type":"ContainerStarted","Data":"12c84544d9258a19903c378505e384e0aa2b90a3ce855b929262bc350e329a32"} Mar 18 16:02:52 crc kubenswrapper[4939]: I0318 16:02:52.926447 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.926429757 podStartE2EDuration="2.926429757s" podCreationTimestamp="2026-03-18 16:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:52.919767276 +0000 UTC m=+1537.518954897" watchObservedRunningTime="2026-03-18 16:02:52.926429757 +0000 UTC m=+1537.525617378" Mar 18 16:02:53 crc kubenswrapper[4939]: I0318 16:02:53.915715 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b8c6876-ac7a-4462-be8c-05fc63eb086e","Type":"ContainerStarted","Data":"08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca"} Mar 18 16:02:53 crc kubenswrapper[4939]: I0318 16:02:53.916169 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b8c6876-ac7a-4462-be8c-05fc63eb086e","Type":"ContainerStarted","Data":"21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc"} Mar 18 16:02:53 crc kubenswrapper[4939]: I0318 16:02:53.937319 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.937300337 podStartE2EDuration="2.937300337s" podCreationTimestamp="2026-03-18 16:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:53.933365434 +0000 UTC m=+1538.532553075" watchObservedRunningTime="2026-03-18 16:02:53.937300337 +0000 UTC m=+1538.536487958" Mar 18 16:02:56 crc kubenswrapper[4939]: I0318 16:02:56.227773 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 16:02:56 crc kubenswrapper[4939]: I0318 16:02:56.249183 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 16:02:57 crc kubenswrapper[4939]: I0318 16:02:57.547555 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" 
Mar 18 16:02:57 crc kubenswrapper[4939]: I0318 16:02:57.547867 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:02:58 crc kubenswrapper[4939]: I0318 16:02:58.568858 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:02:58 crc kubenswrapper[4939]: I0318 16:02:58.568892 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:03:01 crc kubenswrapper[4939]: I0318 16:03:01.226915 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 16:03:01 crc kubenswrapper[4939]: I0318 16:03:01.266151 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 16:03:02 crc kubenswrapper[4939]: I0318 16:03:02.027058 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 16:03:02 crc kubenswrapper[4939]: I0318 16:03:02.282840 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:03:02 crc kubenswrapper[4939]: I0318 16:03:02.282903 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:03:03 crc kubenswrapper[4939]: I0318 16:03:03.365758 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:03:03 crc kubenswrapper[4939]: I0318 16:03:03.365813 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:03:05 crc kubenswrapper[4939]: I0318 16:03:05.547391 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:03:05 crc kubenswrapper[4939]: I0318 16:03:05.547987 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:03:07 crc kubenswrapper[4939]: I0318 16:03:07.553655 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:03:07 crc kubenswrapper[4939]: I0318 16:03:07.554637 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:03:07 crc kubenswrapper[4939]: I0318 16:03:07.559715 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:03:08 crc kubenswrapper[4939]: I0318 16:03:08.057359 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.063947 
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.064068 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"816c45f6-05f7-4cc3-818a-7af3cd52c0f0","Type":"ContainerDied","Data":"c8b177f35e38719eb0605090d30ecb234f6c72120376163fb7c98a38321a3d5c"}
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.064368 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"816c45f6-05f7-4cc3-818a-7af3cd52c0f0","Type":"ContainerDied","Data":"29127ebb4dc9c8ac248c02f8abce1ceffde00b079e38c843e2665325f2ac23a7"}
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.064386 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29127ebb4dc9c8ac248c02f8abce1ceffde00b079e38c843e2665325f2ac23a7"
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.085145 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.099454 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.169959 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-combined-ca-bundle\") pod \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") "
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.170252 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-config-data\") pod \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") "
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.170346 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5nm8\" (UniqueName: \"kubernetes.io/projected/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-kube-api-access-j5nm8\") pod \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\" (UID: \"816c45f6-05f7-4cc3-818a-7af3cd52c0f0\") "
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.176964 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-kube-api-access-j5nm8" (OuterVolumeSpecName: "kube-api-access-j5nm8") pod "816c45f6-05f7-4cc3-818a-7af3cd52c0f0" (UID: "816c45f6-05f7-4cc3-818a-7af3cd52c0f0"). InnerVolumeSpecName "kube-api-access-j5nm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.224705 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "816c45f6-05f7-4cc3-818a-7af3cd52c0f0" (UID: "816c45f6-05f7-4cc3-818a-7af3cd52c0f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.246718 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-config-data" (OuterVolumeSpecName: "config-data") pod "816c45f6-05f7-4cc3-818a-7af3cd52c0f0" (UID: "816c45f6-05f7-4cc3-818a-7af3cd52c0f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.273750 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.273779 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5nm8\" (UniqueName: \"kubernetes.io/projected/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-kube-api-access-j5nm8\") on node \"crc\" DevicePath \"\""
Mar 18 16:03:09 crc kubenswrapper[4939]: I0318 16:03:09.273789 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/816c45f6-05f7-4cc3-818a-7af3cd52c0f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.071497 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.116863 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.126274 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.143055 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="816c45f6-05f7-4cc3-818a-7af3cd52c0f0" path="/var/lib/kubelet/pods/816c45f6-05f7-4cc3-818a-7af3cd52c0f0/volumes"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.143636 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 16:03:10 crc kubenswrapper[4939]: E0318 16:03:10.143993 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c45f6-05f7-4cc3-818a-7af3cd52c0f0" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.144008 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c45f6-05f7-4cc3-818a-7af3cd52c0f0" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.144179 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="816c45f6-05f7-4cc3-818a-7af3cd52c0f0" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.144897 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.147208 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.148499 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.150975 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.166541 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.190939 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.191109 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.191244 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zwz7\" (UniqueName: \"kubernetes.io/projected/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-kube-api-access-6zwz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.191545 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.191661 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.283121 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.283207 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.293827 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.293879 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.293915 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zwz7\" (UniqueName: \"kubernetes.io/projected/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-kube-api-access-6zwz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.293967 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.293990 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.310231 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.310280 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.310294 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.310554 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.314126 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zwz7\" (UniqueName: \"kubernetes.io/projected/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-kube-api-access-6zwz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.467806 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:03:10 crc kubenswrapper[4939]: W0318 16:03:10.892227 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddccb7f64_b0e8_4fc1_b1d7_1a24eec17666.slice/crio-26b1b68141ee4115efbefc73f870731415408c53354fcc9d891efecb5768be41 WatchSource:0}: Error finding container 26b1b68141ee4115efbefc73f870731415408c53354fcc9d891efecb5768be41: Status 404 returned error can't find the container with id 26b1b68141ee4115efbefc73f870731415408c53354fcc9d891efecb5768be41 Mar 18 16:03:10 crc kubenswrapper[4939]: I0318 16:03:10.892851 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:03:11 crc kubenswrapper[4939]: I0318 16:03:11.082578 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666","Type":"ContainerStarted","Data":"26b1b68141ee4115efbefc73f870731415408c53354fcc9d891efecb5768be41"} Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.094238 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666","Type":"ContainerStarted","Data":"fd953f11b183a67f290c13bd5195a8236a5c036552d0d61a21e2836c0d47adfb"} Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.116436 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.116419 podStartE2EDuration="2.116419s" podCreationTimestamp="2026-03-18 16:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:12.113400354 +0000 UTC m=+1556.712587975" watchObservedRunningTime="2026-03-18 16:03:12.116419 +0000 UTC m=+1556.715606621" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.286288 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.286365 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.291084 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.293179 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.466199 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rqsf2"] Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.468047 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.489140 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rqsf2"] Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.576078 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.576209 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.576231 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.576273 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2cs\" (UniqueName: \"kubernetes.io/projected/be388cde-0dc7-4b42-a62c-f790b70391c6-kube-api-access-pt2cs\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.576306 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.576333 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-config\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.677450 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-config\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.677516 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.677641 4939 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.677662 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.677705 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2cs\" (UniqueName: \"kubernetes.io/projected/be388cde-0dc7-4b42-a62c-f790b70391c6-kube-api-access-pt2cs\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.677734 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.678579 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-config\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.678594 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.678637 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.678732 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.678992 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.703159 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2cs\" (UniqueName: 
\"kubernetes.io/projected/be388cde-0dc7-4b42-a62c-f790b70391c6-kube-api-access-pt2cs\") pod \"dnsmasq-dns-89c5cd4d5-rqsf2\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:12 crc kubenswrapper[4939]: I0318 16:03:12.792784 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:13 crc kubenswrapper[4939]: I0318 16:03:13.349136 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rqsf2"] Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.112173 4939 generic.go:334] "Generic (PLEG): container finished" podID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerID="d3c3b598ec28d293c825f1d0d8a6e7d069aaa425447f5ae732747e50bc76e8c6" exitCode=0 Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.112291 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" event={"ID":"be388cde-0dc7-4b42-a62c-f790b70391c6","Type":"ContainerDied","Data":"d3c3b598ec28d293c825f1d0d8a6e7d069aaa425447f5ae732747e50bc76e8c6"} Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.112556 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" event={"ID":"be388cde-0dc7-4b42-a62c-f790b70391c6","Type":"ContainerStarted","Data":"fcb0c665b45b585050778d9fdd48d53c42fe0e155c812c70d4f78934cfc0de1d"} Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.608557 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.609126 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-central-agent" containerID="cri-o://285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940" gracePeriod=30 Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.609172 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="proxy-httpd" containerID="cri-o://41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b" gracePeriod=30 Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.609218 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-notification-agent" containerID="cri-o://9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca" gracePeriod=30 Mar 18 16:03:14 crc kubenswrapper[4939]: I0318 16:03:14.609218 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="sg-core" containerID="cri-o://e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1" gracePeriod=30 Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.126850 4939 generic.go:334] "Generic (PLEG): container finished" podID="0a123209-990a-4fcc-a66a-967ab7007653" containerID="41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b" exitCode=0 Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.127411 4939 generic.go:334] "Generic (PLEG): container finished" podID="0a123209-990a-4fcc-a66a-967ab7007653" containerID="e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1" exitCode=2 Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 
Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.127644 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerDied","Data":"41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b"}
Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.127778 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerDied","Data":"e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1"}
Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.127861 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerDied","Data":"285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940"}
Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.130442 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" event={"ID":"be388cde-0dc7-4b42-a62c-f790b70391c6","Type":"ContainerStarted","Data":"4aa3a3cedd5d1ba6985398a8cde097556bb47737cbb52dca79f049e1751ffcc0"}
Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.130755 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2"
Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.164471 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" podStartSLOduration=3.164449249 podStartE2EDuration="3.164449249s" podCreationTimestamp="2026-03-18 16:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:15.154952809 +0000 UTC m=+1559.754140440" watchObservedRunningTime="2026-03-18 16:03:15.164449249 +0000 UTC m=+1559.763636870"
Mar 18 16:03:15 crc kubenswrapper[4939]: I0318 16:03:15.468755 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 16:03:16 crc kubenswrapper[4939]: I0318 16:03:16.108645 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 16:03:16 crc kubenswrapper[4939]: I0318 16:03:16.108898 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-log" containerID="cri-o://21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc" gracePeriod=30
Mar 18 16:03:16 crc kubenswrapper[4939]: I0318 16:03:16.109479 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-api" containerID="cri-o://08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca" gracePeriod=30
Mar 18 16:03:16 crc kubenswrapper[4939]: I0318 16:03:16.544965 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 16:03:16 crc kubenswrapper[4939]: I0318 16:03:16.545673 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="36eb8417-2815-4eea-987e-d05be6eb90e9" containerName="kube-state-metrics" containerID="cri-o://e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c" gracePeriod=30
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.108959 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.148551 4939 generic.go:334] "Generic (PLEG): container finished" podID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerID="21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc" exitCode=143
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.148612 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b8c6876-ac7a-4462-be8c-05fc63eb086e","Type":"ContainerDied","Data":"21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc"}
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.149896 4939 generic.go:334] "Generic (PLEG): container finished" podID="36eb8417-2815-4eea-987e-d05be6eb90e9" containerID="e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c" exitCode=2
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.149942 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"36eb8417-2815-4eea-987e-d05be6eb90e9","Type":"ContainerDied","Data":"e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c"}
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.149966 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"36eb8417-2815-4eea-987e-d05be6eb90e9","Type":"ContainerDied","Data":"72eae1310188b05753171d0524380b41e47b879d10033b781ab8ad8a8f966dbe"}
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.149982 4939 scope.go:117] "RemoveContainer" containerID="e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.150103 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.198227 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2m88\" (UniqueName: \"kubernetes.io/projected/36eb8417-2815-4eea-987e-d05be6eb90e9-kube-api-access-s2m88\") pod \"36eb8417-2815-4eea-987e-d05be6eb90e9\" (UID: \"36eb8417-2815-4eea-987e-d05be6eb90e9\") "
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.257298 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36eb8417-2815-4eea-987e-d05be6eb90e9-kube-api-access-s2m88" (OuterVolumeSpecName: "kube-api-access-s2m88") pod "36eb8417-2815-4eea-987e-d05be6eb90e9" (UID: "36eb8417-2815-4eea-987e-d05be6eb90e9"). InnerVolumeSpecName "kube-api-access-s2m88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.268433 4939 scope.go:117] "RemoveContainer" containerID="e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c"
Mar 18 16:03:17 crc kubenswrapper[4939]: E0318 16:03:17.293925 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c\": container with ID starting with e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c not found: ID does not exist" containerID="e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.293983 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c"} err="failed to get container status \"e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c\": rpc error: code = NotFound desc = could not find container \"e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c\": container with ID starting with e08221d8cca8dc9ea01531a9d40c75de41471d7f85c85721ed37b9e7f83a069c not found: ID does not exist"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.302905 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2m88\" (UniqueName: \"kubernetes.io/projected/36eb8417-2815-4eea-987e-d05be6eb90e9-kube-api-access-s2m88\") on node \"crc\" DevicePath \"\""
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.485842 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.497473 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.507318 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 16:03:17 crc kubenswrapper[4939]: E0318 16:03:17.507684 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eb8417-2815-4eea-987e-d05be6eb90e9" containerName="kube-state-metrics"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.507701 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eb8417-2815-4eea-987e-d05be6eb90e9" containerName="kube-state-metrics"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.507916 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eb8417-2815-4eea-987e-d05be6eb90e9" containerName="kube-state-metrics"
Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.508494 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.517071 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.517314 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.547877 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.608161 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.609090 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.609282 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.609447 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhvh\" (UniqueName: \"kubernetes.io/projected/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-api-access-zhhvh\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.711555 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.711933 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhhvh\" (UniqueName: \"kubernetes.io/projected/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-api-access-zhhvh\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.711991 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.712115 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.721608 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.727047 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.729341 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.750582 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhvh\" (UniqueName: \"kubernetes.io/projected/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-api-access-zhhvh\") pod \"kube-state-metrics-0\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.912920 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:03:17 crc kubenswrapper[4939]: I0318 16:03:17.953981 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025433 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-scripts\") pod \"0a123209-990a-4fcc-a66a-967ab7007653\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025497 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-run-httpd\") pod \"0a123209-990a-4fcc-a66a-967ab7007653\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025566 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn9hm\" (UniqueName: \"kubernetes.io/projected/0a123209-990a-4fcc-a66a-967ab7007653-kube-api-access-bn9hm\") pod \"0a123209-990a-4fcc-a66a-967ab7007653\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025640 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-config-data\") pod \"0a123209-990a-4fcc-a66a-967ab7007653\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025667 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-sg-core-conf-yaml\") pod \"0a123209-990a-4fcc-a66a-967ab7007653\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025746 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-log-httpd\") pod \"0a123209-990a-4fcc-a66a-967ab7007653\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025767 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-combined-ca-bundle\") pod \"0a123209-990a-4fcc-a66a-967ab7007653\" (UID: \"0a123209-990a-4fcc-a66a-967ab7007653\") " Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.025890 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a123209-990a-4fcc-a66a-967ab7007653" (UID: "0a123209-990a-4fcc-a66a-967ab7007653"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.026266 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a123209-990a-4fcc-a66a-967ab7007653" (UID: "0a123209-990a-4fcc-a66a-967ab7007653"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.026311 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.029589 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-scripts" (OuterVolumeSpecName: "scripts") pod "0a123209-990a-4fcc-a66a-967ab7007653" (UID: "0a123209-990a-4fcc-a66a-967ab7007653"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.030790 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a123209-990a-4fcc-a66a-967ab7007653-kube-api-access-bn9hm" (OuterVolumeSpecName: "kube-api-access-bn9hm") pod "0a123209-990a-4fcc-a66a-967ab7007653" (UID: "0a123209-990a-4fcc-a66a-967ab7007653"). InnerVolumeSpecName "kube-api-access-bn9hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.051726 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a123209-990a-4fcc-a66a-967ab7007653" (UID: "0a123209-990a-4fcc-a66a-967ab7007653"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.124923 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a123209-990a-4fcc-a66a-967ab7007653" (UID: "0a123209-990a-4fcc-a66a-967ab7007653"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.128251 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.128292 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn9hm\" (UniqueName: \"kubernetes.io/projected/0a123209-990a-4fcc-a66a-967ab7007653-kube-api-access-bn9hm\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.128307 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.128319 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a123209-990a-4fcc-a66a-967ab7007653-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.128332 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.144638 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36eb8417-2815-4eea-987e-d05be6eb90e9" path="/var/lib/kubelet/pods/36eb8417-2815-4eea-987e-d05be6eb90e9/volumes" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.163399 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-config-data" (OuterVolumeSpecName: "config-data") pod "0a123209-990a-4fcc-a66a-967ab7007653" (UID: "0a123209-990a-4fcc-a66a-967ab7007653"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.165549 4939 generic.go:334] "Generic (PLEG): container finished" podID="0a123209-990a-4fcc-a66a-967ab7007653" containerID="9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca" exitCode=0 Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.165595 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerDied","Data":"9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca"} Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.165628 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a123209-990a-4fcc-a66a-967ab7007653","Type":"ContainerDied","Data":"95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67"} Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.165648 4939 scope.go:117] "RemoveContainer" containerID="41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.165728 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.203890 4939 scope.go:117] "RemoveContainer" containerID="e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.215239 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.231030 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a123209-990a-4fcc-a66a-967ab7007653-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.240811 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.248184 4939 scope.go:117] "RemoveContainer" containerID="9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.252176 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.252753 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="proxy-httpd" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.252770 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="proxy-httpd" Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.252790 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-central-agent" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.252797 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-central-agent" Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.252809 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="sg-core" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.252816 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="sg-core" Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.252827 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-notification-agent" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.252835 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-notification-agent" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.253029 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-central-agent" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.253051 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="proxy-httpd" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.253062 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="ceilometer-notification-agent" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.253080 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a123209-990a-4fcc-a66a-967ab7007653" containerName="sg-core" Mar 18 
16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.257465 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.260880 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.261097 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.264167 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.282892 4939 scope.go:117] "RemoveContainer" containerID="285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.313268 4939 scope.go:117] "RemoveContainer" containerID="41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b" Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.314180 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b\": container with ID starting with 41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b not found: ID does not exist" containerID="41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.314240 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b"} err="failed to get container status \"41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b\": rpc error: code = NotFound desc = could not find container \"41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b\": container with ID starting with 41cb867ba10383c774dffa454bff7515ae8e514abdc427818b0ec1febd34393b not found: ID does not exist" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.314304 4939 scope.go:117] "RemoveContainer" containerID="e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1" Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.314869 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1\": container with ID starting with e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1 not found: ID does not exist" containerID="e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.314911 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1"} err="failed to get container status \"e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1\": rpc error: code = NotFound desc = could not find container \"e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1\": container with ID starting with e721f50c919f0f7d103c1bde54b5c286f26848704f82ad6cf6ed7e1ea5c4edb1 not found: ID does not exist" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.314934 4939 scope.go:117] "RemoveContainer" containerID="9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca" Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.315220 4939 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca\": container with ID starting with 9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca not found: ID does not exist" containerID="9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.315242 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca"} err="failed to get container status \"9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca\": rpc error: code = NotFound desc = could not find container \"9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca\": container with ID starting with 9edb9ed1343def971831d46bd53e8fbee469dedee415698241949bb848a233ca not found: ID does not exist" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.315256 4939 scope.go:117] "RemoveContainer" containerID="285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940" Mar 18 16:03:18 crc kubenswrapper[4939]: E0318 16:03:18.315551 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940\": container with ID starting with 285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940 not found: ID does not exist" containerID="285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.315580 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940"} err="failed to get container status \"285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940\": rpc error: code = NotFound desc = could not find container \"285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940\": container with ID starting with 285f27ac31abffcf17f84fdf96658a21a7fc9e76fb89f3a551c15e1ed1dc7940 not found: ID does not exist" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.333788 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.333881 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.333913 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6rw\" (UniqueName: \"kubernetes.io/projected/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-kube-api-access-kf6rw\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.334061 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-scripts\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.334118 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-log-httpd\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.334149 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-config-data\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.334189 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-run-httpd\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.394489 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.435621 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-scripts\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.436082 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-log-httpd\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.436120 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-config-data\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.436150 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-run-httpd\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.436204 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.436267 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc 
kubenswrapper[4939]: I0318 16:03:18.436286 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6rw\" (UniqueName: \"kubernetes.io/projected/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-kube-api-access-kf6rw\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.436979 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-log-httpd\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.437051 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-run-httpd\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.441731 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-scripts\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.442143 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-config-data\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.442732 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.442736 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.458025 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6rw\" (UniqueName: \"kubernetes.io/projected/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-kube-api-access-kf6rw\") pod \"ceilometer-0\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " pod="openstack/ceilometer-0" Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.583716 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:18 crc kubenswrapper[4939]: I0318 16:03:18.584394 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.047859 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.187124 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerStarted","Data":"aec74e00490a1a1f757f017935dbd0c13bb52c0c4b4fb6fb6c3b3eee35a59b3c"} Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.190762 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51b567b0-935e-46f6-8cf7-3c8a9040bad4","Type":"ContainerStarted","Data":"286772eca49783d61fb3dec6079a9246af7f326c26fda03b975b7c1244c77555"} Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.190815 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51b567b0-935e-46f6-8cf7-3c8a9040bad4","Type":"ContainerStarted","Data":"eddc21d2a7edc5f56890f75c04a8c25d5d8793651129b3625fe9c7fadd124f43"} Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.191019 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.209718 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8281383949999999 podStartE2EDuration="2.209703853s" podCreationTimestamp="2026-03-18 16:03:17 +0000 UTC" firstStartedPulling="2026-03-18 16:03:18.401254699 +0000 UTC m=+1563.000442320" lastFinishedPulling="2026-03-18 16:03:18.782820157 +0000 UTC m=+1563.382007778" observedRunningTime="2026-03-18 16:03:19.205818423 +0000 UTC m=+1563.805006044" watchObservedRunningTime="2026-03-18 16:03:19.209703853 +0000 UTC m=+1563.808891474" Mar 18 16:03:19 crc kubenswrapper[4939]: E0318 16:03:19.588188 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a123209_990a_4fcc_a66a_967ab7007653.slice/crio-95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67\": RecentStats: unable to find data in memory cache]" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.750707 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.872868 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-config-data\") pod \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.872917 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmmsf\" (UniqueName: \"kubernetes.io/projected/4b8c6876-ac7a-4462-be8c-05fc63eb086e-kube-api-access-cmmsf\") pod \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.872982 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-combined-ca-bundle\") pod \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.873036 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b8c6876-ac7a-4462-be8c-05fc63eb086e-logs\") pod \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\" (UID: \"4b8c6876-ac7a-4462-be8c-05fc63eb086e\") " Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.873792 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b8c6876-ac7a-4462-be8c-05fc63eb086e-logs" (OuterVolumeSpecName: "logs") pod "4b8c6876-ac7a-4462-be8c-05fc63eb086e" (UID: "4b8c6876-ac7a-4462-be8c-05fc63eb086e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.874415 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b8c6876-ac7a-4462-be8c-05fc63eb086e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.878397 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b8c6876-ac7a-4462-be8c-05fc63eb086e-kube-api-access-cmmsf" (OuterVolumeSpecName: "kube-api-access-cmmsf") pod "4b8c6876-ac7a-4462-be8c-05fc63eb086e" (UID: "4b8c6876-ac7a-4462-be8c-05fc63eb086e"). InnerVolumeSpecName "kube-api-access-cmmsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.916733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b8c6876-ac7a-4462-be8c-05fc63eb086e" (UID: "4b8c6876-ac7a-4462-be8c-05fc63eb086e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.944946 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-config-data" (OuterVolumeSpecName: "config-data") pod "4b8c6876-ac7a-4462-be8c-05fc63eb086e" (UID: "4b8c6876-ac7a-4462-be8c-05fc63eb086e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.978712 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.978742 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmmsf\" (UniqueName: \"kubernetes.io/projected/4b8c6876-ac7a-4462-be8c-05fc63eb086e-kube-api-access-cmmsf\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:19 crc kubenswrapper[4939]: I0318 16:03:19.978755 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b8c6876-ac7a-4462-be8c-05fc63eb086e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.146000 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a123209-990a-4fcc-a66a-967ab7007653" path="/var/lib/kubelet/pods/0a123209-990a-4fcc-a66a-967ab7007653/volumes" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.201380 4939 generic.go:334] "Generic (PLEG): container finished" podID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerID="08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca" exitCode=0 Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.201466 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.202571 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b8c6876-ac7a-4462-be8c-05fc63eb086e","Type":"ContainerDied","Data":"08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca"} Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.202685 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b8c6876-ac7a-4462-be8c-05fc63eb086e","Type":"ContainerDied","Data":"12c84544d9258a19903c378505e384e0aa2b90a3ce855b929262bc350e329a32"} Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.202768 4939 scope.go:117] "RemoveContainer" containerID="08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.206751 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerStarted","Data":"8cf83e2d1e562d111d8e113bf14f68c990c8bfbe6e899916c1afb6b3e7c0b429"} Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.228168 4939 scope.go:117] "RemoveContainer" containerID="21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.233019 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.247126 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.260894 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:20 crc kubenswrapper[4939]: E0318 16:03:20.261874 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-api" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.261900 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-api" Mar 18 16:03:20 crc kubenswrapper[4939]: E0318 16:03:20.261936 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-log" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.261947 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-log" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.262307 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-log" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.262378 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" containerName="nova-api-api" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.263990 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.268187 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.268460 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.268602 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.271881 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.272057 4939 scope.go:117] "RemoveContainer" containerID="08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca" Mar 18 16:03:20 crc kubenswrapper[4939]: E0318 16:03:20.282449 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca\": container with ID starting with 08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca not found: ID does not exist" containerID="08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.282736 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca"} err="failed to get container status \"08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca\": rpc error: code = NotFound desc = could not find container \"08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca\": container with ID starting with 08acb6a83dac393a9150eae8a980997bf95c8070e449169c7d3a9c9e8eda23ca not found: ID does not exist" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.282771 4939 scope.go:117] "RemoveContainer" containerID="21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc" Mar 18 16:03:20 crc kubenswrapper[4939]: E0318 16:03:20.284329 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc\": container with ID starting with 21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc not found: ID does not exist" 
containerID="21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.284380 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc"} err="failed to get container status \"21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc\": rpc error: code = NotFound desc = could not find container \"21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc\": container with ID starting with 21e0931b2149b86efa2f9705fef00930498c612b269cd11c054c91bc1045cdbc not found: ID does not exist" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.392563 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-config-data\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.392677 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.392697 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3410c1e-8e0a-4e7f-921b-439936fa77b8-logs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.392941 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.393030 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhnc5\" (UniqueName: \"kubernetes.io/projected/c3410c1e-8e0a-4e7f-921b-439936fa77b8-kube-api-access-vhnc5\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.393095 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.469013 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.491189 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.494914 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-config-data\") pod \"nova-api-0\" (UID: 
\"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.494990 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.495020 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3410c1e-8e0a-4e7f-921b-439936fa77b8-logs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.495086 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.495117 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhnc5\" (UniqueName: \"kubernetes.io/projected/c3410c1e-8e0a-4e7f-921b-439936fa77b8-kube-api-access-vhnc5\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.495158 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.495765 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3410c1e-8e0a-4e7f-921b-439936fa77b8-logs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.499077 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.499263 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.499959 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.500623 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-config-data\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 
16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.518663 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhnc5\" (UniqueName: \"kubernetes.io/projected/c3410c1e-8e0a-4e7f-921b-439936fa77b8-kube-api-access-vhnc5\") pod \"nova-api-0\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " pod="openstack/nova-api-0" Mar 18 16:03:20 crc kubenswrapper[4939]: I0318 16:03:20.604035 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.075334 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:21 crc kubenswrapper[4939]: W0318 16:03:21.080151 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3410c1e_8e0a_4e7f_921b_439936fa77b8.slice/crio-0eea09975d527be3af4cd123ea13b81f3873fe763fe50f9afd92866c27560aa0 WatchSource:0}: Error finding container 0eea09975d527be3af4cd123ea13b81f3873fe763fe50f9afd92866c27560aa0: Status 404 returned error can't find the container with id 0eea09975d527be3af4cd123ea13b81f3873fe763fe50f9afd92866c27560aa0 Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.219084 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3410c1e-8e0a-4e7f-921b-439936fa77b8","Type":"ContainerStarted","Data":"0eea09975d527be3af4cd123ea13b81f3873fe763fe50f9afd92866c27560aa0"} Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.221946 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerStarted","Data":"96a327ea52ed3bd0e30cba99e2dd8bb3b2bbeec6107f31009cbab51e0e0170ef"} Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.240573 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.521793 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mghgn"] Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.523688 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.526890 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.527034 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.540820 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mghgn"] Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.617861 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-config-data\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.617898 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.617922 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-scripts\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.617972 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-472j7\" (UniqueName: \"kubernetes.io/projected/1eb451f8-d612-468e-9514-80d063ea89e6-kube-api-access-472j7\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.719213 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-config-data\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.719255 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.719278 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-scripts\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.719315 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-472j7\" (UniqueName: 
\"kubernetes.io/projected/1eb451f8-d612-468e-9514-80d063ea89e6-kube-api-access-472j7\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.723980 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-config-data\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.729369 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-scripts\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.730496 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.738711 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-472j7\" (UniqueName: \"kubernetes.io/projected/1eb451f8-d612-468e-9514-80d063ea89e6-kube-api-access-472j7\") pod \"nova-cell1-cell-mapping-mghgn\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:21 crc kubenswrapper[4939]: I0318 16:03:21.840647 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.146229 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b8c6876-ac7a-4462-be8c-05fc63eb086e" path="/var/lib/kubelet/pods/4b8c6876-ac7a-4462-be8c-05fc63eb086e/volumes" Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.233772 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3410c1e-8e0a-4e7f-921b-439936fa77b8","Type":"ContainerStarted","Data":"6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7"} Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.233831 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3410c1e-8e0a-4e7f-921b-439936fa77b8","Type":"ContainerStarted","Data":"51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b"} Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.237946 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerStarted","Data":"e507dc33b5e9f5d472b1db0f1b93c904c76d29e02bd778951ecc2b9a80156efa"} Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.262282 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.262262991 podStartE2EDuration="2.262262991s" podCreationTimestamp="2026-03-18 16:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:22.254219312 +0000 UTC m=+1566.853406933" watchObservedRunningTime="2026-03-18 16:03:22.262262991 +0000 UTC m=+1566.861450612" Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.323638 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mghgn"] Mar 18 16:03:22 crc kubenswrapper[4939]: W0318 16:03:22.325706 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eb451f8_d612_468e_9514_80d063ea89e6.slice/crio-2a194918630ec8e71f482bac4582739d4afea03dda57f23eee82b16aa993fe40 WatchSource:0}: Error finding container 2a194918630ec8e71f482bac4582739d4afea03dda57f23eee82b16aa993fe40: Status 404 returned error can't find the container with id 2a194918630ec8e71f482bac4582739d4afea03dda57f23eee82b16aa993fe40 Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.795054 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.868317 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-w2vld"] Mar 18 16:03:22 crc kubenswrapper[4939]: I0318 16:03:22.868563 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" podUID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerName="dnsmasq-dns" containerID="cri-o://1eb5f49cac096d75dec04516429155d1100467413b34819ed66d63ec5fa1847e" gracePeriod=10 Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.257099 4939 generic.go:334] "Generic (PLEG): container finished" podID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerID="1eb5f49cac096d75dec04516429155d1100467413b34819ed66d63ec5fa1847e" exitCode=0 Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.257149 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-757b4f8459-w2vld" event={"ID":"f0645b84-f786-4e86-b405-64bcf3e5479a","Type":"ContainerDied","Data":"1eb5f49cac096d75dec04516429155d1100467413b34819ed66d63ec5fa1847e"} Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.258865 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mghgn" event={"ID":"1eb451f8-d612-468e-9514-80d063ea89e6","Type":"ContainerStarted","Data":"758d17fe92530594fb55767b69458f7e5b9641b4227fd7a174da37c39963bf7e"} Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.258919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mghgn" event={"ID":"1eb451f8-d612-468e-9514-80d063ea89e6","Type":"ContainerStarted","Data":"2a194918630ec8e71f482bac4582739d4afea03dda57f23eee82b16aa993fe40"} Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.290009 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mghgn" podStartSLOduration=2.289967884 podStartE2EDuration="2.289967884s" podCreationTimestamp="2026-03-18 16:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:23.274271337 +0000 UTC m=+1567.873458958" watchObservedRunningTime="2026-03-18 16:03:23.289967884 +0000 UTC m=+1567.889155505" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.397259 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.456747 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-swift-storage-0\") pod \"f0645b84-f786-4e86-b405-64bcf3e5479a\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.457071 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-nb\") pod \"f0645b84-f786-4e86-b405-64bcf3e5479a\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.457320 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-sb\") pod \"f0645b84-f786-4e86-b405-64bcf3e5479a\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.457451 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-svc\") pod \"f0645b84-f786-4e86-b405-64bcf3e5479a\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.457567 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-config\") pod \"f0645b84-f786-4e86-b405-64bcf3e5479a\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.457698 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjwvg\" (UniqueName: 
\"kubernetes.io/projected/f0645b84-f786-4e86-b405-64bcf3e5479a-kube-api-access-cjwvg\") pod \"f0645b84-f786-4e86-b405-64bcf3e5479a\" (UID: \"f0645b84-f786-4e86-b405-64bcf3e5479a\") " Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.476118 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0645b84-f786-4e86-b405-64bcf3e5479a-kube-api-access-cjwvg" (OuterVolumeSpecName: "kube-api-access-cjwvg") pod "f0645b84-f786-4e86-b405-64bcf3e5479a" (UID: "f0645b84-f786-4e86-b405-64bcf3e5479a"). InnerVolumeSpecName "kube-api-access-cjwvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.515282 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0645b84-f786-4e86-b405-64bcf3e5479a" (UID: "f0645b84-f786-4e86-b405-64bcf3e5479a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.520265 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0645b84-f786-4e86-b405-64bcf3e5479a" (UID: "f0645b84-f786-4e86-b405-64bcf3e5479a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.525936 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-config" (OuterVolumeSpecName: "config") pod "f0645b84-f786-4e86-b405-64bcf3e5479a" (UID: "f0645b84-f786-4e86-b405-64bcf3e5479a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.527436 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0645b84-f786-4e86-b405-64bcf3e5479a" (UID: "f0645b84-f786-4e86-b405-64bcf3e5479a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.553519 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0645b84-f786-4e86-b405-64bcf3e5479a" (UID: "f0645b84-f786-4e86-b405-64bcf3e5479a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.559910 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.559947 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.559957 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.559966 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.559975 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0645b84-f786-4e86-b405-64bcf3e5479a-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:23 crc kubenswrapper[4939]: I0318 16:03:23.559982 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjwvg\" (UniqueName: \"kubernetes.io/projected/f0645b84-f786-4e86-b405-64bcf3e5479a-kube-api-access-cjwvg\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:24 crc kubenswrapper[4939]: I0318 16:03:24.340020 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" Mar 18 16:03:24 crc kubenswrapper[4939]: I0318 16:03:24.340772 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-w2vld" event={"ID":"f0645b84-f786-4e86-b405-64bcf3e5479a","Type":"ContainerDied","Data":"9d383ffd4dd412801486852e50c48657c5f1bd4aef259f60b8e45fdc06fae973"} Mar 18 16:03:24 crc kubenswrapper[4939]: I0318 16:03:24.340809 4939 scope.go:117] "RemoveContainer" containerID="1eb5f49cac096d75dec04516429155d1100467413b34819ed66d63ec5fa1847e" Mar 18 16:03:24 crc kubenswrapper[4939]: I0318 16:03:24.381095 4939 scope.go:117] "RemoveContainer" containerID="91175d6317c87e70c88d60795a34e70c160236d3230b7d173af7ff154643bd5c" Mar 18 16:03:24 crc kubenswrapper[4939]: I0318 16:03:24.392775 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-w2vld"] Mar 18 16:03:24 crc kubenswrapper[4939]: I0318 16:03:24.403721 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-w2vld"] Mar 18 16:03:25 crc kubenswrapper[4939]: I0318 16:03:25.353296 4939 generic.go:334] "Generic (PLEG): container finished" podID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerID="a2e20f81d888c77a1b6d633915868f827fbbaa9e2fd63b64183ace8c8813a23e" exitCode=1 Mar 18 16:03:25 crc kubenswrapper[4939]: I0318 16:03:25.353361 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerDied","Data":"a2e20f81d888c77a1b6d633915868f827fbbaa9e2fd63b64183ace8c8813a23e"} Mar 18 16:03:25 crc kubenswrapper[4939]: I0318 16:03:25.353605 4939 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="sg-core" containerID="cri-o://e507dc33b5e9f5d472b1db0f1b93c904c76d29e02bd778951ecc2b9a80156efa" gracePeriod=30 Mar 18 16:03:25 crc kubenswrapper[4939]: I0318 16:03:25.353690 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-central-agent" containerID="cri-o://8cf83e2d1e562d111d8e113bf14f68c990c8bfbe6e899916c1afb6b3e7c0b429" gracePeriod=30 Mar 18 16:03:25 crc kubenswrapper[4939]: I0318 16:03:25.353702 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-notification-agent" containerID="cri-o://96a327ea52ed3bd0e30cba99e2dd8bb3b2bbeec6107f31009cbab51e0e0170ef" gracePeriod=30 Mar 18 16:03:26 crc kubenswrapper[4939]: I0318 16:03:26.144591 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0645b84-f786-4e86-b405-64bcf3e5479a" path="/var/lib/kubelet/pods/f0645b84-f786-4e86-b405-64bcf3e5479a/volumes" Mar 18 16:03:26 crc kubenswrapper[4939]: I0318 16:03:26.371539 4939 generic.go:334] "Generic (PLEG): container finished" podID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerID="e507dc33b5e9f5d472b1db0f1b93c904c76d29e02bd778951ecc2b9a80156efa" exitCode=2 Mar 18 16:03:26 crc kubenswrapper[4939]: I0318 16:03:26.371570 4939 generic.go:334] "Generic (PLEG): container finished" podID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerID="96a327ea52ed3bd0e30cba99e2dd8bb3b2bbeec6107f31009cbab51e0e0170ef" exitCode=0 Mar 18 16:03:26 crc kubenswrapper[4939]: I0318 16:03:26.371557 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerDied","Data":"e507dc33b5e9f5d472b1db0f1b93c904c76d29e02bd778951ecc2b9a80156efa"} Mar 18 16:03:26 crc kubenswrapper[4939]: I0318 16:03:26.371630 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerDied","Data":"96a327ea52ed3bd0e30cba99e2dd8bb3b2bbeec6107f31009cbab51e0e0170ef"} Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.388270 4939 generic.go:334] "Generic (PLEG): container finished" podID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerID="8cf83e2d1e562d111d8e113bf14f68c990c8bfbe6e899916c1afb6b3e7c0b429" exitCode=0 Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.388356 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerDied","Data":"8cf83e2d1e562d111d8e113bf14f68c990c8bfbe6e899916c1afb6b3e7c0b429"} Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.683232 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760038 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-config-data\") pod \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760095 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-scripts\") pod \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760129 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-log-httpd\") pod \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760194 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf6rw\" (UniqueName: \"kubernetes.io/projected/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-kube-api-access-kf6rw\") pod \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760289 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-run-httpd\") pod \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-sg-core-conf-yaml\") pod \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760358 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-combined-ca-bundle\") pod \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\" (UID: \"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b\") " Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760903 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" (UID: "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.760977 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" (UID: "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.765524 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-kube-api-access-kf6rw" (OuterVolumeSpecName: "kube-api-access-kf6rw") pod "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" (UID: "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b"). InnerVolumeSpecName "kube-api-access-kf6rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.768697 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-scripts" (OuterVolumeSpecName: "scripts") pod "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" (UID: "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.788674 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" (UID: "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.833196 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" (UID: "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.862785 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.862823 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.862836 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf6rw\" (UniqueName: \"kubernetes.io/projected/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-kube-api-access-kf6rw\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.862849 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.862862 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.862873 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.873761 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-config-data" (OuterVolumeSpecName: "config-data") pod "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" (UID: "5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.924356 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 16:03:27 crc kubenswrapper[4939]: I0318 16:03:27.968034 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.399443 4939 generic.go:334] "Generic (PLEG): container finished" podID="1eb451f8-d612-468e-9514-80d063ea89e6" containerID="758d17fe92530594fb55767b69458f7e5b9641b4227fd7a174da37c39963bf7e" exitCode=0 Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.399572 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mghgn" event={"ID":"1eb451f8-d612-468e-9514-80d063ea89e6","Type":"ContainerDied","Data":"758d17fe92530594fb55767b69458f7e5b9641b4227fd7a174da37c39963bf7e"} Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.404093 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b","Type":"ContainerDied","Data":"aec74e00490a1a1f757f017935dbd0c13bb52c0c4b4fb6fb6c3b3eee35a59b3c"} Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.404157 4939 scope.go:117] "RemoveContainer" containerID="a2e20f81d888c77a1b6d633915868f827fbbaa9e2fd63b64183ace8c8813a23e" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.404106 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.425254 4939 scope.go:117] "RemoveContainer" containerID="e507dc33b5e9f5d472b1db0f1b93c904c76d29e02bd778951ecc2b9a80156efa" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.446020 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.470252 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.475228 4939 scope.go:117] "RemoveContainer" containerID="96a327ea52ed3bd0e30cba99e2dd8bb3b2bbeec6107f31009cbab51e0e0170ef" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.487671 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:28 crc kubenswrapper[4939]: E0318 16:03:28.488332 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-central-agent" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488364 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-central-agent" Mar 18 16:03:28 crc kubenswrapper[4939]: E0318 16:03:28.488398 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerName="init" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488409 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerName="init" Mar 18 16:03:28 crc kubenswrapper[4939]: E0318 16:03:28.488425 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerName="dnsmasq-dns" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488432 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerName="dnsmasq-dns" Mar 18 16:03:28 crc kubenswrapper[4939]: E0318 16:03:28.488465 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="proxy-httpd" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488477 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="proxy-httpd" Mar 18 16:03:28 crc kubenswrapper[4939]: E0318 16:03:28.488525 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="sg-core" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488535 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="sg-core" Mar 18 16:03:28 crc kubenswrapper[4939]: E0318 16:03:28.488570 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-notification-agent" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488581 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-notification-agent" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488824 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0645b84-f786-4e86-b405-64bcf3e5479a" containerName="dnsmasq-dns" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488856 4939 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-central-agent" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488879 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="sg-core" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488892 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="ceilometer-notification-agent" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.488914 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" containerName="proxy-httpd" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.491331 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.496000 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.497681 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.497885 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.505281 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.549629 4939 scope.go:117] "RemoveContainer" containerID="8cf83e2d1e562d111d8e113bf14f68c990c8bfbe6e899916c1afb6b3e7c0b429" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.578708 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.578772 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.578795 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.578815 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-run-httpd\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.578856 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-log-httpd\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " 
pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.578888 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-scripts\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.578910 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-config-data\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.579121 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x5vr\" (UniqueName: \"kubernetes.io/projected/1b6d7c3f-1c09-4bbd-8de1-df304376c198-kube-api-access-4x5vr\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.680821 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x5vr\" (UniqueName: \"kubernetes.io/projected/1b6d7c3f-1c09-4bbd-8de1-df304376c198-kube-api-access-4x5vr\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.680918 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.680945 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.680965 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.680983 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-run-httpd\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.681019 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-log-httpd\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.681048 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-scripts\") pod 
\"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.681067 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-config-data\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.682632 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-log-httpd\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.683109 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-run-httpd\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.686317 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.686519 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-scripts\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.686919 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-config-data\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.691211 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.696373 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.700659 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x5vr\" (UniqueName: \"kubernetes.io/projected/1b6d7c3f-1c09-4bbd-8de1-df304376c198-kube-api-access-4x5vr\") pod \"ceilometer-0\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " pod="openstack/ceilometer-0" Mar 18 16:03:28 crc kubenswrapper[4939]: I0318 16:03:28.820139 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.259243 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.413382 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerStarted","Data":"8272265b72ee4d4872c95fe5c0e90555bf8c90f06e68e3b6dd6431d7c44b6235"} Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.797869 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:29 crc kubenswrapper[4939]: E0318 16:03:29.895327 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a123209_990a_4fcc_a66a_967ab7007653.slice/crio-95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67\": RecentStats: unable to find data in memory cache]" Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.906050 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-combined-ca-bundle\") pod \"1eb451f8-d612-468e-9514-80d063ea89e6\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.906328 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-config-data\") pod \"1eb451f8-d612-468e-9514-80d063ea89e6\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.906451 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-472j7\" (UniqueName: \"kubernetes.io/projected/1eb451f8-d612-468e-9514-80d063ea89e6-kube-api-access-472j7\") pod \"1eb451f8-d612-468e-9514-80d063ea89e6\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.906595 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-scripts\") pod \"1eb451f8-d612-468e-9514-80d063ea89e6\" (UID: \"1eb451f8-d612-468e-9514-80d063ea89e6\") " Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.909727 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb451f8-d612-468e-9514-80d063ea89e6-kube-api-access-472j7" (OuterVolumeSpecName: "kube-api-access-472j7") pod "1eb451f8-d612-468e-9514-80d063ea89e6" (UID: "1eb451f8-d612-468e-9514-80d063ea89e6"). InnerVolumeSpecName "kube-api-access-472j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.912657 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-scripts" (OuterVolumeSpecName: "scripts") pod "1eb451f8-d612-468e-9514-80d063ea89e6" (UID: "1eb451f8-d612-468e-9514-80d063ea89e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.944204 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-config-data" (OuterVolumeSpecName: "config-data") pod "1eb451f8-d612-468e-9514-80d063ea89e6" (UID: "1eb451f8-d612-468e-9514-80d063ea89e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:29 crc kubenswrapper[4939]: I0318 16:03:29.947326 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eb451f8-d612-468e-9514-80d063ea89e6" (UID: "1eb451f8-d612-468e-9514-80d063ea89e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.008957 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.009050 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.009065 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb451f8-d612-468e-9514-80d063ea89e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.009077 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-472j7\" (UniqueName: \"kubernetes.io/projected/1eb451f8-d612-468e-9514-80d063ea89e6-kube-api-access-472j7\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.151466 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b" path="/var/lib/kubelet/pods/5d7ecf4b-34e3-466f-9de6-c218d8bc5b0b/volumes" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.424077 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerStarted","Data":"8f7b8af072015fc83aa897993785d0ba6ece0ebbc142762b52dcd98324367fa6"} Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.425789 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mghgn" event={"ID":"1eb451f8-d612-468e-9514-80d063ea89e6","Type":"ContainerDied","Data":"2a194918630ec8e71f482bac4582739d4afea03dda57f23eee82b16aa993fe40"} Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.425817 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a194918630ec8e71f482bac4582739d4afea03dda57f23eee82b16aa993fe40" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.425836 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mghgn" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.606693 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.606751 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.611792 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.635544 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.635793 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1f552e3e-ab72-491d-a1e3-e44e86a523be" containerName="nova-scheduler-scheduler" containerID="cri-o://bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" gracePeriod=30 Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.647243 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.647495 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-log" containerID="cri-o://d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746" gracePeriod=30 Mar 18 16:03:30 crc kubenswrapper[4939]: I0318 16:03:30.647580 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-metadata" containerID="cri-o://71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3" gracePeriod=30 Mar 18 16:03:31 crc kubenswrapper[4939]: E0318 16:03:31.229286 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:03:31 crc kubenswrapper[4939]: E0318 16:03:31.231005 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:03:31 crc kubenswrapper[4939]: E0318 16:03:31.232484 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:03:31 crc kubenswrapper[4939]: E0318 16:03:31.232590 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1f552e3e-ab72-491d-a1e3-e44e86a523be" containerName="nova-scheduler-scheduler" Mar 18 16:03:31 crc kubenswrapper[4939]: I0318 16:03:31.442986 4939 
generic.go:334] "Generic (PLEG): container finished" podID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerID="d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746" exitCode=143 Mar 18 16:03:31 crc kubenswrapper[4939]: I0318 16:03:31.443046 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2621205b-00d3-4e6b-a4fb-f9c87f14b438","Type":"ContainerDied","Data":"d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746"} Mar 18 16:03:31 crc kubenswrapper[4939]: I0318 16:03:31.450375 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerStarted","Data":"cefcf8484139882612cbf2acdccd98b003d26c82bc17372d2b00824a4b7ba550"} Mar 18 16:03:31 crc kubenswrapper[4939]: I0318 16:03:31.450531 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-log" containerID="cri-o://51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b" gracePeriod=30 Mar 18 16:03:31 crc kubenswrapper[4939]: I0318 16:03:31.450615 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-api" containerID="cri-o://6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7" gracePeriod=30 Mar 18 16:03:31 crc kubenswrapper[4939]: I0318 16:03:31.458099 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": EOF" Mar 18 16:03:31 crc kubenswrapper[4939]: I0318 16:03:31.458128 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": EOF" Mar 18 16:03:32 crc kubenswrapper[4939]: I0318 16:03:32.461271 4939 generic.go:334] "Generic (PLEG): container finished" podID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerID="51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b" exitCode=143 Mar 18 16:03:32 crc kubenswrapper[4939]: I0318 16:03:32.461328 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3410c1e-8e0a-4e7f-921b-439936fa77b8","Type":"ContainerDied","Data":"51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b"} Mar 18 16:03:32 crc kubenswrapper[4939]: I0318 16:03:32.463647 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerStarted","Data":"bcaf1ad5a30e3e16337fdbd45932541ca349149870b4da32747001a8591240e1"} Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.421231 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.488220 4939 generic.go:334] "Generic (PLEG): container finished" podID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerID="71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3" exitCode=0 Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.488746 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.488778 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2621205b-00d3-4e6b-a4fb-f9c87f14b438","Type":"ContainerDied","Data":"71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3"} Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.488850 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2621205b-00d3-4e6b-a4fb-f9c87f14b438","Type":"ContainerDied","Data":"0134adece0a074802cb77e324312d3e76e25d3f6c26aafac79b0adad9dd2bde7"} Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.488875 4939 scope.go:117] "RemoveContainer" containerID="71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.496359 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerStarted","Data":"f3079f7fe5149bacaed20469d6bc740ba824c2bdc67fa400c6cfad0cecdc52c1"} Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.496707 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.504480 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-nova-metadata-tls-certs\") pod \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.504613 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2621205b-00d3-4e6b-a4fb-f9c87f14b438-logs\") pod \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.504655 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlttg\" (UniqueName: \"kubernetes.io/projected/2621205b-00d3-4e6b-a4fb-f9c87f14b438-kube-api-access-jlttg\") pod \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.504740 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-config-data\") pod \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.504835 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-combined-ca-bundle\") pod \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\" (UID: \"2621205b-00d3-4e6b-a4fb-f9c87f14b438\") " Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.506077 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2621205b-00d3-4e6b-a4fb-f9c87f14b438-logs" (OuterVolumeSpecName: "logs") pod "2621205b-00d3-4e6b-a4fb-f9c87f14b438" (UID: "2621205b-00d3-4e6b-a4fb-f9c87f14b438"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.515136 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2621205b-00d3-4e6b-a4fb-f9c87f14b438-kube-api-access-jlttg" (OuterVolumeSpecName: "kube-api-access-jlttg") pod "2621205b-00d3-4e6b-a4fb-f9c87f14b438" (UID: "2621205b-00d3-4e6b-a4fb-f9c87f14b438"). InnerVolumeSpecName "kube-api-access-jlttg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.548927 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-config-data" (OuterVolumeSpecName: "config-data") pod "2621205b-00d3-4e6b-a4fb-f9c87f14b438" (UID: "2621205b-00d3-4e6b-a4fb-f9c87f14b438"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.550537 4939 scope.go:117] "RemoveContainer" containerID="d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.575750 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2621205b-00d3-4e6b-a4fb-f9c87f14b438" (UID: "2621205b-00d3-4e6b-a4fb-f9c87f14b438"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.595056 4939 scope.go:117] "RemoveContainer" containerID="71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3" Mar 18 16:03:34 crc kubenswrapper[4939]: E0318 16:03:34.600087 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3\": container with ID starting with 71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3 not found: ID does not exist" containerID="71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.600141 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3"} err="failed to get container status \"71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3\": rpc error: code = NotFound desc = could not find container \"71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3\": container with ID starting with 71782f0428304de6bb57d0c603f7d4793c5035ff920141154c6685e48d2533f3 not found: ID does not exist" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.600169 4939 scope.go:117] "RemoveContainer" containerID="d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746" Mar 18 16:03:34 crc kubenswrapper[4939]: E0318 16:03:34.600940 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746\": container with ID starting with d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746 not found: ID does not exist" containerID="d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.601006 4939 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746"} err="failed to get container status \"d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746\": rpc error: code = NotFound desc = could not find container \"d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746\": container with ID starting with d4af9b24a4e00abe53560ce6d58305bb26bf3dc667ba6fca6bf6485e0bb5d746 not found: ID does not exist" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.607658 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2621205b-00d3-4e6b-a4fb-f9c87f14b438-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.607694 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlttg\" (UniqueName: \"kubernetes.io/projected/2621205b-00d3-4e6b-a4fb-f9c87f14b438-kube-api-access-jlttg\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.607704 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.607713 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.608677 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2621205b-00d3-4e6b-a4fb-f9c87f14b438" (UID: "2621205b-00d3-4e6b-a4fb-f9c87f14b438"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.709449 4939 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2621205b-00d3-4e6b-a4fb-f9c87f14b438-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.817400 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.967857526 podStartE2EDuration="6.817384096s" podCreationTimestamp="2026-03-18 16:03:28 +0000 UTC" firstStartedPulling="2026-03-18 16:03:29.26826894 +0000 UTC m=+1573.867456561" lastFinishedPulling="2026-03-18 16:03:34.11779551 +0000 UTC m=+1578.716983131" observedRunningTime="2026-03-18 16:03:34.52722479 +0000 UTC m=+1579.126412411" watchObservedRunningTime="2026-03-18 16:03:34.817384096 +0000 UTC m=+1579.416571717" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.820346 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.835705 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.852747 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:03:34 crc kubenswrapper[4939]: E0318 16:03:34.853279 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb451f8-d612-468e-9514-80d063ea89e6" containerName="nova-manage" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.853302 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb451f8-d612-468e-9514-80d063ea89e6" containerName="nova-manage" Mar 18 16:03:34 crc kubenswrapper[4939]: E0318 16:03:34.853322 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-log" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.853332 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-log" Mar 18 16:03:34 crc kubenswrapper[4939]: E0318 16:03:34.853355 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-metadata" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.853363 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-metadata" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.853738 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-metadata" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.853764 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" containerName="nova-metadata-log" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.853778 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb451f8-d612-468e-9514-80d063ea89e6" containerName="nova-manage" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.855045 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.857300 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.859781 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.863709 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.912053 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.912094 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-logs\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.912153 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.912202 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-config-data\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:34 crc kubenswrapper[4939]: I0318 16:03:34.912339 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcz5b\" (UniqueName: \"kubernetes.io/projected/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-kube-api-access-gcz5b\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.014234 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.014281 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-logs\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.014332 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " 
pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.014380 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-config-data\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.014416 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcz5b\" (UniqueName: \"kubernetes.io/projected/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-kube-api-access-gcz5b\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.015056 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-logs\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.018254 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.019531 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.019592 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-config-data\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.029203 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcz5b\" (UniqueName: \"kubernetes.io/projected/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-kube-api-access-gcz5b\") pod \"nova-metadata-0\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.175460 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.504048 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.506955 4939 generic.go:334] "Generic (PLEG): container finished" podID="1f552e3e-ab72-491d-a1e3-e44e86a523be" containerID="bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" exitCode=0 Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.507017 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1f552e3e-ab72-491d-a1e3-e44e86a523be","Type":"ContainerDied","Data":"bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417"} Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.507041 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1f552e3e-ab72-491d-a1e3-e44e86a523be","Type":"ContainerDied","Data":"30d7e758bc8d1892b57f27014f7c02764b92e5e6f90afe5ff29eed8586800f8a"} Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.507056 4939 scope.go:117] "RemoveContainer" containerID="bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.507070 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.536104 4939 scope.go:117] "RemoveContainer" containerID="bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" Mar 18 16:03:35 crc kubenswrapper[4939]: E0318 16:03:35.536487 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417\": container with ID starting with bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417 not found: ID does not exist" containerID="bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.536579 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417"} err="failed to get container status \"bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417\": rpc error: code = NotFound desc = could not find container \"bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417\": container with ID starting with bf187f3cc5027b4911dfe221f226b830f9dd733a1c8b08fc47afbb8bcd763417 not found: ID does not exist" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.624582 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.626579 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-config-data\") pod \"1f552e3e-ab72-491d-a1e3-e44e86a523be\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.626728 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2jdc\" (UniqueName: \"kubernetes.io/projected/1f552e3e-ab72-491d-a1e3-e44e86a523be-kube-api-access-w2jdc\") pod \"1f552e3e-ab72-491d-a1e3-e44e86a523be\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.626873 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-combined-ca-bundle\") pod \"1f552e3e-ab72-491d-a1e3-e44e86a523be\" (UID: \"1f552e3e-ab72-491d-a1e3-e44e86a523be\") " Mar 18 16:03:35 crc kubenswrapper[4939]: W0318 16:03:35.629764 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71cb7a9_1ab5_4596_901f_314dcfae2bc4.slice/crio-e0ca74603485f889d9836ee5d92affacde08b2e5a2d3ff1f9faad774da6a5f5b WatchSource:0}: Error finding container e0ca74603485f889d9836ee5d92affacde08b2e5a2d3ff1f9faad774da6a5f5b: Status 404 returned error can't find the container with id e0ca74603485f889d9836ee5d92affacde08b2e5a2d3ff1f9faad774da6a5f5b Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.632948 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f552e3e-ab72-491d-a1e3-e44e86a523be-kube-api-access-w2jdc" (OuterVolumeSpecName: "kube-api-access-w2jdc") pod "1f552e3e-ab72-491d-a1e3-e44e86a523be" (UID: "1f552e3e-ab72-491d-a1e3-e44e86a523be"). InnerVolumeSpecName "kube-api-access-w2jdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.658338 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-config-data" (OuterVolumeSpecName: "config-data") pod "1f552e3e-ab72-491d-a1e3-e44e86a523be" (UID: "1f552e3e-ab72-491d-a1e3-e44e86a523be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.670975 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f552e3e-ab72-491d-a1e3-e44e86a523be" (UID: "1f552e3e-ab72-491d-a1e3-e44e86a523be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.729863 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.729906 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f552e3e-ab72-491d-a1e3-e44e86a523be-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.729921 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2jdc\" (UniqueName: \"kubernetes.io/projected/1f552e3e-ab72-491d-a1e3-e44e86a523be-kube-api-access-w2jdc\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.858533 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.867810 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.877340 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:03:35 crc kubenswrapper[4939]: E0318 16:03:35.877926 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f552e3e-ab72-491d-a1e3-e44e86a523be" containerName="nova-scheduler-scheduler" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.877957 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f552e3e-ab72-491d-a1e3-e44e86a523be" containerName="nova-scheduler-scheduler" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.878149 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f552e3e-ab72-491d-a1e3-e44e86a523be" containerName="nova-scheduler-scheduler" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.879260 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.881853 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.888557 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.937009 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-config-data\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.940740 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:35 crc kubenswrapper[4939]: I0318 16:03:35.940779 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ttw\" (UniqueName: \"kubernetes.io/projected/d5203f87-b63b-45f5-95e3-c536406909e5-kube-api-access-22ttw\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.042641 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.042712 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22ttw\" (UniqueName: \"kubernetes.io/projected/d5203f87-b63b-45f5-95e3-c536406909e5-kube-api-access-22ttw\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.042781 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-config-data\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.046345 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-config-data\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.046916 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.060234 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ttw\" (UniqueName: 
\"kubernetes.io/projected/d5203f87-b63b-45f5-95e3-c536406909e5-kube-api-access-22ttw\") pod \"nova-scheduler-0\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.151899 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f552e3e-ab72-491d-a1e3-e44e86a523be" path="/var/lib/kubelet/pods/1f552e3e-ab72-491d-a1e3-e44e86a523be/volumes" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.152468 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2621205b-00d3-4e6b-a4fb-f9c87f14b438" path="/var/lib/kubelet/pods/2621205b-00d3-4e6b-a4fb-f9c87f14b438/volumes" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.207993 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.520622 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e71cb7a9-1ab5-4596-901f-314dcfae2bc4","Type":"ContainerStarted","Data":"a06a089fed14337403eec84f45e3b078fba61e58eeaa6ebe0c2a5e8ebeea031d"} Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.520946 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e71cb7a9-1ab5-4596-901f-314dcfae2bc4","Type":"ContainerStarted","Data":"9935f89912bbc9ccc732826d2efbd0df5765843a5c1ca0671ee97d191011142c"} Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.520964 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e71cb7a9-1ab5-4596-901f-314dcfae2bc4","Type":"ContainerStarted","Data":"e0ca74603485f889d9836ee5d92affacde08b2e5a2d3ff1f9faad774da6a5f5b"} Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.544666 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.544479049 podStartE2EDuration="2.544479049s" podCreationTimestamp="2026-03-18 16:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:36.536327347 +0000 UTC m=+1581.135514968" watchObservedRunningTime="2026-03-18 16:03:36.544479049 +0000 UTC m=+1581.143666670" Mar 18 16:03:36 crc kubenswrapper[4939]: I0318 16:03:36.683547 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:03:36 crc kubenswrapper[4939]: W0318 16:03:36.686418 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5203f87_b63b_45f5_95e3_c536406909e5.slice/crio-f1f40ddab2f339abefe47bad5936e2cbdc133521e8c923299b22d6ab90193ebc WatchSource:0}: Error finding container f1f40ddab2f339abefe47bad5936e2cbdc133521e8c923299b22d6ab90193ebc: Status 404 returned error can't find the container with id f1f40ddab2f339abefe47bad5936e2cbdc133521e8c923299b22d6ab90193ebc Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.257530 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.366980 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3410c1e-8e0a-4e7f-921b-439936fa77b8-logs\") pod \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.367039 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-public-tls-certs\") pod \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.367098 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-combined-ca-bundle\") pod \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.367117 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-internal-tls-certs\") pod \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.367193 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-config-data\") pod \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.367326 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhnc5\" (UniqueName: \"kubernetes.io/projected/c3410c1e-8e0a-4e7f-921b-439936fa77b8-kube-api-access-vhnc5\") pod \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\" (UID: \"c3410c1e-8e0a-4e7f-921b-439936fa77b8\") " Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.368629 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3410c1e-8e0a-4e7f-921b-439936fa77b8-logs" (OuterVolumeSpecName: "logs") pod "c3410c1e-8e0a-4e7f-921b-439936fa77b8" (UID: "c3410c1e-8e0a-4e7f-921b-439936fa77b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.371945 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3410c1e-8e0a-4e7f-921b-439936fa77b8-kube-api-access-vhnc5" (OuterVolumeSpecName: "kube-api-access-vhnc5") pod "c3410c1e-8e0a-4e7f-921b-439936fa77b8" (UID: "c3410c1e-8e0a-4e7f-921b-439936fa77b8"). InnerVolumeSpecName "kube-api-access-vhnc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.397200 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-config-data" (OuterVolumeSpecName: "config-data") pod "c3410c1e-8e0a-4e7f-921b-439936fa77b8" (UID: "c3410c1e-8e0a-4e7f-921b-439936fa77b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.401564 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3410c1e-8e0a-4e7f-921b-439936fa77b8" (UID: "c3410c1e-8e0a-4e7f-921b-439936fa77b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.417333 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c3410c1e-8e0a-4e7f-921b-439936fa77b8" (UID: "c3410c1e-8e0a-4e7f-921b-439936fa77b8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.420267 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c3410c1e-8e0a-4e7f-921b-439936fa77b8" (UID: "c3410c1e-8e0a-4e7f-921b-439936fa77b8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.469348 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.469382 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.469394 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.469406 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhnc5\" (UniqueName: \"kubernetes.io/projected/c3410c1e-8e0a-4e7f-921b-439936fa77b8-kube-api-access-vhnc5\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.469418 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3410c1e-8e0a-4e7f-921b-439936fa77b8-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.469427 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3410c1e-8e0a-4e7f-921b-439936fa77b8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.530339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5203f87-b63b-45f5-95e3-c536406909e5","Type":"ContainerStarted","Data":"3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b"} Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.530391 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"d5203f87-b63b-45f5-95e3-c536406909e5","Type":"ContainerStarted","Data":"f1f40ddab2f339abefe47bad5936e2cbdc133521e8c923299b22d6ab90193ebc"} Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.532402 4939 generic.go:334] "Generic (PLEG): container finished" podID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerID="6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7" exitCode=0 Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.532455 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.532513 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3410c1e-8e0a-4e7f-921b-439936fa77b8","Type":"ContainerDied","Data":"6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7"} Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.532540 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3410c1e-8e0a-4e7f-921b-439936fa77b8","Type":"ContainerDied","Data":"0eea09975d527be3af4cd123ea13b81f3873fe763fe50f9afd92866c27560aa0"} Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.532558 4939 scope.go:117] "RemoveContainer" containerID="6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.551776 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.55175246 podStartE2EDuration="2.55175246s" podCreationTimestamp="2026-03-18 16:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:37.548544429 +0000 UTC m=+1582.147732060" watchObservedRunningTime="2026-03-18 16:03:37.55175246 +0000 UTC m=+1582.150940081" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.568290 4939 scope.go:117] "RemoveContainer" containerID="51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.584720 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.601414 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.605062 4939 scope.go:117] "RemoveContainer" containerID="6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7" Mar 18 16:03:37 crc kubenswrapper[4939]: E0318 16:03:37.605498 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7\": container with ID starting with 6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7 not found: ID does not exist" containerID="6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.605549 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7"} err="failed to get container status \"6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7\": rpc error: code = NotFound desc = could not find container \"6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7\": container with ID starting with 
6fce77401798e49094b73135888e61c38e294b0eb944c42402436dfc249684b7 not found: ID does not exist" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.605576 4939 scope.go:117] "RemoveContainer" containerID="51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b" Mar 18 16:03:37 crc kubenswrapper[4939]: E0318 16:03:37.605978 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b\": container with ID starting with 51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b not found: ID does not exist" containerID="51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.606024 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b"} err="failed to get container status \"51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b\": rpc error: code = NotFound desc = could not find container \"51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b\": container with ID starting with 51283a4d0e772adb83616d36630786687ef7a6fab2ac44bf8ceee434298d0b9b not found: ID does not exist" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.622487 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:37 crc kubenswrapper[4939]: E0318 16:03:37.623029 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-log" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.623052 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-log" Mar 18 16:03:37 crc kubenswrapper[4939]: E0318 16:03:37.623083 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-api" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.623092 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-api" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.623352 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-log" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.623379 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" containerName="nova-api-api" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.624515 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.628188 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.628303 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.628423 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.633793 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.673208 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.673305 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr7sq\" (UniqueName: \"kubernetes.io/projected/a2d02491-90d4-41b4-884d-0959feb366b0-kube-api-access-wr7sq\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.673395 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d02491-90d4-41b4-884d-0959feb366b0-logs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.673476 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.673528 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-config-data\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.673573 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.775082 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.775169 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.775263 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr7sq\" (UniqueName: \"kubernetes.io/projected/a2d02491-90d4-41b4-884d-0959feb366b0-kube-api-access-wr7sq\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.775289 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d02491-90d4-41b4-884d-0959feb366b0-logs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.775380 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.775421 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-config-data\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.776845 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d02491-90d4-41b4-884d-0959feb366b0-logs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.780145 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.780450 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.780937 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.781235 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-config-data\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.792717 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr7sq\" (UniqueName: \"kubernetes.io/projected/a2d02491-90d4-41b4-884d-0959feb366b0-kube-api-access-wr7sq\") pod \"nova-api-0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " 
pod="openstack/nova-api-0" Mar 18 16:03:37 crc kubenswrapper[4939]: I0318 16:03:37.948079 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:03:38 crc kubenswrapper[4939]: I0318 16:03:38.144353 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3410c1e-8e0a-4e7f-921b-439936fa77b8" path="/var/lib/kubelet/pods/c3410c1e-8e0a-4e7f-921b-439936fa77b8/volumes" Mar 18 16:03:38 crc kubenswrapper[4939]: I0318 16:03:38.431118 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:03:38 crc kubenswrapper[4939]: I0318 16:03:38.547784 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2d02491-90d4-41b4-884d-0959feb366b0","Type":"ContainerStarted","Data":"c30659c15496ac9423c4d42b78ceb86e966224e3c7aeda97df07bde6a2936fd1"} Mar 18 16:03:39 crc kubenswrapper[4939]: I0318 16:03:39.562436 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2d02491-90d4-41b4-884d-0959feb366b0","Type":"ContainerStarted","Data":"141032193e0acc0c48dcf5467fc8003161e9b27e1dbd08d3df9c676f4ba3145c"} Mar 18 16:03:39 crc kubenswrapper[4939]: I0318 16:03:39.562873 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2d02491-90d4-41b4-884d-0959feb366b0","Type":"ContainerStarted","Data":"4607f5748ea35230c9f8cb3745bf14422c3e81e916ca3f8b80878128c7ed0bca"} Mar 18 16:03:39 crc kubenswrapper[4939]: I0318 16:03:39.596450 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.59642891 podStartE2EDuration="2.59642891s" podCreationTimestamp="2026-03-18 16:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:39.593803325 +0000 UTC m=+1584.192990986" watchObservedRunningTime="2026-03-18 16:03:39.59642891 +0000 UTC m=+1584.195616531" Mar 18 16:03:40 crc kubenswrapper[4939]: E0318 16:03:40.129140 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a123209_990a_4fcc_a66a_967ab7007653.slice/crio-95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67\": RecentStats: unable to find data in memory cache]" Mar 18 16:03:41 crc kubenswrapper[4939]: I0318 16:03:41.208950 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 16:03:45 crc kubenswrapper[4939]: I0318 16:03:45.176234 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:03:45 crc kubenswrapper[4939]: I0318 16:03:45.177141 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:03:46 crc kubenswrapper[4939]: I0318 16:03:46.192723 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:03:46 crc kubenswrapper[4939]: I0318 16:03:46.192780 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:03:46 crc kubenswrapper[4939]: I0318 16:03:46.208496 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 16:03:46 crc kubenswrapper[4939]: I0318 16:03:46.277637 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 16:03:46 crc kubenswrapper[4939]: I0318 16:03:46.686971 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 16:03:47 crc kubenswrapper[4939]: I0318 16:03:47.949371 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:03:47 crc kubenswrapper[4939]: I0318 16:03:47.949664 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:03:48 crc kubenswrapper[4939]: I0318 16:03:48.959642 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:03:48 crc kubenswrapper[4939]: I0318 16:03:48.959678 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:03:50 crc kubenswrapper[4939]: E0318 16:03:50.366695 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a123209_990a_4fcc_a66a_967ab7007653.slice/crio-95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67\": RecentStats: unable to find data in memory cache]" Mar 18 16:03:53 crc kubenswrapper[4939]: I0318 16:03:53.175629 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:03:53 crc kubenswrapper[4939]: I0318 16:03:53.175925 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:03:55 crc kubenswrapper[4939]: I0318 16:03:55.181804 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:03:55 crc kubenswrapper[4939]: I0318 16:03:55.184078 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:03:55 crc kubenswrapper[4939]: I0318 16:03:55.188765 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:03:55 crc kubenswrapper[4939]: I0318 16:03:55.749411 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:03:55 crc kubenswrapper[4939]: I0318 16:03:55.948895 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:03:55 crc kubenswrapper[4939]: I0318 16:03:55.948955 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:03:57 crc kubenswrapper[4939]: I0318 16:03:57.957424 4939 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:03:57 crc kubenswrapper[4939]: I0318 16:03:57.958159 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:03:57 crc kubenswrapper[4939]: I0318 16:03:57.968287 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:03:58 crc kubenswrapper[4939]: I0318 16:03:58.779209 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:03:58 crc kubenswrapper[4939]: I0318 16:03:58.842460 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.149701 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564164-8ldlb"] Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.151292 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-8ldlb" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.153318 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.153895 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.154066 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.164902 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-8ldlb"] Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.252888 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk4jb\" (UniqueName: \"kubernetes.io/projected/c790a937-6999-4aef-adb0-d7ff057c7e03-kube-api-access-lk4jb\") pod \"auto-csr-approver-29564164-8ldlb\" (UID: \"c790a937-6999-4aef-adb0-d7ff057c7e03\") " pod="openshift-infra/auto-csr-approver-29564164-8ldlb" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.354986 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk4jb\" (UniqueName: \"kubernetes.io/projected/c790a937-6999-4aef-adb0-d7ff057c7e03-kube-api-access-lk4jb\") pod \"auto-csr-approver-29564164-8ldlb\" (UID: \"c790a937-6999-4aef-adb0-d7ff057c7e03\") " pod="openshift-infra/auto-csr-approver-29564164-8ldlb" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.376969 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk4jb\" (UniqueName: \"kubernetes.io/projected/c790a937-6999-4aef-adb0-d7ff057c7e03-kube-api-access-lk4jb\") pod \"auto-csr-approver-29564164-8ldlb\" (UID: \"c790a937-6999-4aef-adb0-d7ff057c7e03\") " pod="openshift-infra/auto-csr-approver-29564164-8ldlb" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.469736 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-8ldlb" Mar 18 16:04:00 crc kubenswrapper[4939]: E0318 16:04:00.641464 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a123209_990a_4fcc_a66a_967ab7007653.slice/crio-95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67\": RecentStats: unable to find data in memory cache]" Mar 18 16:04:00 crc kubenswrapper[4939]: I0318 16:04:00.976341 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-8ldlb"] Mar 18 16:04:00 crc kubenswrapper[4939]: W0318 16:04:00.983483 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc790a937_6999_4aef_adb0_d7ff057c7e03.slice/crio-6e8d9e5829cd13980df03c6febd1ce06c08b34fbfe4ef30d3310ee9849c6294b WatchSource:0}: Error finding container 6e8d9e5829cd13980df03c6febd1ce06c08b34fbfe4ef30d3310ee9849c6294b: Status 404 returned error can't find the container with id 6e8d9e5829cd13980df03c6febd1ce06c08b34fbfe4ef30d3310ee9849c6294b Mar 18 16:04:01 crc kubenswrapper[4939]: I0318 16:04:01.799927 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-8ldlb" event={"ID":"c790a937-6999-4aef-adb0-d7ff057c7e03","Type":"ContainerStarted","Data":"6e8d9e5829cd13980df03c6febd1ce06c08b34fbfe4ef30d3310ee9849c6294b"} Mar 18 16:04:02 crc kubenswrapper[4939]: I0318 16:04:02.810519 4939 generic.go:334] "Generic (PLEG): container finished" podID="c790a937-6999-4aef-adb0-d7ff057c7e03" containerID="3d6d83ff55d826e4e02a854aa36ac39355d6aaebfc2ea2f89b42123e1185987e" exitCode=0 Mar 18 16:04:02 crc kubenswrapper[4939]: I0318 16:04:02.810765 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-8ldlb" event={"ID":"c790a937-6999-4aef-adb0-d7ff057c7e03","Type":"ContainerDied","Data":"3d6d83ff55d826e4e02a854aa36ac39355d6aaebfc2ea2f89b42123e1185987e"} Mar 18 16:04:04 crc kubenswrapper[4939]: I0318 16:04:04.170586 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-8ldlb" Mar 18 16:04:04 crc kubenswrapper[4939]: I0318 16:04:04.230155 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk4jb\" (UniqueName: \"kubernetes.io/projected/c790a937-6999-4aef-adb0-d7ff057c7e03-kube-api-access-lk4jb\") pod \"c790a937-6999-4aef-adb0-d7ff057c7e03\" (UID: \"c790a937-6999-4aef-adb0-d7ff057c7e03\") " Mar 18 16:04:04 crc kubenswrapper[4939]: I0318 16:04:04.236028 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c790a937-6999-4aef-adb0-d7ff057c7e03-kube-api-access-lk4jb" (OuterVolumeSpecName: "kube-api-access-lk4jb") pod "c790a937-6999-4aef-adb0-d7ff057c7e03" (UID: "c790a937-6999-4aef-adb0-d7ff057c7e03"). InnerVolumeSpecName "kube-api-access-lk4jb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:04 crc kubenswrapper[4939]: I0318 16:04:04.332876 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk4jb\" (UniqueName: \"kubernetes.io/projected/c790a937-6999-4aef-adb0-d7ff057c7e03-kube-api-access-lk4jb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:04 crc kubenswrapper[4939]: I0318 16:04:04.842560 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-8ldlb" event={"ID":"c790a937-6999-4aef-adb0-d7ff057c7e03","Type":"ContainerDied","Data":"6e8d9e5829cd13980df03c6febd1ce06c08b34fbfe4ef30d3310ee9849c6294b"} Mar 18 16:04:04 crc kubenswrapper[4939]: I0318 16:04:04.842599 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8d9e5829cd13980df03c6febd1ce06c08b34fbfe4ef30d3310ee9849c6294b" Mar 18 16:04:04 crc kubenswrapper[4939]: I0318 16:04:04.842610 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-8ldlb" Mar 18 16:04:05 crc kubenswrapper[4939]: I0318 16:04:05.242656 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-gb9ms"] Mar 18 16:04:05 crc kubenswrapper[4939]: I0318 16:04:05.254028 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-gb9ms"] Mar 18 16:04:06 crc kubenswrapper[4939]: I0318 16:04:06.147477 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2982d26-e360-4ad8-a88f-46fd4f87b1eb" path="/var/lib/kubelet/pods/a2982d26-e360-4ad8-a88f-46fd4f87b1eb/volumes" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.062251 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-prjpr"] Mar 18 16:04:07 crc kubenswrapper[4939]: E0318 16:04:07.062870 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c790a937-6999-4aef-adb0-d7ff057c7e03" containerName="oc" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.062888 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c790a937-6999-4aef-adb0-d7ff057c7e03" containerName="oc" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.063143 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c790a937-6999-4aef-adb0-d7ff057c7e03" containerName="oc" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.064900 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.087447 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prjpr"] Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.088207 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6dv\" (UniqueName: \"kubernetes.io/projected/cefea5f4-f44f-4e73-b762-38adc00ce1eb-kube-api-access-bc6dv\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.088240 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-utilities\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.088303 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.189807 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6dv\" (UniqueName: \"kubernetes.io/projected/cefea5f4-f44f-4e73-b762-38adc00ce1eb-kube-api-access-bc6dv\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.189885 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-utilities\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.189981 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.191043 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-utilities\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.191274 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.210031 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bc6dv\" (UniqueName: \"kubernetes.io/projected/cefea5f4-f44f-4e73-b762-38adc00ce1eb-kube-api-access-bc6dv\") pod \"certified-operators-prjpr\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.387935 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:07 crc kubenswrapper[4939]: I0318 16:04:07.898128 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prjpr"] Mar 18 16:04:07 crc kubenswrapper[4939]: W0318 16:04:07.902148 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcefea5f4_f44f_4e73_b762_38adc00ce1eb.slice/crio-088a7d35d6356d60ae85c5d94d6c33f63fb1cd99bec749f0c0460f506cf6461d WatchSource:0}: Error finding container 088a7d35d6356d60ae85c5d94d6c33f63fb1cd99bec749f0c0460f506cf6461d: Status 404 returned error can't find the container with id 088a7d35d6356d60ae85c5d94d6c33f63fb1cd99bec749f0c0460f506cf6461d Mar 18 16:04:08 crc kubenswrapper[4939]: I0318 16:04:08.885368 4939 generic.go:334] "Generic (PLEG): container finished" podID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerID="91ef614c09ef3a14bffbbfbb3efbb00ccbda43fbf6cdd41c6ff512bbb4203e93" exitCode=0 Mar 18 16:04:08 crc kubenswrapper[4939]: I0318 16:04:08.885413 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prjpr" event={"ID":"cefea5f4-f44f-4e73-b762-38adc00ce1eb","Type":"ContainerDied","Data":"91ef614c09ef3a14bffbbfbb3efbb00ccbda43fbf6cdd41c6ff512bbb4203e93"} Mar 18 16:04:08 crc kubenswrapper[4939]: I0318 16:04:08.885655 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prjpr" event={"ID":"cefea5f4-f44f-4e73-b762-38adc00ce1eb","Type":"ContainerStarted","Data":"088a7d35d6356d60ae85c5d94d6c33f63fb1cd99bec749f0c0460f506cf6461d"} Mar 18 16:04:09 crc kubenswrapper[4939]: I0318 16:04:09.894747 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prjpr" event={"ID":"cefea5f4-f44f-4e73-b762-38adc00ce1eb","Type":"ContainerStarted","Data":"f1400e5882a916afb311d9b971d5a0db2f34c1f124fd886dfbc59239d21493f5"} Mar 18 16:04:10 crc kubenswrapper[4939]: E0318 16:04:10.871212 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a123209_990a_4fcc_a66a_967ab7007653.slice/crio-95ca108c2b5351191081daeead44ad9905ba0e68191e774c73faaa3584cd5e67\": RecentStats: unable to find data in memory cache]" Mar 18 16:04:10 crc kubenswrapper[4939]: I0318 16:04:10.907752 4939 generic.go:334] "Generic (PLEG): container finished" podID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerID="f1400e5882a916afb311d9b971d5a0db2f34c1f124fd886dfbc59239d21493f5" exitCode=0 Mar 18 16:04:10 crc kubenswrapper[4939]: I0318 16:04:10.907799 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prjpr" event={"ID":"cefea5f4-f44f-4e73-b762-38adc00ce1eb","Type":"ContainerDied","Data":"f1400e5882a916afb311d9b971d5a0db2f34c1f124fd886dfbc59239d21493f5"} Mar 18 16:04:11 crc kubenswrapper[4939]: I0318 16:04:11.917459 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-prjpr" event={"ID":"cefea5f4-f44f-4e73-b762-38adc00ce1eb","Type":"ContainerStarted","Data":"006bb20fbcdd84b51be7b526c598c7ecd52ed39ad5a155207701e52879954474"} Mar 18 16:04:11 crc kubenswrapper[4939]: I0318 16:04:11.941537 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-prjpr" podStartSLOduration=2.506242091 podStartE2EDuration="4.941491675s" podCreationTimestamp="2026-03-18 16:04:07 +0000 UTC" firstStartedPulling="2026-03-18 16:04:08.890975676 +0000 UTC m=+1613.490163297" lastFinishedPulling="2026-03-18 16:04:11.32622527 +0000 UTC m=+1615.925412881" observedRunningTime="2026-03-18 16:04:11.931224793 +0000 UTC m=+1616.530412424" watchObservedRunningTime="2026-03-18 16:04:11.941491675 +0000 UTC m=+1616.540679296" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.023775 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.024900 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="08510a7a-ad57-44a4-9089-7558c213284b" containerName="openstackclient" containerID="cri-o://3e958f60a62d00010cc8b06118e35abed1735c3a72b9d800113fbbaacdb1ee62" gracePeriod=2 Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.103650 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.133220 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-409c-account-create-update-m2tw5"] Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.133643 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08510a7a-ad57-44a4-9089-7558c213284b" containerName="openstackclient" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.133655 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="08510a7a-ad57-44a4-9089-7558c213284b" containerName="openstackclient" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.133852 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="08510a7a-ad57-44a4-9089-7558c213284b" containerName="openstackclient" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.134443 4939 util.go:30] "No sandbox for pod can be found. 
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.153293 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.155658 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b48752-eea3-4627-8da2-737f8bd7b36a-operator-scripts\") pod \"nova-api-409c-account-create-update-m2tw5\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " pod="openstack/nova-api-409c-account-create-update-m2tw5"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.155925 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78tlr\" (UniqueName: \"kubernetes.io/projected/77b48752-eea3-4627-8da2-737f8bd7b36a-kube-api-access-78tlr\") pod \"nova-api-409c-account-create-update-m2tw5\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " pod="openstack/nova-api-409c-account-create-update-m2tw5"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.191864 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-409c-account-create-update-qhm94"]
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.218463 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-409c-account-create-update-qhm94"]
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.237443 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-409c-account-create-update-m2tw5"]
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.251853 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8kz2r"]
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.267017 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8kz2r"]
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.268756 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78tlr\" (UniqueName: \"kubernetes.io/projected/77b48752-eea3-4627-8da2-737f8bd7b36a-kube-api-access-78tlr\") pod \"nova-api-409c-account-create-update-m2tw5\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " pod="openstack/nova-api-409c-account-create-update-m2tw5"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.268992 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b48752-eea3-4627-8da2-737f8bd7b36a-operator-scripts\") pod \"nova-api-409c-account-create-update-m2tw5\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " pod="openstack/nova-api-409c-account-create-update-m2tw5"
Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.272455 4939 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found
Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.272594 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:17.772559844 +0000 UTC m=+1622.371747635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-config-data" not found
Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.273209 4939 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found
Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.273253 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:17.773240183 +0000 UTC m=+1622.372427994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-scripts" not found
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.274982 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b48752-eea3-4627-8da2-737f8bd7b36a-operator-scripts\") pod \"nova-api-409c-account-create-update-m2tw5\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " pod="openstack/nova-api-409c-account-create-update-m2tw5"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.298656 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ljqrx"]
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.300328 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ljqrx"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.306310 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.314316 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78tlr\" (UniqueName: \"kubernetes.io/projected/77b48752-eea3-4627-8da2-737f8bd7b36a-kube-api-access-78tlr\") pod \"nova-api-409c-account-create-update-m2tw5\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " pod="openstack/nova-api-409c-account-create-update-m2tw5"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.330757 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ljqrx"]
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.370877 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts\") pod \"root-account-create-update-ljqrx\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " pod="openstack/root-account-create-update-ljqrx"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.371689 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmb9s\" (UniqueName: \"kubernetes.io/projected/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-kube-api-access-mmb9s\") pod \"root-account-create-update-ljqrx\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " pod="openstack/root-account-create-update-ljqrx"
Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.392935 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-prjpr"
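[editor's note: the cinder-api-0 mount failures above are retried with kubelet's per-volume exponential backoff; durationBeforeRetry doubles from 500ms to 1s to 2s in the repeats further down, and the mounts recover on their own once the missing Secret or ConfigMap is created (here the OpenStack operators were evidently still reconciling them). Rather than re-reading the kubelet log, you can check for the objects directly with standard oc commands; object names are taken from the log:]

    oc -n openstack get secret cinder-config-data cinder-scripts
    oc -n openstack get configmap rabbitmq-cell1-config-data
    # Or block until one shows up:
    oc -n openstack get secret cinder-config-data --watch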
pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.393055 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.395809 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.396359 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="ovn-northd" containerID="cri-o://0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" gracePeriod=30 Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.397255 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="openstack-network-exporter" containerID="cri-o://c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e" gracePeriod=30 Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.477802 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts\") pod \"root-account-create-update-ljqrx\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " pod="openstack/root-account-create-update-ljqrx" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.478021 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmb9s\" (UniqueName: \"kubernetes.io/projected/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-kube-api-access-mmb9s\") pod \"root-account-create-update-ljqrx\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " pod="openstack/root-account-create-update-ljqrx" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.479303 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts\") pod \"root-account-create-update-ljqrx\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " pod="openstack/root-account-create-update-ljqrx" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.499725 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-409c-account-create-update-m2tw5" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.525812 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.542227 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmb9s\" (UniqueName: \"kubernetes.io/projected/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-kube-api-access-mmb9s\") pod \"root-account-create-update-ljqrx\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " pod="openstack/root-account-create-update-ljqrx" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.550203 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-aa27-account-create-update-m8m54"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.552386 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.572756 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.685915 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4z5\" (UniqueName: \"kubernetes.io/projected/409450a4-dba2-432c-8c4b-0cc14057937d-kube-api-access-sg4z5\") pod \"placement-aa27-account-create-update-m8m54\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.686313 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/409450a4-dba2-432c-8c4b-0cc14057937d-operator-scripts\") pod \"placement-aa27-account-create-update-m8m54\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.688612 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.729779 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-aa27-account-create-update-zrtw8"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.742418 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ljqrx" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.787000 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ac46-account-create-update-fqfvp"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.789247 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.799122 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.802029 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4z5\" (UniqueName: \"kubernetes.io/projected/409450a4-dba2-432c-8c4b-0cc14057937d-kube-api-access-sg4z5\") pod \"placement-aa27-account-create-update-m8m54\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.802291 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/409450a4-dba2-432c-8c4b-0cc14057937d-operator-scripts\") pod \"placement-aa27-account-create-update-m8m54\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.808518 4939 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.808614 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:04:18.808583409 +0000 UTC m=+1623.407771030 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-scripts" not found Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.809941 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/409450a4-dba2-432c-8c4b-0cc14057937d-operator-scripts\") pod \"placement-aa27-account-create-update-m8m54\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.810413 4939 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.810455 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:18.810443052 +0000 UTC m=+1623.409630663 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-config-data" not found Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.810487 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:17 crc kubenswrapper[4939]: E0318 16:04:17.810520 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data podName:d850ac81-a29e-4e93-9fab-72b6325de52e nodeName:}" failed. No retries permitted until 2026-03-18 16:04:18.310514654 +0000 UTC m=+1622.909702275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data") pod "rabbitmq-cell1-server-0" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e") : configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.835239 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-aa27-account-create-update-zrtw8"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.864733 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg4z5\" (UniqueName: \"kubernetes.io/projected/409450a4-dba2-432c-8c4b-0cc14057937d-kube-api-access-sg4z5\") pod \"placement-aa27-account-create-update-m8m54\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.922490 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-ztzzs"] Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.924197 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.934804 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 16:04:17 crc kubenswrapper[4939]: I0318 16:04:17.979067 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-aa27-account-create-update-m8m54"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.009859 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-operator-scripts\") pod \"neutron-ac46-account-create-update-fqfvp\" (UID: \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.009941 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trd4x\" (UniqueName: \"kubernetes.io/projected/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-kube-api-access-trd4x\") pod \"neutron-ac46-account-create-update-fqfvp\" (UID: \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.074614 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.116722 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9zwm\" (UniqueName: \"kubernetes.io/projected/abc847c2-3903-44d4-aa4d-0a7e16709041-kube-api-access-m9zwm\") pod \"nova-cell0-8faa-account-create-update-ztzzs\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.116855 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abc847c2-3903-44d4-aa4d-0a7e16709041-operator-scripts\") pod \"nova-cell0-8faa-account-create-update-ztzzs\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.116930 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-operator-scripts\") pod \"neutron-ac46-account-create-update-fqfvp\" (UID: \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.117000 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trd4x\" (UniqueName: \"kubernetes.io/projected/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-kube-api-access-trd4x\") pod \"neutron-ac46-account-create-update-fqfvp\" (UID: \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.118125 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-operator-scripts\") pod \"neutron-ac46-account-create-update-fqfvp\" (UID: 
\"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.222200 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9zwm\" (UniqueName: \"kubernetes.io/projected/abc847c2-3903-44d4-aa4d-0a7e16709041-kube-api-access-m9zwm\") pod \"nova-cell0-8faa-account-create-update-ztzzs\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.225325 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abc847c2-3903-44d4-aa4d-0a7e16709041-operator-scripts\") pod \"nova-cell0-8faa-account-create-update-ztzzs\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.228996 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trd4x\" (UniqueName: \"kubernetes.io/projected/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-kube-api-access-trd4x\") pod \"neutron-ac46-account-create-update-fqfvp\" (UID: \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.235567 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abc847c2-3903-44d4-aa4d-0a7e16709041-operator-scripts\") pod \"nova-cell0-8faa-account-create-update-ztzzs\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.288204 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9zwm\" (UniqueName: \"kubernetes.io/projected/abc847c2-3903-44d4-aa4d-0a7e16709041-kube-api-access-m9zwm\") pod \"nova-cell0-8faa-account-create-update-ztzzs\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:18 crc kubenswrapper[4939]: E0318 16:04:18.352802 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:18 crc kubenswrapper[4939]: E0318 16:04:18.352867 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data podName:d850ac81-a29e-4e93-9fab-72b6325de52e nodeName:}" failed. No retries permitted until 2026-03-18 16:04:19.352843456 +0000 UTC m=+1623.952031077 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data") pod "rabbitmq-cell1-server-0" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e") : configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.559488 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.595337 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ff3991-7d14-48ca-ae7b-049eba4736e4" path="/var/lib/kubelet/pods/21ff3991-7d14-48ca-ae7b-049eba4736e4/volumes" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.602952 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.678649 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706a1ac1-f38c-4710-a889-b2799e52a652" path="/var/lib/kubelet/pods/706a1ac1-f38c-4710-a889-b2799e52a652/volumes" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.714772 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d301c5c5-c1cf-4037-aafb-612fbbe133f7" path="/var/lib/kubelet/pods/d301c5c5-c1cf-4037-aafb-612fbbe133f7/volumes" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.715611 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.715637 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-ztzzs"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.715659 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ac46-account-create-update-fqfvp"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.715675 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.715693 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-86d4-account-create-update-795z4"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.720712 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="openstack-network-exporter" containerID="cri-o://4e455ae5d5a3238bfc635b57f221285060c55f8f3b0f69f228d7592bc6b0442d" gracePeriod=300 Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.738291 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86d4-account-create-update-795z4"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.738361 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-b75kg"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.738383 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-b75kg"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.738402 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-h5sth"] Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.738570 4939 util.go:30] "No sandbox for pod can be found. 
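[editor's note: two very different grace periods appear in this window: openstackclient was killed with gracePeriod=2, while the OVN/OVS database server containers get gracePeriod=300. The kubelet takes the value from each pod's spec (terminationGracePeriodSeconds) or from an override on the DELETE request; it is not a kubelet default. A sketch for confirming where a given number originates, assuming the pod still exists when you ask:]

    oc -n openstack get pod ovsdbserver-sb-0 -o jsonpath='{.spec.terminationGracePeriodSeconds}{"\n"}'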
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.742329 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-h5sth"]
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.742403 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ac46-account-create-update-wf798"]
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.742494 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-h5sth"
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.752305 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ac46-account-create-update-wf798"]
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.756105 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.756453 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.781268 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-94b1-account-create-update-pv4z6"]
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.810856 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-94b1-account-create-update-pv4z6"]
Mar 18 16:04:18 crc kubenswrapper[4939]: E0318 16:04:18.815419 4939 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found
Mar 18 16:04:18 crc kubenswrapper[4939]: E0318 16:04:18.815519 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:20.815485499 +0000 UTC m=+1625.414673120 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-config-data" not found
Mar 18 16:04:18 crc kubenswrapper[4939]: E0318 16:04:18.815866 4939 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found
Mar 18 16:04:18 crc kubenswrapper[4939]: E0318 16:04:18.815957 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:20.815937851 +0000 UTC m=+1625.415125472 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-scripts" not found
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.898529 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.899494 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="openstack-network-exporter" containerID="cri-o://16ff00962bb2fdbc01658731042bb863bfb945a5da3765b64fe956c1721303df" gracePeriod=300
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.919723 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lkr5\" (UniqueName: \"kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth"
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.919777 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbb4\" (UniqueName: \"kubernetes.io/projected/73635e3b-60a5-46e9-bae0-caf61d8c9e74-kube-api-access-hpbb4\") pod \"barbican-86d4-account-create-update-795z4\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " pod="openstack/barbican-86d4-account-create-update-795z4"
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.919835 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth"
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.920685 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73635e3b-60a5-46e9-bae0-caf61d8c9e74-operator-scripts\") pod \"barbican-86d4-account-create-update-795z4\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " pod="openstack/barbican-86d4-account-create-update-795z4"
Mar 18 16:04:18 crc kubenswrapper[4939]: I0318 16:04:18.967163 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ms4t8"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.001627 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ms4t8"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.002993 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="ovsdbserver-sb" containerID="cri-o://42c99df8c4f51d15393e20d51c9feb5cd6360994ff5aa297b56c8b97ec9c449e" gracePeriod=300
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.016492 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="ovsdbserver-nb" containerID="cri-o://9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d" gracePeriod=300
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.023495 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkr5\" (UniqueName: \"kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.023584 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbb4\" (UniqueName: \"kubernetes.io/projected/73635e3b-60a5-46e9-bae0-caf61d8c9e74-kube-api-access-hpbb4\") pod \"barbican-86d4-account-create-update-795z4\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " pod="openstack/barbican-86d4-account-create-update-795z4"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.023687 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.023734 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73635e3b-60a5-46e9-bae0-caf61d8c9e74-operator-scripts\") pod \"barbican-86d4-account-create-update-795z4\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " pod="openstack/barbican-86d4-account-create-update-795z4"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.024619 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73635e3b-60a5-46e9-bae0-caf61d8c9e74-operator-scripts\") pod \"barbican-86d4-account-create-update-795z4\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " pod="openstack/barbican-86d4-account-create-update-795z4"
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.025330 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.025412 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:19.525389717 +0000 UTC m=+1624.124577338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : configmap "openstack-cell1-scripts" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.027802 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86d4-account-create-update-ztlll"]
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.037188 4939 projected.go:194] Error preparing data for projected volume kube-api-access-2lkr5 for pod openstack/nova-cell1-f9fa-account-create-update-h5sth: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.037262 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5 podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:19.53724207 +0000 UTC m=+1624.136429691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2lkr5" (UniqueName: "kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.070435 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-86d4-account-create-update-ztlll"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.095305 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbb4\" (UniqueName: \"kubernetes.io/projected/73635e3b-60a5-46e9-bae0-caf61d8c9e74-kube-api-access-hpbb4\") pod \"barbican-86d4-account-create-update-795z4\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " pod="openstack/barbican-86d4-account-create-update-795z4"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.151185 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-29ff-account-create-update-vj982"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.190275 4939 generic.go:334] "Generic (PLEG): container finished" podID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerID="c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e" exitCode=2
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.190388 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18740c60-7bc8-4daa-a426-1aa624b7ac8a","Type":"ContainerDied","Data":"c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e"}
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.192677 4939 generic.go:334] "Generic (PLEG): container finished" podID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerID="4e455ae5d5a3238bfc635b57f221285060c55f8f3b0f69f228d7592bc6b0442d" exitCode=2
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.192725 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e964ed3-1c22-4d0b-b6eb-45df177b2f33","Type":"ContainerDied","Data":"4e455ae5d5a3238bfc635b57f221285060c55f8f3b0f69f228d7592bc6b0442d"}
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.222120 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qkn72"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.227917 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86d4-account-create-update-795z4"
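[editor's note: the kube-api-access-2lkr5 failures above are a different class from the missing Secret/ConfigMap ones: the projected token volume cannot be built because the pod's service account does not exist yet, so the volume keeps failing until the serviceaccount appears. Two standard oc checks, names taken from the log; the pod may of course be gone by the time you run them:]

    oc -n openstack get serviceaccount galera-openstack-cell1
    # Confirm which SA the pod actually references:
    oc -n openstack get pod nova-cell1-f9fa-account-create-update-h5sth -o jsonpath='{.spec.serviceAccountName}{"\n"}'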
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.282927 4939 generic.go:334] "Generic (PLEG): container finished" podID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerID="16ff00962bb2fdbc01658731042bb863bfb945a5da3765b64fe956c1721303df" exitCode=2
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.284106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0aa357a6-3028-4413-b384-0cbf6488f7ef","Type":"ContainerDied","Data":"16ff00962bb2fdbc01658731042bb863bfb945a5da3765b64fe956c1721303df"}
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.312609 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-29ff-account-create-update-vj982"]
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.406396 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.433044 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qkn72"]
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.438013 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.438064 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data podName:d850ac81-a29e-4e93-9fab-72b6325de52e nodeName:}" failed. No retries permitted until 2026-03-18 16:04:21.438049225 +0000 UTC m=+1626.037236846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data") pod "rabbitmq-cell1-server-0" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e") : configmap "rabbitmq-cell1-config-data" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.438802 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.442134 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.442195 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="ovn-northd"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.519992 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-hcnbj"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.539741 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkr5\" (UniqueName: \"kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.540187 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth"
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.540361 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.540420 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:20.540401811 +0000 UTC m=+1625.139589432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : configmap "openstack-cell1-scripts" not found
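[editor's note: the repeated ExecSync / "Probe errored" messages above are the readiness probe racing the container's own termination: ovn-northd had already been sent its kill (gracePeriod=30 earlier in this window), and cri-o refuses to register a new exec PID in a stopping container. That is expected teardown noise; it only matters if the same probe fails on a container that is not being killed. A correlation sketch against a saved capture; kubelet.log and the grep patterns are ours:]

    # Show the kill next to the probe failures for the same pod:
    grep -E 'Killing container with a grace period|Probe errored' kubelet.log | grep ovn-northd
    # The ExecSync failures carry the container ID rather than the name:
    grep '0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad' kubelet.log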
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.547172 4939 projected.go:194] Error preparing data for projected volume kube-api-access-2lkr5 for pod openstack/nova-cell1-f9fa-account-create-update-h5sth: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.547258 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5 podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:20.547240154 +0000 UTC m=+1625.146427775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2lkr5" (UniqueName: "kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.586494 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-hcnbj"]
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.605553 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 16:04:19 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: if [ -n "" ]; then
Mar 18 16:04:19 crc kubenswrapper[4939]: GRANT_DATABASE=""
Mar 18 16:04:19 crc kubenswrapper[4939]: else
Mar 18 16:04:19 crc kubenswrapper[4939]: GRANT_DATABASE="*"
Mar 18 16:04:19 crc kubenswrapper[4939]: fi
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: # going for maximum compatibility here:
Mar 18 16:04:19 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 18 16:04:19 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 18 16:04:19 crc kubenswrapper[4939]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 18 16:04:19 crc kubenswrapper[4939]: # support updates
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError"
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.607853 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-ljqrx" podUID="f7a5f60f-451f-45ca-ad9d-62dc13bccf66"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.609567 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pxrk6"]
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.616604 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 16:04:19 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: if [ -n "nova_api" ]; then
Mar 18 16:04:19 crc kubenswrapper[4939]: GRANT_DATABASE="nova_api"
Mar 18 16:04:19 crc kubenswrapper[4939]: else
Mar 18 16:04:19 crc kubenswrapper[4939]: GRANT_DATABASE="*"
Mar 18 16:04:19 crc kubenswrapper[4939]: fi
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: # going for maximum compatibility here:
Mar 18 16:04:19 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 18 16:04:19 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 18 16:04:19 crc kubenswrapper[4939]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 18 16:04:19 crc kubenswrapper[4939]: # support updates
Mar 18 16:04:19 crc kubenswrapper[4939]:
Mar 18 16:04:19 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError"
Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.618423 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-409c-account-create-update-m2tw5" podUID="77b48752-eea3-4627-8da2-737f8bd7b36a"
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.627426 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pxrk6"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.649464 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sss6f"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.674587 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sss6f"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.698394 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.698918 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-server" containerID="cri-o://91a292f6fe7a26c5c29f8f381a1c70bc1a4b7445389fcb1c53cbb76f807c045d" gracePeriod=30
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699349 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="swift-recon-cron" containerID="cri-o://e9186d27f8cbffc2e014f48debc8242aa5db203078eb8960a71ebcd03018d88d" gracePeriod=30
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699413 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-auditor" containerID="cri-o://43de3851a996ae5fb148b392f668b55f5e52a20759062bbd85dc2119439767c6" gracePeriod=30
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699428 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="rsync" containerID="cri-o://17b37072545ad8d250412a3ed598381f9883f45412fd6cc5eb64b9e9b471819c" gracePeriod=30
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699402 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-updater" containerID="cri-o://81945648d6ccfe19dbfc6da6c8d0e335483a55dcd4b1766b29052a5ed772cad1" gracePeriod=30
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699492 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-replicator" containerID="cri-o://f363212ee4f25a7395df3d0d667028ad88acc708868cd9d1bc2c2e84543c3dda" gracePeriod=30
Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699563 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0"
podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-expirer" containerID="cri-o://57c5515f8a5b3530a17121b168d8678bb05de6d5ca4a2d702308291f1fcd2d80" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699578 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-reaper" containerID="cri-o://eb4726099480d94bfcadfcc8ac5e8c7e0a22e0445d91b54b8bd947be6096f473" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699565 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-server" containerID="cri-o://f258992948b744aaf9d67e3c6a706143ae30356b748f5de908345ee552ac4c49" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699616 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-auditor" containerID="cri-o://8577f19335a709c20a6140281f93e100fa6126302c44206e5de996f40595a70b" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699630 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-updater" containerID="cri-o://6516dfe8b15001572d08172fcd22e95cd68f78a081a555cea105c5a94f42e2b7" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699667 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-auditor" containerID="cri-o://49a4e678ca0aa2dc3a78eb3ed7a1fd937781bddd10e5dd6b8eefa87915967bf0" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699676 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-replicator" containerID="cri-o://d54c7d302d6af9bea4dac8164b0ce249c3aa366a90bdd01e9a0627f62c76b69d" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699702 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-replicator" containerID="cri-o://5c713ae91799f792040bf3d961ee93903e389af2820041b641974aeae0138dc0" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.699751 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-server" containerID="cri-o://d04e69f4c87e17dbd5c2a71159726ed1fbba2fdd619b794a36df627a482cb171" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.706445 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-qvp84"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.718938 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-qvp84"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.741208 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-d6b5l"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.751777 4939 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-d6b5l"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.765749 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mghgn"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.780538 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mghgn"] Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.795250 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:19 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:19 crc kubenswrapper[4939]: Mar 18 16:04:19 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:19 crc kubenswrapper[4939]: Mar 18 16:04:19 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:19 crc kubenswrapper[4939]: Mar 18 16:04:19 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:19 crc kubenswrapper[4939]: Mar 18 16:04:19 crc kubenswrapper[4939]: if [ -n "placement" ]; then Mar 18 16:04:19 crc kubenswrapper[4939]: GRANT_DATABASE="placement" Mar 18 16:04:19 crc kubenswrapper[4939]: else Mar 18 16:04:19 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:19 crc kubenswrapper[4939]: fi Mar 18 16:04:19 crc kubenswrapper[4939]: Mar 18 16:04:19 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:19 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:19 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:19 crc kubenswrapper[4939]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:19 crc kubenswrapper[4939]: # support updates Mar 18 16:04:19 crc kubenswrapper[4939]: Mar 18 16:04:19 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:19 crc kubenswrapper[4939]: E0318 16:04:19.797009 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-aa27-account-create-update-m8m54" podUID="409450a4-dba2-432c-8c4b-0cc14057937d" Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.837609 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-w8b45"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.837931 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-w8b45" podUID="d66d88bf-9a85-4958-a731-258e55b7ae99" containerName="openstack-network-exporter" containerID="cri-o://8c56802c6a7daa3601b078a23b4cc0855237da751647d5e5c060ea419f09f0f9" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.874770 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-25g5r"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.902247 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-25g5r"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.922357 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mp2sj"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.936070 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.954448 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-56pdq"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.971268 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rqsf2"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.971552 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" podUID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerName="dnsmasq-dns" containerID="cri-o://4aa3a3cedd5d1ba6985398a8cde097556bb47737cbb52dca79f049e1751ffcc0" gracePeriod=10 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.982895 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-576956754b-kspq2"] Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.983212 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-576956754b-kspq2" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-log" containerID="cri-o://c4b1ae5bcdd7929e16516d94f1d8e93e3c240e0dba2fbbe3ab7b4b2d344bbbb5" gracePeriod=30 Mar 18 16:04:19 crc kubenswrapper[4939]: I0318 16:04:19.983759 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-576956754b-kspq2" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-api" containerID="cri-o://ee0d7673467ab34937d02099e23c0b13b6599f673de277c77448b47b6c8d53d7" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.016582 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d7467855-ml675"] Mar 18 16:04:20 crc 
kubenswrapper[4939]: I0318 16:04:20.016825 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d7467855-ml675" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-api" containerID="cri-o://57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.017032 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d7467855-ml675" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-httpd" containerID="cri-o://e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.039858 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.040120 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="cinder-scheduler" containerID="cri-o://abca7abb8af59dd80a723a07cc0e9834ccc7d6891529372d20c592078a41f166" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.040562 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="probe" containerID="cri-o://c87e18d10fc916c7b05bb350ccbb835b08683349a3ae8b07119f660954350f76" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.064697 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.065035 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data podName:26f60b5c-7d32-4fea-b3ca-a8132f3ed026 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:20.565012096 +0000 UTC m=+1625.164199797 (durationBeforeRetry 500ms). 
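The burst of "Killing container with a grace period" entries above is the kubelet tearing down whole pods after SyncLoop DELETE: every container in swift-storage-0, cinder, neutron, and the rest receives a stop signal and gets gracePeriod seconds (30 for most pods here, 10 for dnsmasq) before the runtime escalates to a hard kill. The interleaved rabbitmq-server-0 mount retry continues below with its Error: detail. The grace value comes from the pod spec or an explicit grace period on the delete call; a hypothetical way to inspect it:

    # Grace period configured for the pod being killed above
    kubectl -n openstack get pod swift-storage-0 \
      -o jsonpath='{.spec.terminationGracePeriodSeconds}'
    # An explicit override on deletion would look like:
    # kubectl -n openstack delete pod swift-storage-0 --grace-period=30

The exitCode=0 ContainerDied events for the swift containers further down suggest they all shut down cleanly within the window.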
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data") pod "rabbitmq-server-0" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026") : configmap "rabbitmq-config-data" not found Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.092623 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-prjpr"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.107573 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.107849 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api-log" containerID="cri-o://aba281f036d2b5e4f340ee24dde5ab36bf1f763f7ca8377787fe8cdc8c7f40da" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.108354 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api" containerID="cri-o://ce5cff00a735d1a1ef2cc4c115a9ac8265f9778ce6a2da6ac270a33f9cc7daf5" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.111919 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.112225 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-log" containerID="cri-o://d1edf6d434d6afaad7a360d12012a796e03fa6b5275394e5c67c30d4f75cf0c3" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.112403 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-httpd" containerID="cri-o://fe8d4154643530aaa76761b4600cb2e025a90e8cdab5159d3a196d0d9c425f73" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.157402 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca6e14d-75ec-40af-9670-c413af1391df" path="/var/lib/kubelet/pods/0ca6e14d-75ec-40af-9670-c413af1391df/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.159088 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10912abd-378d-4dd0-abf1-092a5e7d7043" path="/var/lib/kubelet/pods/10912abd-378d-4dd0-abf1-092a5e7d7043/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.159713 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eb451f8-d612-468e-9514-80d063ea89e6" path="/var/lib/kubelet/pods/1eb451f8-d612-468e-9514-80d063ea89e6/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.160269 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d904184-0bae-4dee-b7ce-b5e315763287" path="/var/lib/kubelet/pods/3d904184-0bae-4dee-b7ce-b5e315763287/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.161477 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e6de1a-22a0-4166-9bf0-f8844e3e89c2" path="/var/lib/kubelet/pods/59e6de1a-22a0-4166-9bf0-f8844e3e89c2/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.167344 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b" path="/var/lib/kubelet/pods/5a2e5b2f-7ea3-4f6e-b5d8-446cb2d2d24b/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.171604 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f6d01d9-17ba-47c1-8251-3f37cc126f2e" path="/var/lib/kubelet/pods/5f6d01d9-17ba-47c1-8251-3f37cc126f2e/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.173847 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be63e5e-e444-4c39-adac-35698d2bb045" path="/var/lib/kubelet/pods/6be63e5e-e444-4c39-adac-35698d2bb045/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.174456 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a302eb7-0f61-497d-96df-59aacc8d463f" path="/var/lib/kubelet/pods/7a302eb7-0f61-497d-96df-59aacc8d463f/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.175254 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d40f23c-fece-48e9-a70f-7b1309600baa" path="/var/lib/kubelet/pods/7d40f23c-fece-48e9-a70f-7b1309600baa/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.176721 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86" path="/var/lib/kubelet/pods/bcfa8252-dbf4-4a2a-aab3-5e0d966e5f86/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.177427 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c18dc2-e9c2-4d01-b076-207c4c21eb12" path="/var/lib/kubelet/pods/e9c18dc2-e9c2-4d01-b076-207c4c21eb12/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.178447 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac75776-4245-474d-89c7-7002645a64c5" path="/var/lib/kubelet/pods/eac75776-4245-474d-89c7-7002645a64c5/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.179162 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed545871-ed70-4d38-830a-8a6131455769" path="/var/lib/kubelet/pods/ed545871-ed70-4d38-830a-8a6131455769/volumes" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.189332 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-c29v8"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.195458 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-c29v8"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.213363 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.213705 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-log" containerID="cri-o://6e356ce75b4202de8978925650d9357e6b5b88d4f5afbb33cbd95353ff874608" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.214266 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-httpd" containerID="cri-o://f026dc909ac22d1539817ec6b0e34f51819c81cb961582e5bb0f18556dfcbf46" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.215729 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d is running failed: container process not found" containerID="9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.219451 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d is running failed: container process not found" containerID="9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.224409 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hlnxc"] Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.230831 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d is running failed: container process not found" containerID="9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.230903 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="ovsdbserver-nb" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.231793 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-aa27-account-create-update-m8m54"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.241283 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hlnxc"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.255701 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-fb9n8"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.257754 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-fb9n8"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.273562 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2ncc9"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.292282 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2ncc9"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.300614 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ac46-account-create-update-fqfvp"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.310070 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-aa27-account-create-update-m8m54" event={"ID":"409450a4-dba2-432c-8c4b-0cc14057937d","Type":"ContainerStarted","Data":"c2a9113061bb2638edde5ab0d85837eba0118f4118fdc4c195c2715ef044bbe5"} Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.315258 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:20 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:20 crc 
kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: if [ -n "placement" ]; then Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="placement" Mar 18 16:04:20 crc kubenswrapper[4939]: else Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:20 crc kubenswrapper[4939]: fi Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:20 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:20 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:20 crc kubenswrapper[4939]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:20 crc kubenswrapper[4939]: # support updates Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.316357 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-aa27-account-create-update-m8m54" podUID="409450a4-dba2-432c-8c4b-0cc14057937d" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.317394 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86d4-account-create-update-795z4"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.337578 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7bb7666d55-9qg76"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.337867 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7bb7666d55-9qg76" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-httpd" containerID="cri-o://c683cbe216d5c9119b57b38f335e3411920f22a528bb4f3e011449cbc759d2ac" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.338037 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7bb7666d55-9qg76" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-server" containerID="cri-o://8c27c54d024af7e070c51b4a2e852614526988b034a22bcdb0519cc69a2109e2" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.346651 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.347397 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-log" containerID="cri-o://9935f89912bbc9ccc732826d2efbd0df5765843a5c1ca0671ee97d191011142c" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.349121 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" 
containerName="nova-metadata-metadata" containerID="cri-o://a06a089fed14337403eec84f45e3b078fba61e58eeaa6ebe0c2a5e8ebeea031d" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370768 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="17b37072545ad8d250412a3ed598381f9883f45412fd6cc5eb64b9e9b471819c" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370801 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="57c5515f8a5b3530a17121b168d8678bb05de6d5ca4a2d702308291f1fcd2d80" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370813 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="6516dfe8b15001572d08172fcd22e95cd68f78a081a555cea105c5a94f42e2b7" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370821 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="49a4e678ca0aa2dc3a78eb3ed7a1fd937781bddd10e5dd6b8eefa87915967bf0" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370830 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="5c713ae91799f792040bf3d961ee93903e389af2820041b641974aeae0138dc0" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370839 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="d04e69f4c87e17dbd5c2a71159726ed1fbba2fdd619b794a36df627a482cb171" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370847 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="81945648d6ccfe19dbfc6da6c8d0e335483a55dcd4b1766b29052a5ed772cad1" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370856 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="43de3851a996ae5fb148b392f668b55f5e52a20759062bbd85dc2119439767c6" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370865 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="f363212ee4f25a7395df3d0d667028ad88acc708868cd9d1bc2c2e84543c3dda" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370875 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="eb4726099480d94bfcadfcc8ac5e8c7e0a22e0445d91b54b8bd947be6096f473" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370884 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="8577f19335a709c20a6140281f93e100fa6126302c44206e5de996f40595a70b" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370891 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="d54c7d302d6af9bea4dac8164b0ce249c3aa366a90bdd01e9a0627f62c76b69d" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.370964 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"17b37072545ad8d250412a3ed598381f9883f45412fd6cc5eb64b9e9b471819c"} Mar 18 16:04:20 crc kubenswrapper[4939]: 
I0318 16:04:20.370995 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"57c5515f8a5b3530a17121b168d8678bb05de6d5ca4a2d702308291f1fcd2d80"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371030 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"6516dfe8b15001572d08172fcd22e95cd68f78a081a555cea105c5a94f42e2b7"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371043 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"49a4e678ca0aa2dc3a78eb3ed7a1fd937781bddd10e5dd6b8eefa87915967bf0"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371056 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"5c713ae91799f792040bf3d961ee93903e389af2820041b641974aeae0138dc0"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371067 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"d04e69f4c87e17dbd5c2a71159726ed1fbba2fdd619b794a36df627a482cb171"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371077 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"81945648d6ccfe19dbfc6da6c8d0e335483a55dcd4b1766b29052a5ed772cad1"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371087 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"43de3851a996ae5fb148b392f668b55f5e52a20759062bbd85dc2119439767c6"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371097 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"f363212ee4f25a7395df3d0d667028ad88acc708868cd9d1bc2c2e84543c3dda"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371108 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"eb4726099480d94bfcadfcc8ac5e8c7e0a22e0445d91b54b8bd947be6096f473"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371119 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"8577f19335a709c20a6140281f93e100fa6126302c44206e5de996f40595a70b"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.371130 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"d54c7d302d6af9bea4dac8164b0ce249c3aa366a90bdd01e9a0627f62c76b69d"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.390424 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.390894 4939 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-log" containerID="cri-o://4607f5748ea35230c9f8cb3745bf14422c3e81e916ca3f8b80878128c7ed0bca" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.391287 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-api" containerID="cri-o://141032193e0acc0c48dcf5467fc8003161e9b27e1dbd08d3df9c676f4ba3145c" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.399637 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.437782 4939 generic.go:334] "Generic (PLEG): container finished" podID="08510a7a-ad57-44a4-9089-7558c213284b" containerID="3e958f60a62d00010cc8b06118e35abed1735c3a72b9d800113fbbaacdb1ee62" exitCode=137 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.448918 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0aa357a6-3028-4413-b384-0cbf6488f7ef/ovsdbserver-nb/0.log" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.448960 4939 generic.go:334] "Generic (PLEG): container finished" podID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerID="9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d" exitCode=143 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.449015 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0aa357a6-3028-4413-b384-0cbf6488f7ef","Type":"ContainerDied","Data":"9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.456461 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.457104 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-409c-account-create-update-m2tw5" event={"ID":"77b48752-eea3-4627-8da2-737f8bd7b36a","Type":"ContainerStarted","Data":"fb87efe8d584162abc79b46c3d88ff6a47fa0dfd2c42d747763515b32dea55b8"} Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.465062 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:20 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: if [ -n "nova_api" ]; then Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="nova_api" Mar 18 16:04:20 crc kubenswrapper[4939]: else Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:20 crc kubenswrapper[4939]: fi Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:20 crc kubenswrapper[4939]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:20 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:20 crc kubenswrapper[4939]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:20 crc kubenswrapper[4939]: # support updates Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.467015 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-409c-account-create-update-m2tw5" podUID="77b48752-eea3-4627-8da2-737f8bd7b36a" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.479849 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-409c-account-create-update-m2tw5"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.491942 4939 generic.go:334] "Generic (PLEG): container finished" podID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerID="4aa3a3cedd5d1ba6985398a8cde097556bb47737cbb52dca79f049e1751ffcc0" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.492057 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" event={"ID":"be388cde-0dc7-4b42-a62c-f790b70391c6","Type":"ContainerDied","Data":"4aa3a3cedd5d1ba6985398a8cde097556bb47737cbb52dca79f049e1751ffcc0"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.548049 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-xwzl9"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.567775 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8e964ed3-1c22-4d0b-b6eb-45df177b2f33/ovsdbserver-sb/0.log" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.567827 4939 generic.go:334] "Generic (PLEG): container finished" podID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerID="42c99df8c4f51d15393e20d51c9feb5cd6360994ff5aa297b56c8b97ec9c449e" exitCode=143 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.567914 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e964ed3-1c22-4d0b-b6eb-45df177b2f33","Type":"ContainerDied","Data":"42c99df8c4f51d15393e20d51c9feb5cd6360994ff5aa297b56c8b97ec9c449e"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.615544 4939 generic.go:334] "Generic (PLEG): container finished" podID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerID="e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa" exitCode=0 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.615649 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d7467855-ml675" event={"ID":"ecde231d-a07e-4f59-81bb-fc4608e906ea","Type":"ContainerDied","Data":"e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.629043 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config\") pod \"08510a7a-ad57-44a4-9089-7558c213284b\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.629181 4939 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-combined-ca-bundle\") pod \"08510a7a-ad57-44a4-9089-7558c213284b\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.629246 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config-secret\") pod \"08510a7a-ad57-44a4-9089-7558c213284b\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.629277 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmzjm\" (UniqueName: \"kubernetes.io/projected/08510a7a-ad57-44a4-9089-7558c213284b-kube-api-access-mmzjm\") pod \"08510a7a-ad57-44a4-9089-7558c213284b\" (UID: \"08510a7a-ad57-44a4-9089-7558c213284b\") " Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.629853 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkr5\" (UniqueName: \"kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.629962 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.630125 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.630170 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:22.630157579 +0000 UTC m=+1627.229345200 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : configmap "openstack-cell1-scripts" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.631073 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.631152 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data podName:26f60b5c-7d32-4fea-b3ca-a8132f3ed026 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:21.631121256 +0000 UTC m=+1626.230308877 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data") pod "rabbitmq-server-0" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026") : configmap "rabbitmq-config-data" not found Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.636872 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-s2smr"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.643147 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-h5sth"] Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.644451 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2lkr5 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" podUID="6884ec47-c51b-49dd-8b73-593328a782fe" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.654357 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-xwzl9"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.659161 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ljqrx" event={"ID":"f7a5f60f-451f-45ca-ad9d-62dc13bccf66","Type":"ContainerStarted","Data":"b0038580c011bc3f543685811c4a607d8018f0e6d1a763d9ca6d99b790f313bc"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.659971 4939 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-ljqrx" secret="" err="secret \"galera-openstack-cell1-dockercfg-5pfng\" not found" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.661306 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:20 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: if [ -n "neutron" ]; then Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="neutron" Mar 18 16:04:20 crc kubenswrapper[4939]: else Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:20 crc kubenswrapper[4939]: fi Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:20 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:20 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:20 crc kubenswrapper[4939]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:20 crc kubenswrapper[4939]: # support updates Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.674936 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08510a7a-ad57-44a4-9089-7558c213284b-kube-api-access-mmzjm" (OuterVolumeSpecName: "kube-api-access-mmzjm") pod "08510a7a-ad57-44a4-9089-7558c213284b" (UID: "08510a7a-ad57-44a4-9089-7558c213284b"). InnerVolumeSpecName "kube-api-access-mmzjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.676703 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:20 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: if [ -n "" ]; then Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="" Mar 18 16:04:20 crc kubenswrapper[4939]: else Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:20 crc kubenswrapper[4939]: fi Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:20 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:20 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:20 crc kubenswrapper[4939]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:20 crc kubenswrapper[4939]: # support updates Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.677611 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-bwc5l"] Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.680164 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-ljqrx" podUID="f7a5f60f-451f-45ca-ad9d-62dc13bccf66" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.680223 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-ac46-account-create-update-fqfvp" podUID="8773b5a1-c7b3-40d3-b565-3f833db4e7ef" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.683008 4939 projected.go:194] Error preparing data for projected volume kube-api-access-2lkr5 for pod openstack/nova-cell1-f9fa-account-create-update-h5sth: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.686874 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5 podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:22.686806352 +0000 UTC m=+1627.285993973 (durationBeforeRetry 2s). 
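The projected.go error above, whose MountVolume detail follows below, is the token half of the same cleanup race: kube-api-access-* volumes are projected volumes whose ServiceAccount token is minted at mount time, so once the galera-openstack-cell1 ServiceAccount is gone the kubelet can no longer fetch a token and the mount retries back off (1s earlier in the section, 2s here). Hypothetical verification from the API side:

    # NotFound here explains the failed token fetch
    kubectl -n openstack get serviceaccount galera-openstack-cell1
    # With an existing SA, minting a token by hand works (kubectl v1.24+):
    # kubectl -n openstack create token galera-openstack-cell1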
Error: MountVolume.SetUp failed for volume "kube-api-access-2lkr5" (UniqueName: "kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.691720 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w8b45_d66d88bf-9a85-4958-a731-258e55b7ae99/openstack-network-exporter/0.log" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.691773 4939 generic.go:334] "Generic (PLEG): container finished" podID="d66d88bf-9a85-4958-a731-258e55b7ae99" containerID="8c56802c6a7daa3601b078a23b4cc0855237da751647d5e5c060ea419f09f0f9" exitCode=2 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.691864 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w8b45" event={"ID":"d66d88bf-9a85-4958-a731-258e55b7ae99","Type":"ContainerDied","Data":"8c56802c6a7daa3601b078a23b4cc0855237da751647d5e5c060ea419f09f0f9"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.704428 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-s2smr"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.713653 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bzpsk"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.722067 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-bwc5l"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.724183 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "08510a7a-ad57-44a4-9089-7558c213284b" (UID: "08510a7a-ad57-44a4-9089-7558c213284b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.725670 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:20 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: if [ -n "nova_cell0" ]; then Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="nova_cell0" Mar 18 16:04:20 crc kubenswrapper[4939]: else Mar 18 16:04:20 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:20 crc kubenswrapper[4939]: fi Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:20 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:20 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:20 crc kubenswrapper[4939]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:20 crc kubenswrapper[4939]: # support updates Mar 18 16:04:20 crc kubenswrapper[4939]: Mar 18 16:04:20 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.727488 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" podUID="abc847c2-3903-44d4-aa4d-0a7e16709041" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.727834 4939 generic.go:334] "Generic (PLEG): container finished" podID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerID="aba281f036d2b5e4f340ee24dde5ab36bf1f763f7ca8377787fe8cdc8c7f40da" exitCode=143 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.727954 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2","Type":"ContainerDied","Data":"aba281f036d2b5e4f340ee24dde5ab36bf1f763f7ca8377787fe8cdc8c7f40da"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.734945 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmzjm\" (UniqueName: \"kubernetes.io/projected/08510a7a-ad57-44a4-9089-7558c213284b-kube-api-access-mmzjm\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.734982 4939 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.735058 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.735105 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts podName:f7a5f60f-451f-45ca-ad9d-62dc13bccf66 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:21.235087668 +0000 UTC m=+1625.834275289 (durationBeforeRetry 500ms). 
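[Editor's note] The mariadb-account-create-update command above is cut off at its heredoc ($MYSQL_CMD < ...), so the actual SQL never made it into the journal and is left as-is. The script's own comments do state the intended pattern: create the account in the one form both engines accept, then apply password and TLS settings with ALTER so a rerun updates in place. A minimal illustrative sketch of that pattern follows; only GRANT_DATABASE="nova_cell0" and the DatabasePassword environment variable come from the log, while the account name and host are hypothetical (the logged MYSQL_CMD has an empty -h argument because MYSQL_REMOTE_HOST was empty in this run):

#!/bin/bash
# Illustrative sketch only; the real heredoc is truncated in the log above.
GRANT_DATABASE="nova_cell0"      # from the log
ACCOUNT="nova_cell0"             # hypothetical account name
DB_HOST="mariadb.example"        # hypothetical; '-h' was empty in this run
MYSQL_CMD="mysql -h ${DB_HOST} -u root -P 3306"

# DatabasePassword is expected from the pod environment, per the export above.
$MYSQL_CMD <<EOF
-- comments 1 and 2: CREATE USER IF NOT EXISTS is the portable form, since
-- MySQL 8 dropped implicit create-on-GRANT and lacks MariaDB's CREATE OR REPLACE
CREATE USER IF NOT EXISTS '${ACCOUNT}'@'%';
-- comment 3: password (and any TLS options) go through ALTER so a rerun
-- updates the existing account instead of failing
ALTER USER '${ACCOUNT}'@'%' IDENTIFIED BY '${DatabasePassword}';
GRANT ALL PRIVILEGES ON ${GRANT_DATABASE}.* TO '${ACCOUNT}'@'%';
EOF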
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts") pod "root-account-create-update-ljqrx" (UID: "f7a5f60f-451f-45ca-ad9d-62dc13bccf66") : configmap "openstack-cell1-scripts" not found Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.736290 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bzpsk"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.739401 4939 generic.go:334] "Generic (PLEG): container finished" podID="df7cba1f-8d56-47c9-8016-3184a1374386" containerID="c4b1ae5bcdd7929e16516d94f1d8e93e3c240e0dba2fbbe3ab7b4b2d344bbbb5" exitCode=143 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.739660 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-prjpr" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="registry-server" containerID="cri-o://006bb20fbcdd84b51be7b526c598c7ecd52ed39ad5a155207701e52879954474" gracePeriod=2 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.739808 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-576956754b-kspq2" event={"ID":"df7cba1f-8d56-47c9-8016-3184a1374386","Type":"ContainerDied","Data":"c4b1ae5bcdd7929e16516d94f1d8e93e3c240e0dba2fbbe3ab7b4b2d344bbbb5"} Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.752325 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-ztzzs"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.769253 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-74465b498-l8mz2"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.769571 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-74465b498-l8mz2" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api-log" containerID="cri-o://c8f94483b5054ecb13ba070dc607cd5aefe23c9ba71716aa8ae0e310aed70d7c" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.770083 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-74465b498-l8mz2" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api" containerID="cri-o://c14895b390882b2c1923b6999a196557393e0c01f808b080026c9ef7f7e1d1fd" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.782849 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5b4578f6d7-lcqvz"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.783157 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker-log" containerID="cri-o://944a706537fc625973ddcc25ce21c69f44e3b1c0a43d6217239bde407bac3b36" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.783687 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker" containerID="cri-o://50ab95e7c021b4416c51b15d476a07a0ef51bd8ab440f9053107700d4fa85ea2" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.799079 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6659dc68fd-4444w"] Mar 18 
16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.799406 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener-log" containerID="cri-o://e96d96e17fe2569029cec18b64c447837c38db0a15c1372e48c2e74d08a2fd50" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.799963 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener" containerID="cri-o://e02a12f9caf7aff3ab019570b585d66908b0f4cfe93a1f0a694c9e99cffe1bfe" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.817225 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.817297 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.818217 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fd953f11b183a67f290c13bd5195a8236a5c036552d0d61a21e2836c0d47adfb" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.827457 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ljqrx"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.833331 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ljqrx"] Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.837094 4939 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.837158 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:24.837144427 +0000 UTC m=+1629.436332048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-config-data" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.837195 4939 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.837218 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:24.837210409 +0000 UTC m=+1629.436398030 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-scripts" not found Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.845537 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-409c-account-create-update-m2tw5"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.847381 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08510a7a-ad57-44a4-9089-7558c213284b" (UID: "08510a7a-ad57-44a4-9089-7558c213284b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.852262 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-aa27-account-create-update-m8m54"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.905098 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerName="rabbitmq" containerID="cri-o://a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f" gracePeriod=604800 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.907415 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.909700 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "08510a7a-ad57-44a4-9089-7558c213284b" (UID: "08510a7a-ad57-44a4-9089-7558c213284b"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.938209 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.938237 4939 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08510a7a-ad57-44a4-9089-7558c213284b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.943162 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ac46-account-create-update-fqfvp"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.947952 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-ztzzs"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.955816 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.956125 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d5203f87-b63b-45f5-95e3-c536406909e5" containerName="nova-scheduler-scheduler" containerID="cri-o://3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.960225 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerName="rabbitmq" containerID="cri-o://1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514" gracePeriod=604800 Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.964370 4939 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 18 16:04:20 crc kubenswrapper[4939]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 16:04:20 crc kubenswrapper[4939]: + source /usr/local/bin/container-scripts/functions Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNBridge=br-int Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNRemote=tcp:localhost:6642 Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNEncapType=geneve Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNAvailabilityZones= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ EnableChassisAsGateway=true Mar 18 16:04:20 crc kubenswrapper[4939]: ++ PhysicalNetworks= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNHostName= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 16:04:20 crc kubenswrapper[4939]: ++ ovs_dir=/var/lib/openvswitch Mar 18 16:04:20 crc kubenswrapper[4939]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 16:04:20 crc kubenswrapper[4939]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 16:04:20 crc kubenswrapper[4939]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 16:04:20 crc kubenswrapper[4939]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 16:04:20 crc kubenswrapper[4939]: + sleep 0.5 Mar 18 16:04:20 crc kubenswrapper[4939]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 16:04:20 crc kubenswrapper[4939]: + cleanup_ovsdb_server_semaphore Mar 18 16:04:20 crc kubenswrapper[4939]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 16:04:20 crc kubenswrapper[4939]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 16:04:20 crc kubenswrapper[4939]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-56pdq" message=< Mar 18 16:04:20 crc kubenswrapper[4939]: Exiting ovsdb-server (5) [ OK ] Mar 18 16:04:20 crc kubenswrapper[4939]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 16:04:20 crc kubenswrapper[4939]: + source /usr/local/bin/container-scripts/functions Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNBridge=br-int Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNRemote=tcp:localhost:6642 Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNEncapType=geneve Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNAvailabilityZones= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ EnableChassisAsGateway=true Mar 18 16:04:20 crc kubenswrapper[4939]: ++ PhysicalNetworks= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNHostName= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 16:04:20 crc kubenswrapper[4939]: ++ ovs_dir=/var/lib/openvswitch Mar 18 16:04:20 crc kubenswrapper[4939]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 16:04:20 crc kubenswrapper[4939]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 16:04:20 crc kubenswrapper[4939]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 16:04:20 crc kubenswrapper[4939]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 16:04:20 crc kubenswrapper[4939]: + sleep 0.5 Mar 18 16:04:20 crc kubenswrapper[4939]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 16:04:20 crc kubenswrapper[4939]: + cleanup_ovsdb_server_semaphore Mar 18 16:04:20 crc kubenswrapper[4939]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 16:04:20 crc kubenswrapper[4939]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 16:04:20 crc kubenswrapper[4939]: > Mar 18 16:04:20 crc kubenswrapper[4939]: E0318 16:04:20.964454 4939 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 18 16:04:20 crc kubenswrapper[4939]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 16:04:20 crc kubenswrapper[4939]: + source /usr/local/bin/container-scripts/functions Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNBridge=br-int Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNRemote=tcp:localhost:6642 Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNEncapType=geneve Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNAvailabilityZones= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ EnableChassisAsGateway=true Mar 18 16:04:20 crc kubenswrapper[4939]: ++ PhysicalNetworks= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ OVNHostName= Mar 18 16:04:20 crc kubenswrapper[4939]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 16:04:20 crc kubenswrapper[4939]: ++ ovs_dir=/var/lib/openvswitch Mar 18 16:04:20 crc kubenswrapper[4939]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 16:04:20 crc kubenswrapper[4939]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 16:04:20 crc kubenswrapper[4939]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 16:04:20 crc kubenswrapper[4939]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 16:04:20 crc kubenswrapper[4939]: + sleep 0.5 Mar 18 16:04:20 crc kubenswrapper[4939]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 16:04:20 crc kubenswrapper[4939]: + cleanup_ovsdb_server_semaphore Mar 18 16:04:20 crc kubenswrapper[4939]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 16:04:20 crc kubenswrapper[4939]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 16:04:20 crc kubenswrapper[4939]: > pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" containerID="cri-o://de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.964525 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" containerID="cri-o://de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" gracePeriod=29 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.984071 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerName="galera" containerID="cri-o://58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29" gracePeriod=30 Mar 18 16:04:20 crc kubenswrapper[4939]: I0318 16:04:20.996677 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-98ww2"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.009368 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-98ww2"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.029288 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.030490 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8" gracePeriod=30 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.044407 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w8b45_d66d88bf-9a85-4958-a731-258e55b7ae99/openstack-network-exporter/0.log" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.044471 4939 util.go:48] "No ready sandbox for pod can be found. 
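[Editor's note] The xtrace in the PreStop failure above exposes the whole control flow of stop-ovsdb-server.sh: poll for a semaphore file that marks it safe to stop, remove it, then stop only ovsdb-server. Exit status 137 is 128 plus SIGKILL, i.e. the hook process was killed rather than exiting on its own, even though the message block shows ovs-ctl reporting "Exiting ovsdb-server (5) [ OK ]". A plausible reconstruction from that trace follows; the real script ships in the ovn-controller-ovs image and may differ, and cleanup_ovsdb_server_semaphore is actually defined in the sourced functions file:

#!/bin/bash
set -x
# The sourced file sets the variables echoed as '++' assignments in the trace,
# including SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE.
source "$(dirname "$0")/functions"

cleanup_ovsdb_server_semaphore() {
    # In the real image this function comes from the sourced functions file.
    rm -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE"
}

# Poll until ovn-controller has flagged that ovsdb-server may stop; the trace
# shows exactly this pattern ('[ ! -f ... ]', 'sleep 0.5', test again).
while [ ! -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE" ]; do
    sleep 0.5
done

cleanup_ovsdb_server_semaphore
# Stop only ovsdb-server; ovs-vswitchd runs in its own container and is
# stopped separately (note its gracePeriod=29 kill below).
/usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd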
Need to start a new one" pod="openstack/ovn-controller-metrics-w8b45" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.050914 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.051096 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="1c2e6985-9642-41e2-8b6f-174c96e86281" containerName="nova-cell1-conductor-conductor" containerID="cri-o://1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" gracePeriod=30 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.081958 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd" containerID="cri-o://8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" gracePeriod=29 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.084294 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l7cqx"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.086948 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8e964ed3-1c22-4d0b-b6eb-45df177b2f33/ovsdbserver-sb/0.log" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.087014 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.092196 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0aa357a6-3028-4413-b384-0cbf6488f7ef/ovsdbserver-nb/0.log" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.092275 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.092813 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l7cqx"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.113379 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86d4-account-create-update-795z4"] Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.119594 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:21 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: if [ -n "barbican" ]; then Mar 18 16:04:21 crc kubenswrapper[4939]: GRANT_DATABASE="barbican" Mar 18 16:04:21 crc kubenswrapper[4939]: else Mar 18 16:04:21 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:21 crc kubenswrapper[4939]: fi Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:21 crc kubenswrapper[4939]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:21 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:21 crc kubenswrapper[4939]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:21 crc kubenswrapper[4939]: # support updates Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.120962 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-86d4-account-create-update-795z4" podUID="73635e3b-60a5-46e9-bae0-caf61d8c9e74" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.121384 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.151776 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-combined-ca-bundle\") pod \"d66d88bf-9a85-4958-a731-258e55b7ae99\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.152032 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdb-rundir\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.152416 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdb-rundir\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.152708 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr9wk\" (UniqueName: \"kubernetes.io/projected/0aa357a6-3028-4413-b384-0cbf6488f7ef-kube-api-access-pr9wk\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.152922 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4zdt\" (UniqueName: \"kubernetes.io/projected/d66d88bf-9a85-4958-a731-258e55b7ae99-kube-api-access-g4zdt\") pod \"d66d88bf-9a85-4958-a731-258e55b7ae99\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.155348 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdbserver-nb-tls-certs\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.155493 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-scripts\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc 
kubenswrapper[4939]: I0318 16:04:21.155583 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zrl\" (UniqueName: \"kubernetes.io/projected/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-kube-api-access-n5zrl\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.155665 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.152892 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.155155 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.155866 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdbserver-sb-tls-certs\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.155950 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-config\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156014 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-combined-ca-bundle\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156102 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156204 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-metrics-certs-tls-certs\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156293 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovn-rundir\") pod \"d66d88bf-9a85-4958-a731-258e55b7ae99\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156388 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66d88bf-9a85-4958-a731-258e55b7ae99-config\") pod \"d66d88bf-9a85-4958-a731-258e55b7ae99\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156532 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-config\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156609 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-metrics-certs-tls-certs\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156698 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovs-rundir\") pod \"d66d88bf-9a85-4958-a731-258e55b7ae99\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156776 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-combined-ca-bundle\") pod \"0aa357a6-3028-4413-b384-0cbf6488f7ef\" (UID: \"0aa357a6-3028-4413-b384-0cbf6488f7ef\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156840 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-scripts" (OuterVolumeSpecName: "scripts") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156847 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-scripts\") pod \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\" (UID: \"8e964ed3-1c22-4d0b-b6eb-45df177b2f33\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.156921 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-metrics-certs-tls-certs\") pod \"d66d88bf-9a85-4958-a731-258e55b7ae99\" (UID: \"d66d88bf-9a85-4958-a731-258e55b7ae99\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.158303 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-scripts" (OuterVolumeSpecName: "scripts") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.158402 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "d66d88bf-9a85-4958-a731-258e55b7ae99" (UID: "d66d88bf-9a85-4958-a731-258e55b7ae99"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.158828 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-config" (OuterVolumeSpecName: "config") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.159053 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.159141 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa357a6-3028-4413-b384-0cbf6488f7ef-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.159217 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.159294 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.159366 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.159444 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.160309 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66d88bf-9a85-4958-a731-258e55b7ae99-config" (OuterVolumeSpecName: "config") pod "d66d88bf-9a85-4958-a731-258e55b7ae99" (UID: "d66d88bf-9a85-4958-a731-258e55b7ae99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.160612 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-kube-api-access-n5zrl" (OuterVolumeSpecName: "kube-api-access-n5zrl") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "kube-api-access-n5zrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.160947 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-config" (OuterVolumeSpecName: "config") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.165715 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "d66d88bf-9a85-4958-a731-258e55b7ae99" (UID: "d66d88bf-9a85-4958-a731-258e55b7ae99"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.169060 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.175930 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa357a6-3028-4413-b384-0cbf6488f7ef-kube-api-access-pr9wk" (OuterVolumeSpecName: "kube-api-access-pr9wk") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "kube-api-access-pr9wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.176856 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.180842 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66d88bf-9a85-4958-a731-258e55b7ae99-kube-api-access-g4zdt" (OuterVolumeSpecName: "kube-api-access-g4zdt") pod "d66d88bf-9a85-4958-a731-258e55b7ae99" (UID: "d66d88bf-9a85-4958-a731-258e55b7ae99"). InnerVolumeSpecName "kube-api-access-g4zdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.187424 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f68996_05bc_4432_ac98_c730b09c6288.slice/crio-de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de1bfe9_c6f0_46c0_bd41_318b139b0f41.slice/crio-c683cbe216d5c9119b57b38f335e3411920f22a528bb4f3e011449cbc759d2ac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f68996_05bc_4432_ac98_c730b09c6288.slice/crio-conmon-de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c83b398_2fa8_4862_a2fe_6f66e3200216.slice/crio-conmon-e96d96e17fe2569029cec18b64c447837c38db0a15c1372e48c2e74d08a2fd50.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de1bfe9_c6f0_46c0_bd41_318b139b0f41.slice/crio-8c27c54d024af7e070c51b4a2e852614526988b034a22bcdb0519cc69a2109e2.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.214275 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d66d88bf-9a85-4958-a731-258e55b7ae99" (UID: "d66d88bf-9a85-4958-a731-258e55b7ae99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.226087 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.226131 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.237641 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.238041 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.240075 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.247940 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.248292 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d5203f87-b63b-45f5-95e3-c536406909e5" containerName="nova-scheduler-scheduler" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.252058 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.252148 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="1c2e6985-9642-41e2-8b6f-174c96e86281" containerName="nova-cell1-conductor-conductor" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.260253 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-sb\") pod \"be388cde-0dc7-4b42-a62c-f790b70391c6\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.260418 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-svc\") pod \"be388cde-0dc7-4b42-a62c-f790b70391c6\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.260677 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-config\") pod \"be388cde-0dc7-4b42-a62c-f790b70391c6\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.260774 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt2cs\" (UniqueName: \"kubernetes.io/projected/be388cde-0dc7-4b42-a62c-f790b70391c6-kube-api-access-pt2cs\") pod \"be388cde-0dc7-4b42-a62c-f790b70391c6\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.260905 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-swift-storage-0\") pod \"be388cde-0dc7-4b42-a62c-f790b70391c6\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.261061 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-nb\") pod \"be388cde-0dc7-4b42-a62c-f790b70391c6\" (UID: \"be388cde-0dc7-4b42-a62c-f790b70391c6\") " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.261575 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66d88bf-9a85-4958-a731-258e55b7ae99-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.261884 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.262568 4939 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d66d88bf-9a85-4958-a731-258e55b7ae99-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.262671 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.262776 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.262854 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr9wk\" (UniqueName: \"kubernetes.io/projected/0aa357a6-3028-4413-b384-0cbf6488f7ef-kube-api-access-pr9wk\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.262917 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4zdt\" (UniqueName: \"kubernetes.io/projected/d66d88bf-9a85-4958-a731-258e55b7ae99-kube-api-access-g4zdt\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.262994 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5zrl\" (UniqueName: \"kubernetes.io/projected/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-kube-api-access-n5zrl\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.263087 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.263232 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.262117 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.262595 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.267481 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts podName:f7a5f60f-451f-45ca-ad9d-62dc13bccf66 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:22.267447751 +0000 UTC m=+1626.866635372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts") pod "root-account-create-update-ljqrx" (UID: "f7a5f60f-451f-45ca-ad9d-62dc13bccf66") : configmap "openstack-cell1-scripts" not found Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.303678 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be388cde-0dc7-4b42-a62c-f790b70391c6-kube-api-access-pt2cs" (OuterVolumeSpecName: "kube-api-access-pt2cs") pod "be388cde-0dc7-4b42-a62c-f790b70391c6" (UID: "be388cde-0dc7-4b42-a62c-f790b70391c6"). InnerVolumeSpecName "kube-api-access-pt2cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.351025 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.366915 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt2cs\" (UniqueName: \"kubernetes.io/projected/be388cde-0dc7-4b42-a62c-f790b70391c6-kube-api-access-pt2cs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.366972 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.366986 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.384693 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.440070 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.441906 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.448145 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be388cde-0dc7-4b42-a62c-f790b70391c6" (UID: "be388cde-0dc7-4b42-a62c-f790b70391c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.462729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be388cde-0dc7-4b42-a62c-f790b70391c6" (UID: "be388cde-0dc7-4b42-a62c-f790b70391c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.481373 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.481431 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.481556 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data podName:d850ac81-a29e-4e93-9fab-72b6325de52e nodeName:}" failed. No retries permitted until 2026-03-18 16:04:25.481534448 +0000 UTC m=+1630.080722079 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data") pod "rabbitmq-cell1-server-0" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e") : configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.481873 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.481916 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.481931 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.481944 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.486240 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-config" (OuterVolumeSpecName: "config") pod "be388cde-0dc7-4b42-a62c-f790b70391c6" (UID: "be388cde-0dc7-4b42-a62c-f790b70391c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.505134 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "be388cde-0dc7-4b42-a62c-f790b70391c6" (UID: "be388cde-0dc7-4b42-a62c-f790b70391c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.510934 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "be388cde-0dc7-4b42-a62c-f790b70391c6" (UID: "be388cde-0dc7-4b42-a62c-f790b70391c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.528615 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d66d88bf-9a85-4958-a731-258e55b7ae99" (UID: "d66d88bf-9a85-4958-a731-258e55b7ae99"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.537413 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "0aa357a6-3028-4413-b384-0cbf6488f7ef" (UID: "0aa357a6-3028-4413-b384-0cbf6488f7ef"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.567012 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "8e964ed3-1c22-4d0b-b6eb-45df177b2f33" (UID: "8e964ed3-1c22-4d0b-b6eb-45df177b2f33"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.584730 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.584781 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e964ed3-1c22-4d0b-b6eb-45df177b2f33-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.584795 4939 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.584807 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa357a6-3028-4413-b384-0cbf6488f7ef-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.584818 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be388cde-0dc7-4b42-a62c-f790b70391c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.584830 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66d88bf-9a85-4958-a731-258e55b7ae99-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.686265 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.686655 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data podName:26f60b5c-7d32-4fea-b3ca-a8132f3ed026 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:23.686638772 +0000 UTC m=+1628.285826393 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data") pod "rabbitmq-server-0" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026") : configmap "rabbitmq-config-data" not found Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.764165 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8e964ed3-1c22-4d0b-b6eb-45df177b2f33/ovsdbserver-sb/0.log" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.764301 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.764298 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8e964ed3-1c22-4d0b-b6eb-45df177b2f33","Type":"ContainerDied","Data":"32bf57067fcb26420b62eb765a24b8522a5894c707ee084079247d1fa428e5ab"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.764356 4939 scope.go:117] "RemoveContainer" containerID="4e455ae5d5a3238bfc635b57f221285060c55f8f3b0f69f228d7592bc6b0442d" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.784347 4939 generic.go:334] "Generic (PLEG): container finished" podID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerID="9935f89912bbc9ccc732826d2efbd0df5765843a5c1ca0671ee97d191011142c" exitCode=143 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.784408 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e71cb7a9-1ab5-4596-901f-314dcfae2bc4","Type":"ContainerDied","Data":"9935f89912bbc9ccc732826d2efbd0df5765843a5c1ca0671ee97d191011142c"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.788240 4939 generic.go:334] "Generic (PLEG): container finished" podID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerID="e96d96e17fe2569029cec18b64c447837c38db0a15c1372e48c2e74d08a2fd50" exitCode=143 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.788297 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" event={"ID":"8c83b398-2fa8-4862-a2fe-6f66e3200216","Type":"ContainerDied","Data":"e96d96e17fe2569029cec18b64c447837c38db0a15c1372e48c2e74d08a2fd50"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.790318 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0aa357a6-3028-4413-b384-0cbf6488f7ef/ovsdbserver-nb/0.log" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.790366 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0aa357a6-3028-4413-b384-0cbf6488f7ef","Type":"ContainerDied","Data":"4aab8b140c6c62d51bda2069f7bca7427b5d1cd43b25057ff4df8994f7ac04fd"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.790448 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.800353 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w8b45_d66d88bf-9a85-4958-a731-258e55b7ae99/openstack-network-exporter/0.log" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.800420 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w8b45" event={"ID":"d66d88bf-9a85-4958-a731-258e55b7ae99","Type":"ContainerDied","Data":"d2ce4dbece6686c3452b2c0b1ad45fe5a02108b975ec72d873f64d8337a53996"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.800497 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-w8b45" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.803163 4939 generic.go:334] "Generic (PLEG): container finished" podID="a2d02491-90d4-41b4-884d-0959feb366b0" containerID="4607f5748ea35230c9f8cb3745bf14422c3e81e916ca3f8b80878128c7ed0bca" exitCode=143 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.803232 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2d02491-90d4-41b4-884d-0959feb366b0","Type":"ContainerDied","Data":"4607f5748ea35230c9f8cb3745bf14422c3e81e916ca3f8b80878128c7ed0bca"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.820393 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="f258992948b744aaf9d67e3c6a706143ae30356b748f5de908345ee552ac4c49" exitCode=0 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.820424 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="91a292f6fe7a26c5c29f8f381a1c70bc1a4b7445389fcb1c53cbb76f807c045d" exitCode=0 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.820474 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"f258992948b744aaf9d67e3c6a706143ae30356b748f5de908345ee552ac4c49"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.820521 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"91a292f6fe7a26c5c29f8f381a1c70bc1a4b7445389fcb1c53cbb76f807c045d"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.825594 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" event={"ID":"be388cde-0dc7-4b42-a62c-f790b70391c6","Type":"ContainerDied","Data":"fcb0c665b45b585050778d9fdd48d53c42fe0e155c812c70d4f78934cfc0de1d"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.825725 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rqsf2" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.835287 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ac46-account-create-update-fqfvp" event={"ID":"8773b5a1-c7b3-40d3-b565-3f833db4e7ef","Type":"ContainerStarted","Data":"a835d6e65c39d11fbe97358798315911e62607ee8da819c6865fce3ff079dd96"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.858012 4939 generic.go:334] "Generic (PLEG): container finished" podID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerID="8c27c54d024af7e070c51b4a2e852614526988b034a22bcdb0519cc69a2109e2" exitCode=0 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.858045 4939 generic.go:334] "Generic (PLEG): container finished" podID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerID="c683cbe216d5c9119b57b38f335e3411920f22a528bb4f3e011449cbc759d2ac" exitCode=0 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.858126 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bb7666d55-9qg76" event={"ID":"8de1bfe9-c6f0-46c0-bd41-318b139b0f41","Type":"ContainerDied","Data":"8c27c54d024af7e070c51b4a2e852614526988b034a22bcdb0519cc69a2109e2"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.858158 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bb7666d55-9qg76" event={"ID":"8de1bfe9-c6f0-46c0-bd41-318b139b0f41","Type":"ContainerDied","Data":"c683cbe216d5c9119b57b38f335e3411920f22a528bb4f3e011449cbc759d2ac"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.868185 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86d4-account-create-update-795z4" event={"ID":"73635e3b-60a5-46e9-bae0-caf61d8c9e74","Type":"ContainerStarted","Data":"35dd2998a57661a5923f905a899244b4b1239d1bb7b049678b85de40efe84ac0"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.876476 4939 generic.go:334] "Generic (PLEG): container finished" podID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerID="6e356ce75b4202de8978925650d9357e6b5b88d4f5afbb33cbd95353ff874608" exitCode=143 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.876588 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb92b15e-a854-4505-97e2-37e4a7b821b4","Type":"ContainerDied","Data":"6e356ce75b4202de8978925650d9357e6b5b88d4f5afbb33cbd95353ff874608"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.880090 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.882599 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2dr5q"] Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.883055 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="ovsdbserver-sb" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883077 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="ovsdbserver-sb" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.883096 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="ovsdbserver-nb" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883104 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="ovsdbserver-nb" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.883129 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerName="dnsmasq-dns" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883136 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerName="dnsmasq-dns" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.883152 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerName="init" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883161 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerName="init" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.883172 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883179 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.883190 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66d88bf-9a85-4958-a731-258e55b7ae99" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883197 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66d88bf-9a85-4958-a731-258e55b7ae99" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.883212 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883219 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883438 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="ovsdbserver-sb" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883450 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883466 4939 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883482 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="be388cde-0dc7-4b42-a62c-f790b70391c6" containerName="dnsmasq-dns" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883494 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66d88bf-9a85-4958-a731-258e55b7ae99" containerName="openstack-network-exporter" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.883529 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" containerName="ovsdbserver-nb" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.884210 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.887299 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.888269 4939 generic.go:334] "Generic (PLEG): container finished" podID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerID="c8f94483b5054ecb13ba070dc607cd5aefe23c9ba71716aa8ae0e310aed70d7c" exitCode=143 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.888414 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74465b498-l8mz2" event={"ID":"42a70df8-1617-448d-9495-5aa55d8b97fb","Type":"ContainerDied","Data":"c8f94483b5054ecb13ba070dc607cd5aefe23c9ba71716aa8ae0e310aed70d7c"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.896541 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" event={"ID":"abc847c2-3903-44d4-aa4d-0a7e16709041","Type":"ContainerStarted","Data":"1fdb9853930da2cb76c945d655bd4514cd8df54d11038f39529feae5264ca418"} Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.905004 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:21 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: if [ -n "nova_cell0" ]; then Mar 18 16:04:21 crc kubenswrapper[4939]: GRANT_DATABASE="nova_cell0" Mar 18 16:04:21 crc kubenswrapper[4939]: else Mar 18 16:04:21 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:21 crc kubenswrapper[4939]: fi Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:21 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:21 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:21 crc kubenswrapper[4939]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:21 crc kubenswrapper[4939]: # support updates Mar 18 16:04:21 crc kubenswrapper[4939]: Mar 18 16:04:21 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:21 crc kubenswrapper[4939]: E0318 16:04:21.906247 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" podUID="abc847c2-3903-44d4-aa4d-0a7e16709041" Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.907207 4939 generic.go:334] "Generic (PLEG): container finished" podID="50baf265-a6d8-445d-aed6-853781644d9e" containerID="d1edf6d434d6afaad7a360d12012a796e03fa6b5275394e5c67c30d4f75cf0c3" exitCode=143 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.907270 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50baf265-a6d8-445d-aed6-853781644d9e","Type":"ContainerDied","Data":"d1edf6d434d6afaad7a360d12012a796e03fa6b5275394e5c67c30d4f75cf0c3"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.922992 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2dr5q"] Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.967382 4939 generic.go:334] "Generic (PLEG): container finished" podID="52f68996-05bc-4432-ac98-c730b09c6288" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" exitCode=0 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.967465 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-56pdq" event={"ID":"52f68996-05bc-4432-ac98-c730b09c6288","Type":"ContainerDied","Data":"de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.994435 4939 generic.go:334] "Generic (PLEG): container finished" podID="dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" containerID="fd953f11b183a67f290c13bd5195a8236a5c036552d0d61a21e2836c0d47adfb" exitCode=0 Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.994523 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666","Type":"ContainerDied","Data":"fd953f11b183a67f290c13bd5195a8236a5c036552d0d61a21e2836c0d47adfb"} Mar 18 16:04:21 crc kubenswrapper[4939]: I0318 16:04:21.998017 4939 util.go:48] "No ready sandbox for pod can be found. 
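
The container command embedded in the "Unhandled Error" dump above is cut off at the heredoc ($MYSQL_CMD <). Going only by the fragments that survive (the GRANT_DATABASE selection and the script's own comments about CREATE vs. ALTER), a plausible completion of the truncated SQL section looks like the sketch below; the exact statements, and the USERNAME variable, are assumptions, not recovered from the log:

    #!/bin/bash
    # Hypothetical completion of the truncated account-create script.
    # Per the comments preserved in the log: CREATE the user first (MySQL 8
    # no longer creates users implicitly on GRANT), then apply password and
    # TLS settings with ALTER so re-runs behave as updates.
    $MYSQL_CMD <<EOSQL
    CREATE USER IF NOT EXISTS '${USERNAME}'@'%';
    ALTER USER '${USERNAME}'@'%' IDENTIFIED BY '${DatabasePassword}';
    GRANT ALL PRIVILEGES ON \`${GRANT_DATABASE}\`.* TO '${USERNAME}'@'%';
    EOSQL
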
Need to start a new one" pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.000394 4939 scope.go:117] "RemoveContainer" containerID="42c99df8c4f51d15393e20d51c9feb5cd6360994ff5aa297b56c8b97ec9c449e" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.009607 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.017647 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.018259 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts\") pod \"root-account-create-update-2dr5q\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.022053 4939 generic.go:334] "Generic (PLEG): container finished" podID="f757e65c-c660-4614-bb43-38b9beb092e9" containerID="c87e18d10fc916c7b05bb350ccbb835b08683349a3ae8b07119f660954350f76" exitCode=0 Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.022141 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f757e65c-c660-4614-bb43-38b9beb092e9","Type":"ContainerDied","Data":"c87e18d10fc916c7b05bb350ccbb835b08683349a3ae8b07119f660954350f76"} Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.023713 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgg4\" (UniqueName: \"kubernetes.io/projected/8269f4a0-d0d4-4620-9c3e-885d453b7109-kube-api-access-8dgg4\") pod \"root-account-create-update-2dr5q\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.035114 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.038659 4939 generic.go:334] "Generic (PLEG): container finished" podID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerID="006bb20fbcdd84b51be7b526c598c7ecd52ed39ad5a155207701e52879954474" exitCode=0 Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.038811 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prjpr" event={"ID":"cefea5f4-f44f-4e73-b762-38adc00ce1eb","Type":"ContainerDied","Data":"006bb20fbcdd84b51be7b526c598c7ecd52ed39ad5a155207701e52879954474"} Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.038917 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-prjpr" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.052716 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.060624 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-w8b45"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.064179 4939 generic.go:334] "Generic (PLEG): container finished" podID="86474b5e-6fc8-4810-a083-699878062ade" containerID="944a706537fc625973ddcc25ce21c69f44e3b1c0a43d6217239bde407bac3b36" exitCode=143 Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.064450 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" event={"ID":"86474b5e-6fc8-4810-a083-699878062ade","Type":"ContainerDied","Data":"944a706537fc625973ddcc25ce21c69f44e3b1c0a43d6217239bde407bac3b36"} Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.064538 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.065028 4939 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-ljqrx" secret="" err="secret \"galera-openstack-cell1-dockercfg-5pfng\" not found" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.071604 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-w8b45"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.083851 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.095067 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.095189 4939 scope.go:117] "RemoveContainer" containerID="16ff00962bb2fdbc01658731042bb863bfb945a5da3765b64fe956c1721303df" Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.101772 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:22 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: if [ -n "nova_api" ]; then Mar 18 16:04:22 crc kubenswrapper[4939]: GRANT_DATABASE="nova_api" Mar 18 16:04:22 crc kubenswrapper[4939]: else Mar 18 16:04:22 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:22 crc kubenswrapper[4939]: fi Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:22 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:22 crc kubenswrapper[4939]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:22 crc kubenswrapper[4939]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:22 crc kubenswrapper[4939]: # support updates Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.103218 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-409c-account-create-update-m2tw5" podUID="77b48752-eea3-4627-8da2-737f8bd7b36a" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.107108 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rqsf2"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.127663 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content\") pod \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.127855 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:04:22 crc kubenswrapper[4939]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: if [ -n "" ]; then Mar 18 16:04:22 crc kubenswrapper[4939]: GRANT_DATABASE="" Mar 18 16:04:22 crc kubenswrapper[4939]: else Mar 18 16:04:22 crc kubenswrapper[4939]: GRANT_DATABASE="*" Mar 18 16:04:22 crc kubenswrapper[4939]: fi Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: # going for maximum compatibility here: Mar 18 16:04:22 crc kubenswrapper[4939]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 16:04:22 crc kubenswrapper[4939]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 16:04:22 crc kubenswrapper[4939]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 16:04:22 crc kubenswrapper[4939]: # support updates Mar 18 16:04:22 crc kubenswrapper[4939]: Mar 18 16:04:22 crc kubenswrapper[4939]: $MYSQL_CMD < logger="UnhandledError" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.127925 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rqsf2"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.127950 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc6dv\" (UniqueName: \"kubernetes.io/projected/cefea5f4-f44f-4e73-b762-38adc00ce1eb-kube-api-access-bc6dv\") pod \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.127991 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-utilities\") pod \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.128532 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts\") pod \"root-account-create-update-2dr5q\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.129682 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgg4\" (UniqueName: \"kubernetes.io/projected/8269f4a0-d0d4-4620-9c3e-885d453b7109-kube-api-access-8dgg4\") pod \"root-account-create-update-2dr5q\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.130583 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-ljqrx" podUID="f7a5f60f-451f-45ca-ad9d-62dc13bccf66" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.130857 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-utilities" (OuterVolumeSpecName: "utilities") pod "cefea5f4-f44f-4e73-b762-38adc00ce1eb" (UID: "cefea5f4-f44f-4e73-b762-38adc00ce1eb"). InnerVolumeSpecName "utilities". 
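
The CreateContainerConfigError events above all trace back to Secrets that do not exist yet (nova-cell0-db-secret, nova-api-db-secret, openstack-cell1-mariadb-root-db-secret), so the kubelet cannot render the container environment and skips the pod sync. A sketch for checking which of them have been created, under the same assumptions as the earlier snippet:

    #!/bin/bash
    # Secret names are taken from the CreateContainerConfigError messages above.
    for s in nova-cell0-db-secret nova-api-db-secret openstack-cell1-mariadb-root-db-secret; do
        oc -n openstack get secret "$s" >/dev/null 2>&1 \
            && echo "secret/$s: present" \
            || echo "secret/$s: missing"
    done
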
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.131023 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts\") pod \"root-account-create-update-2dr5q\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.177474 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgg4\" (UniqueName: \"kubernetes.io/projected/8269f4a0-d0d4-4620-9c3e-885d453b7109-kube-api-access-8dgg4\") pod \"root-account-create-update-2dr5q\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.181615 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefea5f4-f44f-4e73-b762-38adc00ce1eb-kube-api-access-bc6dv" (OuterVolumeSpecName: "kube-api-access-bc6dv") pod "cefea5f4-f44f-4e73-b762-38adc00ce1eb" (UID: "cefea5f4-f44f-4e73-b762-38adc00ce1eb"). InnerVolumeSpecName "kube-api-access-bc6dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.193058 4939 scope.go:117] "RemoveContainer" containerID="9f0915b2d44468cfeaa689e644757be5cca3a62f3db9243ad5810dad5952f44d" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.208478 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.225171 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03137eb0-6a57-4dc2-91aa-e7af80abbd22" path="/var/lib/kubelet/pods/03137eb0-6a57-4dc2-91aa-e7af80abbd22/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.226799 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08510a7a-ad57-44a4-9089-7558c213284b" path="/var/lib/kubelet/pods/08510a7a-ad57-44a4-9089-7558c213284b/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.227588 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa357a6-3028-4413-b384-0cbf6488f7ef" path="/var/lib/kubelet/pods/0aa357a6-3028-4413-b384-0cbf6488f7ef/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.228961 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181cba20-17ba-4fdd-9843-e452e9e2cce9" path="/var/lib/kubelet/pods/181cba20-17ba-4fdd-9843-e452e9e2cce9/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.229613 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23191f5f-fe02-4b74-ab9c-95b03d308980" path="/var/lib/kubelet/pods/23191f5f-fe02-4b74-ab9c-95b03d308980/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.230187 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2512655d-ad91-4302-9136-15c7ff20e928" path="/var/lib/kubelet/pods/2512655d-ad91-4302-9136-15c7ff20e928/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.231347 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cefea5f4-f44f-4e73-b762-38adc00ce1eb" (UID: "cefea5f4-f44f-4e73-b762-38adc00ce1eb"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.231458 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content\") pod \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\" (UID: \"cefea5f4-f44f-4e73-b762-38adc00ce1eb\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.231575 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-config-data\") pod \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.231639 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-combined-ca-bundle\") pod \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.231699 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-vencrypt-tls-certs\") pod \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.231823 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zwz7\" (UniqueName: \"kubernetes.io/projected/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-kube-api-access-6zwz7\") pod \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.231861 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-nova-novncproxy-tls-certs\") pod \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\" (UID: \"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666\") " Mar 18 16:04:22 crc kubenswrapper[4939]: W0318 16:04:22.232609 4939 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cefea5f4-f44f-4e73-b762-38adc00ce1eb/volumes/kubernetes.io~empty-dir/catalog-content Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.232636 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cefea5f4-f44f-4e73-b762-38adc00ce1eb" (UID: "cefea5f4-f44f-4e73-b762-38adc00ce1eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.234833 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d349c7-2307-47dd-a696-adfdfba42e1e" path="/var/lib/kubelet/pods/71d349c7-2307-47dd-a696-adfdfba42e1e/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.242077 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e964ed3-1c22-4d0b-b6eb-45df177b2f33" path="/var/lib/kubelet/pods/8e964ed3-1c22-4d0b-b6eb-45df177b2f33/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.255546 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac644e3b-085b-406a-965c-6c68407003e5" path="/var/lib/kubelet/pods/ac644e3b-085b-406a-965c-6c68407003e5/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.256103 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be388cde-0dc7-4b42-a62c-f790b70391c6" path="/var/lib/kubelet/pods/be388cde-0dc7-4b42-a62c-f790b70391c6/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.256773 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66d88bf-9a85-4958-a731-258e55b7ae99" path="/var/lib/kubelet/pods/d66d88bf-9a85-4958-a731-258e55b7ae99/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.270237 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc6dv\" (UniqueName: \"kubernetes.io/projected/cefea5f4-f44f-4e73-b762-38adc00ce1eb-kube-api-access-bc6dv\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.270271 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.270290 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefea5f4-f44f-4e73-b762-38adc00ce1eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.277892 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-kube-api-access-6zwz7" (OuterVolumeSpecName: "kube-api-access-6zwz7") pod "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" (UID: "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666"). InnerVolumeSpecName "kube-api-access-6zwz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.284012 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd94833e-ea91-4e35-9d05-e0cdf969b281" path="/var/lib/kubelet/pods/dd94833e-ea91-4e35-9d05-e0cdf969b281/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.284848 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de888787-cfa6-46be-bb5b-a45e06eddb1b" path="/var/lib/kubelet/pods/de888787-cfa6-46be-bb5b-a45e06eddb1b/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.285356 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfc0fae-1947-4c07-9be5-6ce1b49d0d15" path="/var/lib/kubelet/pods/ecfc0fae-1947-4c07-9be5-6ce1b49d0d15/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.286028 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0eb0f89-9573-4e53-a22c-16b8cd80140a" path="/var/lib/kubelet/pods/f0eb0f89-9573-4e53-a22c-16b8cd80140a/volumes" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.294757 4939 scope.go:117] "RemoveContainer" containerID="8c56802c6a7daa3601b078a23b4cc0855237da751647d5e5c060ea419f09f0f9" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.300173 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.324133 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" (UID: "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.367401 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-config-data" (OuterVolumeSpecName: "config-data") pod "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" (UID: "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.371648 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-config-data\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.371958 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-log-httpd\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.372000 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-public-tls-certs\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.372029 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-etc-swift\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.372071 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-internal-tls-certs\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.372192 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-run-httpd\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.372225 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-combined-ca-bundle\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.372257 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-kube-api-access-682gl\") pod \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\" (UID: \"8de1bfe9-c6f0-46c0-bd41-318b139b0f41\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.376575 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zwz7\" (UniqueName: \"kubernetes.io/projected/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-kube-api-access-6zwz7\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.376620 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.376638 4939 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.376744 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.376805 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts podName:f7a5f60f-451f-45ca-ad9d-62dc13bccf66 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:24.376784999 +0000 UTC m=+1628.975972800 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts") pod "root-account-create-update-ljqrx" (UID: "f7a5f60f-451f-45ca-ad9d-62dc13bccf66") : configmap "openstack-cell1-scripts" not found Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.386669 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.390304 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.390719 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.403884 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-prjpr"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.406266 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.412780 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-prjpr"] Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.423769 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-kube-api-access-682gl" (OuterVolumeSpecName: "kube-api-access-682gl") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "kube-api-access-682gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.426977 4939 scope.go:117] "RemoveContainer" containerID="4aa3a3cedd5d1ba6985398a8cde097556bb47737cbb52dca79f049e1751ffcc0" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.489039 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-operator-scripts\") pod \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\" (UID: \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.489343 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trd4x\" (UniqueName: \"kubernetes.io/projected/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-kube-api-access-trd4x\") pod \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\" (UID: \"8773b5a1-c7b3-40d3-b565-3f833db4e7ef\") " Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.490624 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.490645 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-kube-api-access-682gl\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.490659 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.490675 4939 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.495976 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" (UID: "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.501264 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8773b5a1-c7b3-40d3-b565-3f833db4e7ef" (UID: "8773b5a1-c7b3-40d3-b565-3f833db4e7ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.524732 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-kube-api-access-trd4x" (OuterVolumeSpecName: "kube-api-access-trd4x") pod "8773b5a1-c7b3-40d3-b565-3f833db4e7ef" (UID: "8773b5a1-c7b3-40d3-b565-3f833db4e7ef"). InnerVolumeSpecName "kube-api-access-trd4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.527345 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.583041 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.592923 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trd4x\" (UniqueName: \"kubernetes.io/projected/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-kube-api-access-trd4x\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.592959 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.592970 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8773b5a1-c7b3-40d3-b565-3f833db4e7ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.592986 4939 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.593008 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.601632 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-config-data" (OuterVolumeSpecName: "config-data") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.620274 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" (UID: "dccb7f64-b0e8-4fc1-b1d7-1a24eec17666"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.620921 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8de1bfe9-c6f0-46c0-bd41-318b139b0f41" (UID: "8de1bfe9-c6f0-46c0-bd41-318b139b0f41"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.694047 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8 is running failed: container process not found" containerID="7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.694643 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8 is running failed: container process not found" containerID="7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.694915 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8 is running failed: container process not found" containerID="7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.694939 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" containerName="nova-cell0-conductor-conductor" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.696438 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkr5\" (UniqueName: \"kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.696589 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts\") pod \"nova-cell1-f9fa-account-create-update-h5sth\" (UID: \"6884ec47-c51b-49dd-8b73-593328a782fe\") " pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.696779 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.696796 4939 
reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.696806 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de1bfe9-c6f0-46c0-bd41-318b139b0f41-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.696884 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.696938 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:26.696906325 +0000 UTC m=+1631.296093946 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : configmap "openstack-cell1-scripts" not found Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.710191 4939 projected.go:194] Error preparing data for projected volume kube-api-access-2lkr5 for pod openstack/nova-cell1-f9fa-account-create-update-h5sth: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 16:04:22 crc kubenswrapper[4939]: E0318 16:04:22.710246 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5 podName:6884ec47-c51b-49dd-8b73-593328a782fe nodeName:}" failed. No retries permitted until 2026-03-18 16:04:26.71023083 +0000 UTC m=+1631.309418451 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2lkr5" (UniqueName: "kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5") pod "nova-cell1-f9fa-account-create-update-h5sth" (UID: "6884ec47-c51b-49dd-8b73-593328a782fe") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.788818 4939 scope.go:117] "RemoveContainer" containerID="d3c3b598ec28d293c825f1d0d8a6e7d069aaa425447f5ae732747e50bc76e8c6" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.849163 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86d4-account-create-update-795z4" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.854408 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.885205 4939 scope.go:117] "RemoveContainer" containerID="3e958f60a62d00010cc8b06118e35abed1735c3a72b9d800113fbbaacdb1ee62" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.909628 4939 scope.go:117] "RemoveContainer" containerID="006bb20fbcdd84b51be7b526c598c7ecd52ed39ad5a155207701e52879954474" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.950665 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:22 crc kubenswrapper[4939]: I0318 16:04:22.951761 4939 scope.go:117] "RemoveContainer" containerID="f1400e5882a916afb311d9b971d5a0db2f34c1f124fd886dfbc59239d21493f5" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.002791 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73635e3b-60a5-46e9-bae0-caf61d8c9e74-operator-scripts\") pod \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.002875 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-generated\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.002955 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-galera-tls-certs\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.003009 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-operator-scripts\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.003081 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpbb4\" (UniqueName: \"kubernetes.io/projected/73635e3b-60a5-46e9-bae0-caf61d8c9e74-kube-api-access-hpbb4\") pod \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\" (UID: \"73635e3b-60a5-46e9-bae0-caf61d8c9e74\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.003104 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqtfg\" (UniqueName: \"kubernetes.io/projected/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kube-api-access-nqtfg\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.003126 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-default\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.003155 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.003230 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-combined-ca-bundle\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.003255 4939 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kolla-config\") pod \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\" (UID: \"2e6bea5c-5909-40a3-8b9d-d3072855f3da\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.004564 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.006256 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73635e3b-60a5-46e9-bae0-caf61d8c9e74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73635e3b-60a5-46e9-bae0-caf61d8c9e74" (UID: "73635e3b-60a5-46e9-bae0-caf61d8c9e74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.006727 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.006882 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.010240 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.012691 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73635e3b-60a5-46e9-bae0-caf61d8c9e74-kube-api-access-hpbb4" (OuterVolumeSpecName: "kube-api-access-hpbb4") pod "73635e3b-60a5-46e9-bae0-caf61d8c9e74" (UID: "73635e3b-60a5-46e9-bae0-caf61d8c9e74"). InnerVolumeSpecName "kube-api-access-hpbb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.021864 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kube-api-access-nqtfg" (OuterVolumeSpecName: "kube-api-access-nqtfg") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "kube-api-access-nqtfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.025180 4939 scope.go:117] "RemoveContainer" containerID="91ef614c09ef3a14bffbbfbb3efbb00ccbda43fbf6cdd41c6ff512bbb4203e93" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.033295 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.048660 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.064846 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2e6bea5c-5909-40a3-8b9d-d3072855f3da" (UID: "2e6bea5c-5909-40a3-8b9d-d3072855f3da"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.084487 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dccb7f64-b0e8-4fc1-b1d7-1a24eec17666","Type":"ContainerDied","Data":"26b1b68141ee4115efbefc73f870731415408c53354fcc9d891efecb5768be41"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.084550 4939 scope.go:117] "RemoveContainer" containerID="fd953f11b183a67f290c13bd5195a8236a5c036552d0d61a21e2836c0d47adfb" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.084658 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.097693 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ac46-account-create-update-fqfvp" event={"ID":"8773b5a1-c7b3-40d3-b565-3f833db4e7ef","Type":"ContainerDied","Data":"a835d6e65c39d11fbe97358798315911e62607ee8da819c6865fce3ff079dd96"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.098065 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ac46-account-create-update-fqfvp" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.105243 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/409450a4-dba2-432c-8c4b-0cc14057937d-operator-scripts\") pod \"409450a4-dba2-432c-8c4b-0cc14057937d\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.105293 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg4z5\" (UniqueName: \"kubernetes.io/projected/409450a4-dba2-432c-8c4b-0cc14057937d-kube-api-access-sg4z5\") pod \"409450a4-dba2-432c-8c4b-0cc14057937d\" (UID: \"409450a4-dba2-432c-8c4b-0cc14057937d\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106055 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106078 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpbb4\" (UniqueName: \"kubernetes.io/projected/73635e3b-60a5-46e9-bae0-caf61d8c9e74-kube-api-access-hpbb4\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106089 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqtfg\" (UniqueName: \"kubernetes.io/projected/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kube-api-access-nqtfg\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106111 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106130 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106138 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106149 4939 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e6bea5c-5909-40a3-8b9d-d3072855f3da-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106158 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73635e3b-60a5-46e9-bae0-caf61d8c9e74-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106166 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e6bea5c-5909-40a3-8b9d-d3072855f3da-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.106175 4939 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bea5c-5909-40a3-8b9d-d3072855f3da-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc 
kubenswrapper[4939]: I0318 16:04:23.106179 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/409450a4-dba2-432c-8c4b-0cc14057937d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "409450a4-dba2-432c-8c4b-0cc14057937d" (UID: "409450a4-dba2-432c-8c4b-0cc14057937d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.112995 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bb7666d55-9qg76" event={"ID":"8de1bfe9-c6f0-46c0-bd41-318b139b0f41","Type":"ContainerDied","Data":"0be59d410e3072fb69049312b96d052f189a9925b1b54535e62738cf2f5cc173"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.113147 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bb7666d55-9qg76" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.120891 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2dr5q"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.124774 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86d4-account-create-update-795z4" event={"ID":"73635e3b-60a5-46e9-bae0-caf61d8c9e74","Type":"ContainerDied","Data":"35dd2998a57661a5923f905a899244b4b1239d1bb7b049678b85de40efe84ac0"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.124825 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86d4-account-create-update-795z4" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.138822 4939 scope.go:117] "RemoveContainer" containerID="8c27c54d024af7e070c51b4a2e852614526988b034a22bcdb0519cc69a2109e2" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.140862 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/409450a4-dba2-432c-8c4b-0cc14057937d-kube-api-access-sg4z5" (OuterVolumeSpecName: "kube-api-access-sg4z5") pod "409450a4-dba2-432c-8c4b-0cc14057937d" (UID: "409450a4-dba2-432c-8c4b-0cc14057937d"). InnerVolumeSpecName "kube-api-access-sg4z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.145024 4939 generic.go:334] "Generic (PLEG): container finished" podID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerID="58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29" exitCode=0 Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.145089 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2e6bea5c-5909-40a3-8b9d-d3072855f3da","Type":"ContainerDied","Data":"58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.145115 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2e6bea5c-5909-40a3-8b9d-d3072855f3da","Type":"ContainerDied","Data":"74e9d84fe008f61823cf0755b4ac84a2475bf7351e7b4c5c02867f22a7b133d4"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.145181 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.149567 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-aa27-account-create-update-m8m54" event={"ID":"409450a4-dba2-432c-8c4b-0cc14057937d","Type":"ContainerDied","Data":"c2a9113061bb2638edde5ab0d85837eba0118f4118fdc4c195c2715ef044bbe5"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.149658 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-aa27-account-create-update-m8m54" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.167365 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.173947 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.182767 4939 scope.go:117] "RemoveContainer" containerID="c683cbe216d5c9119b57b38f335e3411920f22a528bb4f3e011449cbc759d2ac" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.187576 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.210649 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/409450a4-dba2-432c-8c4b-0cc14057937d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.210673 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg4z5\" (UniqueName: \"kubernetes.io/projected/409450a4-dba2-432c-8c4b-0cc14057937d-kube-api-access-sg4z5\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.210686 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.222820 4939 generic.go:334] "Generic (PLEG): container finished" podID="bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" containerID="7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8" exitCode=0 Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.222919 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f9fa-account-create-update-h5sth" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.223746 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09","Type":"ContainerDied","Data":"7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8"} Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.266694 4939 scope.go:117] "RemoveContainer" containerID="58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.291945 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ac46-account-create-update-fqfvp"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.293418 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.309965 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ac46-account-create-update-fqfvp"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.343571 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7bb7666d55-9qg76"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.356026 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7bb7666d55-9qg76"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.409677 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86d4-account-create-update-795z4"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.426318 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lqpm\" (UniqueName: \"kubernetes.io/projected/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-kube-api-access-4lqpm\") pod \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.426370 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-config-data\") pod \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.426548 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-combined-ca-bundle\") pod \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\" (UID: \"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09\") " Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.453140 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-86d4-account-create-update-795z4"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.484475 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.488367 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-kube-api-access-4lqpm" (OuterVolumeSpecName: "kube-api-access-4lqpm") pod "bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" (UID: "bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09"). InnerVolumeSpecName "kube-api-access-4lqpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.489445 4939 scope.go:117] "RemoveContainer" containerID="1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.512898 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" (UID: "bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.527182 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-config-data" (OuterVolumeSpecName: "config-data") pod "bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" (UID: "bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.537704 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.537862 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.538978 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lqpm\" (UniqueName: \"kubernetes.io/projected/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-kube-api-access-4lqpm\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.538998 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.539008 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.548396 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.559370 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.559517 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.593963 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.594026 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd" Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.594242 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.594264 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.615046 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.188:9292/healthcheck\": read tcp 10.217.0.2:44578->10.217.0.188:9292: read: connection reset by peer" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.615097 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.188:9292/healthcheck\": read tcp 10.217.0.2:44592->10.217.0.188:9292: read: connection reset by peer" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.642143 4939 scope.go:117] "RemoveContainer" containerID="58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29" Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.646491 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29\": container with ID starting with 58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29 not found: ID does not exist" containerID="58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.646549 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29"} err="failed to get container status \"58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29\": rpc error: code = NotFound desc = could not find container \"58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29\": container with ID starting with 58463fad3615e12343dcc7e5a5625f9d59c08857a674dc45421449b68fc82a29 not found: ID does not exist" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.646579 4939 scope.go:117] "RemoveContainer" 
containerID="1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa" Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.652705 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa\": container with ID starting with 1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa not found: ID does not exist" containerID="1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.652751 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa"} err="failed to get container status \"1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa\": rpc error: code = NotFound desc = could not find container \"1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa\": container with ID starting with 1ab6e02bac4fcac560e39ddb02f563eaa4325287ca8c5c39409e19a15f1fe3fa not found: ID does not exist" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.688649 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.688690 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.710633 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.710926 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-central-agent" containerID="cri-o://8f7b8af072015fc83aa897993785d0ba6ece0ebbc142762b52dcd98324367fa6" gracePeriod=30 Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.711316 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="proxy-httpd" containerID="cri-o://f3079f7fe5149bacaed20469d6bc740ba824c2bdc67fa400c6cfad0cecdc52c1" gracePeriod=30 Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.711366 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="sg-core" containerID="cri-o://bcaf1ad5a30e3e16337fdbd45932541ca349149870b4da32747001a8591240e1" gracePeriod=30 Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.711394 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-notification-agent" containerID="cri-o://cefcf8484139882612cbf2acdccd98b003d26c82bc17372d2b00824a4b7ba550" gracePeriod=30 Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.714631 4939 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-mp2sj" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" probeResult="failure" output=< Mar 18 16:04:23 crc kubenswrapper[4939]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 18 16:04:23 crc kubenswrapper[4939]: > Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.752491 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 16:04:23 crc kubenswrapper[4939]: E0318 16:04:23.752613 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data podName:26f60b5c-7d32-4fea-b3ca-a8132f3ed026 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:27.752590105 +0000 UTC m=+1632.351777726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data") pod "rabbitmq-server-0" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026") : configmap "rabbitmq-config-data" not found Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.790837 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-h5sth"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.814755 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f9fa-account-create-update-h5sth"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.860286 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.183:8776/healthcheck\": read tcp 10.217.0.2:44960->10.217.0.183:8776: read: connection reset by peer" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.894587 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": read tcp 10.217.0.2:36312->10.217.0.187:9292: read: connection reset by peer" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.894705 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": read tcp 10.217.0.2:36314->10.217.0.187:9292: read: connection reset by peer" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.898812 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.899199 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="51b567b0-935e-46f6-8cf7-3c8a9040bad4" containerName="kube-state-metrics" containerID="cri-o://286772eca49783d61fb3dec6079a9246af7f326c26fda03b975b7c1244c77555" gracePeriod=30 Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.932750 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-aa27-account-create-update-m8m54"] Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.948459 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-aa27-account-create-update-m8m54"] Mar 18 16:04:23 crc 
kubenswrapper[4939]: I0318 16:04:23.966329 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lkr5\" (UniqueName: \"kubernetes.io/projected/6884ec47-c51b-49dd-8b73-593328a782fe-kube-api-access-2lkr5\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:23 crc kubenswrapper[4939]: I0318 16:04:23.966360 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6884ec47-c51b-49dd-8b73-593328a782fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.007193 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fdf2-account-create-update-5wwtx"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.033394 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.033700 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="658fa1ab-1e7e-42d2-947e-6c74215e15f0" containerName="memcached" containerID="cri-o://7ef584cd49ae7f589ecbcd4e4179803996e249f944e8b538430057de2525fe26" gracePeriod=30 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.056684 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fdf2-account-create-update-5wwtx"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.075878 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-74465b498-l8mz2" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:48222->10.217.0.167:9311: read: connection reset by peer" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.076209 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-74465b498-l8mz2" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:48210->10.217.0.167:9311: read: connection reset by peer" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079116 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fdf2-account-create-update-g58kg"] Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079597 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079613 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079628 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" containerName="nova-cell0-conductor-conductor" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079635 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" containerName="nova-cell0-conductor-conductor" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079652 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerName="galera" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079658 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerName="galera" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079671 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="extract-utilities" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079677 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="extract-utilities" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079692 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-server" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079699 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-server" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079709 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerName="mysql-bootstrap" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079715 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerName="mysql-bootstrap" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079724 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="extract-content" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079730 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="extract-content" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079747 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-httpd" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079752 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-httpd" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.079763 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="registry-server" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079769 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="registry-server" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079939 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-httpd" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079952 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" containerName="galera" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079965 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079984 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" containerName="registry-server" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.079992 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-server" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.080000 4939 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" containerName="nova-cell0-conductor-conductor" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.080599 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.082189 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.115913 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vw2p8"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.128099 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vw2p8"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.169044 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.169190 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjqp\" (UniqueName: \"kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.175862 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6bea5c-5909-40a3-8b9d-d3072855f3da" path="/var/lib/kubelet/pods/2e6bea5c-5909-40a3-8b9d-d3072855f3da/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.178702 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3247c802-2337-43e1-b292-56c7f5c520c2" path="/var/lib/kubelet/pods/3247c802-2337-43e1-b292-56c7f5c520c2/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.179406 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="409450a4-dba2-432c-8c4b-0cc14057937d" path="/var/lib/kubelet/pods/409450a4-dba2-432c-8c4b-0cc14057937d/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.180823 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="635a2d97-8489-4a21-87ef-a30663aa441e" path="/var/lib/kubelet/pods/635a2d97-8489-4a21-87ef-a30663aa441e/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.181372 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6884ec47-c51b-49dd-8b73-593328a782fe" path="/var/lib/kubelet/pods/6884ec47-c51b-49dd-8b73-593328a782fe/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.182447 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73635e3b-60a5-46e9-bae0-caf61d8c9e74" path="/var/lib/kubelet/pods/73635e3b-60a5-46e9-bae0-caf61d8c9e74/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.182854 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8773b5a1-c7b3-40d3-b565-3f833db4e7ef" path="/var/lib/kubelet/pods/8773b5a1-c7b3-40d3-b565-3f833db4e7ef/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.183237 4939 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" path="/var/lib/kubelet/pods/8de1bfe9-c6f0-46c0-bd41-318b139b0f41/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.197174 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefea5f4-f44f-4e73-b762-38adc00ce1eb" path="/var/lib/kubelet/pods/cefea5f4-f44f-4e73-b762-38adc00ce1eb/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.198099 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dccb7f64-b0e8-4fc1-b1d7-1a24eec17666" path="/var/lib/kubelet/pods/dccb7f64-b0e8-4fc1-b1d7-1a24eec17666/volumes" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.198951 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fdf2-account-create-update-g58kg"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.198978 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8brg8"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.207360 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-cron-29564161-466kv"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.250017 4939 generic.go:334] "Generic (PLEG): container finished" podID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerID="f026dc909ac22d1539817ec6b0e34f51819c81cb961582e5bb0f18556dfcbf46" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.250089 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb92b15e-a854-4505-97e2-37e4a7b821b4","Type":"ContainerDied","Data":"f026dc909ac22d1539817ec6b0e34f51819c81cb961582e5bb0f18556dfcbf46"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.255030 4939 generic.go:334] "Generic (PLEG): container finished" podID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerID="c14895b390882b2c1923b6999a196557393e0c01f808b080026c9ef7f7e1d1fd" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.255103 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74465b498-l8mz2" event={"ID":"42a70df8-1617-448d-9495-5aa55d8b97fb","Type":"ContainerDied","Data":"c14895b390882b2c1923b6999a196557393e0c01f808b080026c9ef7f7e1d1fd"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.271864 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8brg8"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.271961 4939 generic.go:334] "Generic (PLEG): container finished" podID="50baf265-a6d8-445d-aed6-853781644d9e" containerID="fe8d4154643530aaa76761b4600cb2e025a90e8cdab5159d3a196d0d9c425f73" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.272060 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50baf265-a6d8-445d-aed6-853781644d9e","Type":"ContainerDied","Data":"fe8d4154643530aaa76761b4600cb2e025a90e8cdab5159d3a196d0d9c425f73"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.273894 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.274294 4939 configmap.go:193] Couldn't get configMap 
openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.275363 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjqp\" (UniqueName: \"kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.276118 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts podName:c0377ebe-7993-4c09-aaa0-908628a881c4 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:24.776097079 +0000 UTC m=+1629.375284700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts") pod "keystone-fdf2-account-create-update-g58kg" (UID: "c0377ebe-7993-4c09-aaa0-908628a881c4") : configmap "openstack-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.278224 4939 projected.go:194] Error preparing data for projected volume kube-api-access-gfjqp for pod openstack/keystone-fdf2-account-create-update-g58kg: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.278274 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp podName:c0377ebe-7993-4c09-aaa0-908628a881c4 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:24.778260639 +0000 UTC m=+1629.377448260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gfjqp" (UniqueName: "kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp") pod "keystone-fdf2-account-create-update-g58kg" (UID: "c0377ebe-7993-4c09-aaa0-908628a881c4") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.278964 4939 generic.go:334] "Generic (PLEG): container finished" podID="51b567b0-935e-46f6-8cf7-3c8a9040bad4" containerID="286772eca49783d61fb3dec6079a9246af7f326c26fda03b975b7c1244c77555" exitCode=2 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.279051 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51b567b0-935e-46f6-8cf7-3c8a9040bad4","Type":"ContainerDied","Data":"286772eca49783d61fb3dec6079a9246af7f326c26fda03b975b7c1244c77555"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.286851 4939 generic.go:334] "Generic (PLEG): container finished" podID="df7cba1f-8d56-47c9-8016-3184a1374386" containerID="ee0d7673467ab34937d02099e23c0b13b6599f673de277c77448b47b6c8d53d7" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.287107 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-576956754b-kspq2" event={"ID":"df7cba1f-8d56-47c9-8016-3184a1374386","Type":"ContainerDied","Data":"ee0d7673467ab34937d02099e23c0b13b6599f673de277c77448b47b6c8d53d7"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.290670 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.291355 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09","Type":"ContainerDied","Data":"2061dbed6e2b7aee40aa92639815309420d823bc42cfca582e1ecebebd707861"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.291406 4939 scope.go:117] "RemoveContainer" containerID="7e85fcda993466abefcf0bfc67ff33a61bec9760e923f7c8c0a74450b7858ed8" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.327236 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56c996c794-vrkm4"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.327614 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-56c996c794-vrkm4" podUID="878180f2-988b-4d66-aaf0-3429900f5e77" containerName="keystone-api" containerID="cri-o://add0a3162f0fb4bc567ed074ad66216e38747696a6b5e808eb5932b5e0024e79" gracePeriod=30 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.344958 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-cron-29564161-466kv"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.345473 4939 generic.go:334] "Generic (PLEG): container finished" podID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerID="a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4" exitCode=1 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.346157 4939 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-2dr5q" secret="" err="secret \"galera-openstack-dockercfg-6942g\" not found" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.346557 4939 scope.go:117] "RemoveContainer" containerID="a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.348234 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dr5q" event={"ID":"8269f4a0-d0d4-4620-9c3e-885d453b7109","Type":"ContainerDied","Data":"a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.348299 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dr5q" event={"ID":"8269f4a0-d0d4-4620-9c3e-885d453b7109","Type":"ContainerStarted","Data":"42d22f84975155ae48997983b773b266b1c39289bc54f3c2d88bba19fa1064ce"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.348255 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-576956754b-kspq2" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.377475 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.383248 4939 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.383339 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts podName:f7a5f60f-451f-45ca-ad9d-62dc13bccf66 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:28.383315262 +0000 UTC m=+1632.982502883 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts") pod "root-account-create-update-ljqrx" (UID: "f7a5f60f-451f-45ca-ad9d-62dc13bccf66") : configmap "openstack-cell1-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.397114 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ljqrx" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.407352 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fdf2-account-create-update-g58kg"] Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.407736 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-gfjqp operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-fdf2-account-create-update-g58kg" podUID="c0377ebe-7993-4c09-aaa0-908628a881c4" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.417353 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-f7sh4"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.426640 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-f7sh4"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.443220 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2dr5q"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.451875 4939 generic.go:334] "Generic (PLEG): container finished" podID="a2d02491-90d4-41b4-884d-0959feb366b0" containerID="141032193e0acc0c48dcf5467fc8003161e9b27e1dbd08d3df9c676f4ba3145c" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.451952 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2d02491-90d4-41b4-884d-0959feb366b0","Type":"ContainerDied","Data":"141032193e0acc0c48dcf5467fc8003161e9b27e1dbd08d3df9c676f4ba3145c"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.470162 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-409c-account-create-update-m2tw5" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.473821 4939 generic.go:334] "Generic (PLEG): container finished" podID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerID="a06a089fed14337403eec84f45e3b078fba61e58eeaa6ebe0c2a5e8ebeea031d" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.473911 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e71cb7a9-1ab5-4596-901f-314dcfae2bc4","Type":"ContainerDied","Data":"a06a089fed14337403eec84f45e3b078fba61e58eeaa6ebe0c2a5e8ebeea031d"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.478299 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479003 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-config-data\") pod \"df7cba1f-8d56-47c9-8016-3184a1374386\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479060 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts\") pod \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479146 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-public-tls-certs\") pod \"df7cba1f-8d56-47c9-8016-3184a1374386\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479230 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmb9s\" (UniqueName: \"kubernetes.io/projected/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-kube-api-access-mmb9s\") pod \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\" (UID: \"f7a5f60f-451f-45ca-ad9d-62dc13bccf66\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479271 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7cba1f-8d56-47c9-8016-3184a1374386-logs\") pod \"df7cba1f-8d56-47c9-8016-3184a1374386\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479344 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-internal-tls-certs\") pod \"df7cba1f-8d56-47c9-8016-3184a1374386\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479388 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n48zb\" (UniqueName: \"kubernetes.io/projected/df7cba1f-8d56-47c9-8016-3184a1374386-kube-api-access-n48zb\") pod \"df7cba1f-8d56-47c9-8016-3184a1374386\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479471 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-combined-ca-bundle\") pod 
\"df7cba1f-8d56-47c9-8016-3184a1374386\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.479626 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-scripts\") pod \"df7cba1f-8d56-47c9-8016-3184a1374386\" (UID: \"df7cba1f-8d56-47c9-8016-3184a1374386\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.496656 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.497255 4939 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.497323 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts podName:8269f4a0-d0d4-4620-9c3e-885d453b7109 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:24.997306006 +0000 UTC m=+1629.596493627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts") pod "root-account-create-update-2dr5q" (UID: "8269f4a0-d0d4-4620-9c3e-885d453b7109") : configmap "openstack-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.500056 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7cba1f-8d56-47c9-8016-3184a1374386-logs" (OuterVolumeSpecName: "logs") pod "df7cba1f-8d56-47c9-8016-3184a1374386" (UID: "df7cba1f-8d56-47c9-8016-3184a1374386"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.508832 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-kube-api-access-mmb9s" (OuterVolumeSpecName: "kube-api-access-mmb9s") pod "f7a5f60f-451f-45ca-ad9d-62dc13bccf66" (UID: "f7a5f60f-451f-45ca-ad9d-62dc13bccf66"). InnerVolumeSpecName "kube-api-access-mmb9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.509481 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7a5f60f-451f-45ca-ad9d-62dc13bccf66" (UID: "f7a5f60f-451f-45ca-ad9d-62dc13bccf66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.531109 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7cba1f-8d56-47c9-8016-3184a1374386-kube-api-access-n48zb" (OuterVolumeSpecName: "kube-api-access-n48zb") pod "df7cba1f-8d56-47c9-8016-3184a1374386" (UID: "df7cba1f-8d56-47c9-8016-3184a1374386"). InnerVolumeSpecName "kube-api-access-n48zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.533888 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-scripts" (OuterVolumeSpecName: "scripts") pod "df7cba1f-8d56-47c9-8016-3184a1374386" (UID: "df7cba1f-8d56-47c9-8016-3184a1374386"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.577109 4939 generic.go:334] "Generic (PLEG): container finished" podID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerID="ce5cff00a735d1a1ef2cc4c115a9ac8265f9778ce6a2da6ac270a33f9cc7daf5" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.577173 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2","Type":"ContainerDied","Data":"ce5cff00a735d1a1ef2cc4c115a9ac8265f9778ce6a2da6ac270a33f9cc7daf5"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.582171 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b48752-eea3-4627-8da2-737f8bd7b36a-operator-scripts\") pod \"77b48752-eea3-4627-8da2-737f8bd7b36a\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.582492 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78tlr\" (UniqueName: \"kubernetes.io/projected/77b48752-eea3-4627-8da2-737f8bd7b36a-kube-api-access-78tlr\") pod \"77b48752-eea3-4627-8da2-737f8bd7b36a\" (UID: \"77b48752-eea3-4627-8da2-737f8bd7b36a\") " Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.582660 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77b48752-eea3-4627-8da2-737f8bd7b36a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77b48752-eea3-4627-8da2-737f8bd7b36a" (UID: "77b48752-eea3-4627-8da2-737f8bd7b36a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.583526 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b48752-eea3-4627-8da2-737f8bd7b36a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.584376 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.584476 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.584596 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmb9s\" (UniqueName: \"kubernetes.io/projected/f7a5f60f-451f-45ca-ad9d-62dc13bccf66-kube-api-access-mmb9s\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.584676 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7cba1f-8d56-47c9-8016-3184a1374386-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.584747 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n48zb\" (UniqueName: \"kubernetes.io/projected/df7cba1f-8d56-47c9-8016-3184a1374386-kube-api-access-n48zb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.586939 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/77b48752-eea3-4627-8da2-737f8bd7b36a-kube-api-access-78tlr" (OuterVolumeSpecName: "kube-api-access-78tlr") pod "77b48752-eea3-4627-8da2-737f8bd7b36a" (UID: "77b48752-eea3-4627-8da2-737f8bd7b36a"). InnerVolumeSpecName "kube-api-access-78tlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.587197 4939 generic.go:334] "Generic (PLEG): container finished" podID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerID="f3079f7fe5149bacaed20469d6bc740ba824c2bdc67fa400c6cfad0cecdc52c1" exitCode=0 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.587230 4939 generic.go:334] "Generic (PLEG): container finished" podID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerID="bcaf1ad5a30e3e16337fdbd45932541ca349149870b4da32747001a8591240e1" exitCode=2 Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.587257 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerDied","Data":"f3079f7fe5149bacaed20469d6bc740ba824c2bdc67fa400c6cfad0cecdc52c1"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.587287 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerDied","Data":"bcaf1ad5a30e3e16337fdbd45932541ca349149870b4da32747001a8591240e1"} Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.687665 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-config-data" (OuterVolumeSpecName: "config-data") pod "df7cba1f-8d56-47c9-8016-3184a1374386" (UID: "df7cba1f-8d56-47c9-8016-3184a1374386"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.688893 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78tlr\" (UniqueName: \"kubernetes.io/projected/77b48752-eea3-4627-8da2-737f8bd7b36a-kube-api-access-78tlr\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.688913 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.706791 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df7cba1f-8d56-47c9-8016-3184a1374386" (UID: "df7cba1f-8d56-47c9-8016-3184a1374386"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.763733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "df7cba1f-8d56-47c9-8016-3184a1374386" (UID: "df7cba1f-8d56-47c9-8016-3184a1374386"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.791647 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.791766 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjqp\" (UniqueName: \"kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.791923 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.791938 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.792349 4939 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.792395 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts podName:c0377ebe-7993-4c09-aaa0-908628a881c4 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:25.792378329 +0000 UTC m=+1630.391565950 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts") pod "keystone-fdf2-account-create-update-g58kg" (UID: "c0377ebe-7993-4c09-aaa0-908628a881c4") : configmap "openstack-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.803834 4939 projected.go:194] Error preparing data for projected volume kube-api-access-gfjqp for pod openstack/keystone-fdf2-account-create-update-g58kg: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.803908 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp podName:c0377ebe-7993-4c09-aaa0-908628a881c4 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:25.803889712 +0000 UTC m=+1630.403077333 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gfjqp" (UniqueName: "kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp") pod "keystone-fdf2-account-create-update-g58kg" (UID: "c0377ebe-7993-4c09-aaa0-908628a881c4") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.831254 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "df7cba1f-8d56-47c9-8016-3184a1374386" (UID: "df7cba1f-8d56-47c9-8016-3184a1374386"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.893829 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7cba1f-8d56-47c9-8016-3184a1374386-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.893923 4939 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.893968 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:32.893954984 +0000 UTC m=+1637.493142605 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-config-data" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.894268 4939 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: E0318 16:04:24.894294 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts podName:5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:32.894286183 +0000 UTC m=+1637.493473804 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts") pod "cinder-api-0" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2") : secret "cinder-scripts" not found Mar 18 16:04:24 crc kubenswrapper[4939]: I0318 16:04:24.940706 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="22b02f38-8ae3-4a43-8df6-370521328921" containerName="galera" containerID="cri-o://4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb" gracePeriod=30 Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.099068 4939 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.103807 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts podName:8269f4a0-d0d4-4620-9c3e-885d453b7109 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:26.10376395 +0000 UTC m=+1630.702951571 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts") pod "root-account-create-update-2dr5q" (UID: "8269f4a0-d0d4-4620-9c3e-885d453b7109") : configmap "openstack-scripts" not found Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.154721 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.244125 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.261223 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.265726 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.284875 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.299832 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.311065 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-logs\") pod \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.311165 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9zwm\" (UniqueName: \"kubernetes.io/projected/abc847c2-3903-44d4-aa4d-0a7e16709041-kube-api-access-m9zwm\") pod \"abc847c2-3903-44d4-aa4d-0a7e16709041\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.311196 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcz5b\" (UniqueName: \"kubernetes.io/projected/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-kube-api-access-gcz5b\") pod \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.311225 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-config-data\") pod \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.311254 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-nova-metadata-tls-certs\") pod \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.311284 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-combined-ca-bundle\") pod \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\" (UID: \"e71cb7a9-1ab5-4596-901f-314dcfae2bc4\") " Mar 18 16:04:25 crc 
kubenswrapper[4939]: I0318 16:04:25.311327 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abc847c2-3903-44d4-aa4d-0a7e16709041-operator-scripts\") pod \"abc847c2-3903-44d4-aa4d-0a7e16709041\" (UID: \"abc847c2-3903-44d4-aa4d-0a7e16709041\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.312933 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-logs" (OuterVolumeSpecName: "logs") pod "e71cb7a9-1ab5-4596-901f-314dcfae2bc4" (UID: "e71cb7a9-1ab5-4596-901f-314dcfae2bc4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.325325 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.326912 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.330910 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc847c2-3903-44d4-aa4d-0a7e16709041-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "abc847c2-3903-44d4-aa4d-0a7e16709041" (UID: "abc847c2-3903-44d4-aa4d-0a7e16709041"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.333426 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc847c2-3903-44d4-aa4d-0a7e16709041-kube-api-access-m9zwm" (OuterVolumeSpecName: "kube-api-access-m9zwm") pod "abc847c2-3903-44d4-aa4d-0a7e16709041" (UID: "abc847c2-3903-44d4-aa4d-0a7e16709041"). InnerVolumeSpecName "kube-api-access-m9zwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.333895 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-kube-api-access-gcz5b" (OuterVolumeSpecName: "kube-api-access-gcz5b") pod "e71cb7a9-1ab5-4596-901f-314dcfae2bc4" (UID: "e71cb7a9-1ab5-4596-901f-314dcfae2bc4"). InnerVolumeSpecName "kube-api-access-gcz5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.355845 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-config-data" (OuterVolumeSpecName: "config-data") pod "e71cb7a9-1ab5-4596-901f-314dcfae2bc4" (UID: "e71cb7a9-1ab5-4596-901f-314dcfae2bc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.358804 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e71cb7a9-1ab5-4596-901f-314dcfae2bc4" (UID: "e71cb7a9-1ab5-4596-901f-314dcfae2bc4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.373548 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e71cb7a9-1ab5-4596-901f-314dcfae2bc4" (UID: "e71cb7a9-1ab5-4596-901f-314dcfae2bc4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413296 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-internal-tls-certs\") pod \"a2d02491-90d4-41b4-884d-0959feb366b0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413355 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-internal-tls-certs\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413382 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data\") pod \"42a70df8-1617-448d-9495-5aa55d8b97fb\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413411 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data-custom\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413437 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-public-tls-certs\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413477 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-certs\") pod \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413523 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413558 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413599 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/42a70df8-1617-448d-9495-5aa55d8b97fb-logs\") pod \"42a70df8-1617-448d-9495-5aa55d8b97fb\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413634 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-scripts\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413683 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data-custom\") pod \"42a70df8-1617-448d-9495-5aa55d8b97fb\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413709 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-logs\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413732 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-combined-ca-bundle\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413771 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-config\") pod \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413798 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr7sq\" (UniqueName: \"kubernetes.io/projected/a2d02491-90d4-41b4-884d-0959feb366b0-kube-api-access-wr7sq\") pod \"a2d02491-90d4-41b4-884d-0959feb366b0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413824 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdp57\" (UniqueName: \"kubernetes.io/projected/50baf265-a6d8-445d-aed6-853781644d9e-kube-api-access-rdp57\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413846 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-scripts\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413872 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8df8\" (UniqueName: \"kubernetes.io/projected/cb92b15e-a854-4505-97e2-37e4a7b821b4-kube-api-access-f8df8\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413896 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a2d02491-90d4-41b4-884d-0959feb366b0-logs\") pod \"a2d02491-90d4-41b4-884d-0959feb366b0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413918 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-combined-ca-bundle\") pod \"a2d02491-90d4-41b4-884d-0959feb366b0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413943 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-456r8\" (UniqueName: \"kubernetes.io/projected/42a70df8-1617-448d-9495-5aa55d8b97fb-kube-api-access-456r8\") pod \"42a70df8-1617-448d-9495-5aa55d8b97fb\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.413975 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-combined-ca-bundle\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414002 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414023 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-etc-machine-id\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414055 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-httpd-run\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414085 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-combined-ca-bundle\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414114 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-logs\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414263 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-public-tls-certs\") pod \"a2d02491-90d4-41b4-884d-0959feb366b0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414292 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-config-data\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cb92b15e-a854-4505-97e2-37e4a7b821b4\" (UID: \"cb92b15e-a854-4505-97e2-37e4a7b821b4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414384 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-combined-ca-bundle\") pod \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414414 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89khq\" (UniqueName: \"kubernetes.io/projected/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-kube-api-access-89khq\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414441 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-public-tls-certs\") pod \"42a70df8-1617-448d-9495-5aa55d8b97fb\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414479 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-config-data\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414539 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-internal-tls-certs\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414621 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-httpd-run\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414696 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-combined-ca-bundle\") pod \"42a70df8-1617-448d-9495-5aa55d8b97fb\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414769 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-config-data\") pod \"a2d02491-90d4-41b4-884d-0959feb366b0\" (UID: \"a2d02491-90d4-41b4-884d-0959feb366b0\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414795 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-internal-tls-certs\") pod \"42a70df8-1617-448d-9495-5aa55d8b97fb\" (UID: \"42a70df8-1617-448d-9495-5aa55d8b97fb\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414860 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-public-tls-certs\") pod \"50baf265-a6d8-445d-aed6-853781644d9e\" (UID: \"50baf265-a6d8-445d-aed6-853781644d9e\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414892 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhhvh\" (UniqueName: \"kubernetes.io/projected/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-api-access-zhhvh\") pod \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\" (UID: \"51b567b0-935e-46f6-8cf7-3c8a9040bad4\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.414934 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-logs\") pod \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\" (UID: \"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2\") " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.415551 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.415571 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9zwm\" (UniqueName: \"kubernetes.io/projected/abc847c2-3903-44d4-aa4d-0a7e16709041-kube-api-access-m9zwm\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.415586 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcz5b\" (UniqueName: \"kubernetes.io/projected/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-kube-api-access-gcz5b\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.415632 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.415644 4939 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.415657 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71cb7a9-1ab5-4596-901f-314dcfae2bc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.415669 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abc847c2-3903-44d4-aa4d-0a7e16709041-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.416623 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-logs" (OuterVolumeSpecName: "logs") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.432777 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-scripts" (OuterVolumeSpecName: "scripts") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.434673 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.434717 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.435200 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.439493 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-logs" (OuterVolumeSpecName: "logs") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.442431 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-kube-api-access-89khq" (OuterVolumeSpecName: "kube-api-access-89khq") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "kube-api-access-89khq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.442535 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a70df8-1617-448d-9495-5aa55d8b97fb-kube-api-access-456r8" (OuterVolumeSpecName: "kube-api-access-456r8") pod "42a70df8-1617-448d-9495-5aa55d8b97fb" (UID: "42a70df8-1617-448d-9495-5aa55d8b97fb"). InnerVolumeSpecName "kube-api-access-456r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.448835 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42a70df8-1617-448d-9495-5aa55d8b97fb-logs" (OuterVolumeSpecName: "logs") pod "42a70df8-1617-448d-9495-5aa55d8b97fb" (UID: "42a70df8-1617-448d-9495-5aa55d8b97fb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.449885 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-logs" (OuterVolumeSpecName: "logs") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.451078 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2d02491-90d4-41b4-884d-0959feb366b0-logs" (OuterVolumeSpecName: "logs") pod "a2d02491-90d4-41b4-884d-0959feb366b0" (UID: "a2d02491-90d4-41b4-884d-0959feb366b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.453645 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "42a70df8-1617-448d-9495-5aa55d8b97fb" (UID: "42a70df8-1617-448d-9495-5aa55d8b97fb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.453760 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-api-access-zhhvh" (OuterVolumeSpecName: "kube-api-access-zhhvh") pod "51b567b0-935e-46f6-8cf7-3c8a9040bad4" (UID: "51b567b0-935e-46f6-8cf7-3c8a9040bad4"). InnerVolumeSpecName "kube-api-access-zhhvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.453782 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts" (OuterVolumeSpecName: "scripts") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.460007 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.462062 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42a70df8-1617-448d-9495-5aa55d8b97fb" (UID: "42a70df8-1617-448d-9495-5aa55d8b97fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.463412 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb92b15e-a854-4505-97e2-37e4a7b821b4-kube-api-access-f8df8" (OuterVolumeSpecName: "kube-api-access-f8df8") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "kube-api-access-f8df8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.471524 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50baf265-a6d8-445d-aed6-853781644d9e-kube-api-access-rdp57" (OuterVolumeSpecName: "kube-api-access-rdp57") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "kube-api-access-rdp57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.471584 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d02491-90d4-41b4-884d-0959feb366b0-kube-api-access-wr7sq" (OuterVolumeSpecName: "kube-api-access-wr7sq") pod "a2d02491-90d4-41b4-884d-0959feb366b0" (UID: "a2d02491-90d4-41b4-884d-0959feb366b0"). InnerVolumeSpecName "kube-api-access-wr7sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.478879 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.479319 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-scripts" (OuterVolumeSpecName: "scripts") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.489702 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521023 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521072 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhhvh\" (UniqueName: \"kubernetes.io/projected/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-api-access-zhhvh\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521083 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521094 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521102 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521307 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521321 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42a70df8-1617-448d-9495-5aa55d8b97fb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521330 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521343 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521351 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521361 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdp57\" (UniqueName: \"kubernetes.io/projected/50baf265-a6d8-445d-aed6-853781644d9e-kube-api-access-rdp57\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521371 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr7sq\" (UniqueName: \"kubernetes.io/projected/a2d02491-90d4-41b4-884d-0959feb366b0-kube-api-access-wr7sq\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.521380 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522433 4939 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8df8\" (UniqueName: \"kubernetes.io/projected/cb92b15e-a854-4505-97e2-37e4a7b821b4-kube-api-access-f8df8\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522444 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d02491-90d4-41b4-884d-0959feb366b0-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522455 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-456r8\" (UniqueName: \"kubernetes.io/projected/42a70df8-1617-448d-9495-5aa55d8b97fb-kube-api-access-456r8\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522609 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522620 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522630 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb92b15e-a854-4505-97e2-37e4a7b821b4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522646 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522655 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89khq\" (UniqueName: \"kubernetes.io/projected/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-kube-api-access-89khq\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.522762 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50baf265-a6d8-445d-aed6-853781644d9e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.532911 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.532982 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data podName:d850ac81-a29e-4e93-9fab-72b6325de52e nodeName:}" failed. No retries permitted until 2026-03-18 16:04:33.532964743 +0000 UTC m=+1638.132152364 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data") pod "rabbitmq-cell1-server-0" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e") : configmap "rabbitmq-cell1-config-data" not found Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.582161 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-config-data" (OuterVolumeSpecName: "config-data") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.587651 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-config-data" (OuterVolumeSpecName: "config-data") pod "a2d02491-90d4-41b4-884d-0959feb366b0" (UID: "a2d02491-90d4-41b4-884d-0959feb366b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.604041 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ljqrx" event={"ID":"f7a5f60f-451f-45ca-ad9d-62dc13bccf66","Type":"ContainerDied","Data":"b0038580c011bc3f543685811c4a607d8018f0e6d1a763d9ca6d99b790f313bc"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.604151 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ljqrx" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.606969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51b567b0-935e-46f6-8cf7-3c8a9040bad4","Type":"ContainerDied","Data":"eddc21d2a7edc5f56890f75c04a8c25d5d8793651129b3625fe9c7fadd124f43"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.607053 4939 scope.go:117] "RemoveContainer" containerID="286772eca49783d61fb3dec6079a9246af7f326c26fda03b975b7c1244c77555" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.607366 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.614310 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74465b498-l8mz2" event={"ID":"42a70df8-1617-448d-9495-5aa55d8b97fb","Type":"ContainerDied","Data":"c76fec3e2cccd03cb051ff2225ce195ca2f461423faa05bec5aa9aa4f34355e1"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.614399 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-74465b498-l8mz2" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.622615 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.622669 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8faa-account-create-update-ztzzs" event={"ID":"abc847c2-3903-44d4-aa4d-0a7e16709041","Type":"ContainerDied","Data":"1fdb9853930da2cb76c945d655bd4514cd8df54d11038f39529feae5264ca418"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.627742 4939 generic.go:334] "Generic (PLEG): container finished" podID="658fa1ab-1e7e-42d2-947e-6c74215e15f0" containerID="7ef584cd49ae7f589ecbcd4e4179803996e249f944e8b538430057de2525fe26" exitCode=0 Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.627849 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"658fa1ab-1e7e-42d2-947e-6c74215e15f0","Type":"ContainerDied","Data":"7ef584cd49ae7f589ecbcd4e4179803996e249f944e8b538430057de2525fe26"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.628785 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.628815 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.635230 4939 generic.go:334] "Generic (PLEG): container finished" podID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerID="e02a12f9caf7aff3ab019570b585d66908b0f4cfe93a1f0a694c9e99cffe1bfe" exitCode=0 Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.635291 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" event={"ID":"8c83b398-2fa8-4862-a2fe-6f66e3200216","Type":"ContainerDied","Data":"e02a12f9caf7aff3ab019570b585d66908b0f4cfe93a1f0a694c9e99cffe1bfe"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.637439 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.640193 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2d02491-90d4-41b4-884d-0959feb366b0","Type":"ContainerDied","Data":"c30659c15496ac9423c4d42b78ceb86e966224e3c7aeda97df07bde6a2936fd1"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.640303 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.642248 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.645925 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.646553 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2","Type":"ContainerDied","Data":"198a66caefc306f6c725fa20dadae0d7c24284e28fa124b4656b6d622bddba95"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.648965 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb92b15e-a854-4505-97e2-37e4a7b821b4","Type":"ContainerDied","Data":"db4849851f296a4419721c8e278fdc257d8f44daa292632908cb4dce6239c3c3"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.649022 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.651283 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-409c-account-create-update-m2tw5" event={"ID":"77b48752-eea3-4627-8da2-737f8bd7b36a","Type":"ContainerDied","Data":"fb87efe8d584162abc79b46c3d88ff6a47fa0dfd2c42d747763515b32dea55b8"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.651472 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-409c-account-create-update-m2tw5" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.657409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e71cb7a9-1ab5-4596-901f-314dcfae2bc4","Type":"ContainerDied","Data":"e0ca74603485f889d9836ee5d92affacde08b2e5a2d3ff1f9faad774da6a5f5b"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.657527 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.660595 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "51b567b0-935e-46f6-8cf7-3c8a9040bad4" (UID: "51b567b0-935e-46f6-8cf7-3c8a9040bad4"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.664564 4939 generic.go:334] "Generic (PLEG): container finished" podID="86474b5e-6fc8-4810-a083-699878062ade" containerID="50ab95e7c021b4416c51b15d476a07a0ef51bd8ab440f9053107700d4fa85ea2" exitCode=0 Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.664614 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" event={"ID":"86474b5e-6fc8-4810-a083-699878062ade","Type":"ContainerDied","Data":"50ab95e7c021b4416c51b15d476a07a0ef51bd8ab440f9053107700d4fa85ea2"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.680893 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.689741 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50baf265-a6d8-445d-aed6-853781644d9e","Type":"ContainerDied","Data":"8f0d5db536a4d73516c5e6363d7d361d8a0c0340543756582492aeebc2aa0eb4"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.689837 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.706489 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dr5q" event={"ID":"8269f4a0-d0d4-4620-9c3e-885d453b7109","Type":"ContainerStarted","Data":"684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.706701 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/root-account-create-update-2dr5q" podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerName="mariadb-account-create-update" containerID="cri-o://684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad" gracePeriod=30 Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.709460 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.733340 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.733818 4939 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.734061 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.734079 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.734089 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.734812 4939 generic.go:334] "Generic (PLEG): container finished" podID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerID="cefcf8484139882612cbf2acdccd98b003d26c82bc17372d2b00824a4b7ba550" exitCode=0 Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.734857 4939 generic.go:334] "Generic (PLEG): container finished" podID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerID="8f7b8af072015fc83aa897993785d0ba6ece0ebbc142762b52dcd98324367fa6" exitCode=0 Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.734962 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerDied","Data":"cefcf8484139882612cbf2acdccd98b003d26c82bc17372d2b00824a4b7ba550"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.735075 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerDied","Data":"8f7b8af072015fc83aa897993785d0ba6ece0ebbc142762b52dcd98324367fa6"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.737809 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-2dr5q" podStartSLOduration=4.73778967 podStartE2EDuration="4.73778967s" podCreationTimestamp="2026-03-18 16:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:04:25.735380622 +0000 UTC m=+1630.334568243" watchObservedRunningTime="2026-03-18 16:04:25.73778967 +0000 UTC m=+1630.336977291" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.748936 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.750301 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-576956754b-kspq2" event={"ID":"df7cba1f-8d56-47c9-8016-3184a1374386","Type":"ContainerDied","Data":"bf4890f5149a26c46143ec38e6c1c16ded9f83634e51531317a6a653ef4fa6fa"} Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.754598 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-576956754b-kspq2" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.801701 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data" (OuterVolumeSpecName: "config-data") pod "42a70df8-1617-448d-9495-5aa55d8b97fb" (UID: "42a70df8-1617-448d-9495-5aa55d8b97fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.804916 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "51b567b0-935e-46f6-8cf7-3c8a9040bad4" (UID: "51b567b0-935e-46f6-8cf7-3c8a9040bad4"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.817561 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.831088 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51b567b0-935e-46f6-8cf7-3c8a9040bad4" (UID: "51b567b0-935e-46f6-8cf7-3c8a9040bad4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.831223 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data" (OuterVolumeSpecName: "config-data") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835426 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835591 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjqp\" (UniqueName: \"kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp\") pod \"keystone-fdf2-account-create-update-g58kg\" (UID: \"c0377ebe-7993-4c09-aaa0-908628a881c4\") " pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835708 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835723 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835717 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835735 4939 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835797 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.835815 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b567b0-935e-46f6-8cf7-3c8a9040bad4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.835887 4939 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.835940 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts podName:c0377ebe-7993-4c09-aaa0-908628a881c4 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:27.835921408 +0000 UTC m=+1632.435109079 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts") pod "keystone-fdf2-account-create-update-g58kg" (UID: "c0377ebe-7993-4c09-aaa0-908628a881c4") : configmap "openstack-scripts" not found Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.836332 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" (UID: "5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.839279 4939 projected.go:194] Error preparing data for projected volume kube-api-access-gfjqp for pod openstack/keystone-fdf2-account-create-update-g58kg: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 16:04:25 crc kubenswrapper[4939]: E0318 16:04:25.839345 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp podName:c0377ebe-7993-4c09-aaa0-908628a881c4 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:27.839326204 +0000 UTC m=+1632.438513905 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gfjqp" (UniqueName: "kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp") pod "keystone-fdf2-account-create-update-g58kg" (UID: "c0377ebe-7993-4c09-aaa0-908628a881c4") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.854313 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-config-data" (OuterVolumeSpecName: "config-data") pod "cb92b15e-a854-4505-97e2-37e4a7b821b4" (UID: "cb92b15e-a854-4505-97e2-37e4a7b821b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.857725 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "42a70df8-1617-448d-9495-5aa55d8b97fb" (UID: "42a70df8-1617-448d-9495-5aa55d8b97fb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.872269 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "42a70df8-1617-448d-9495-5aa55d8b97fb" (UID: "42a70df8-1617-448d-9495-5aa55d8b97fb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.880129 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.884489 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "50baf265-a6d8-445d-aed6-853781644d9e" (UID: "50baf265-a6d8-445d-aed6-853781644d9e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.888809 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2d02491-90d4-41b4-884d-0959feb366b0" (UID: "a2d02491-90d4-41b4-884d-0959feb366b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.898619 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a2d02491-90d4-41b4-884d-0959feb366b0" (UID: "a2d02491-90d4-41b4-884d-0959feb366b0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.915262 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a2d02491-90d4-41b4-884d-0959feb366b0" (UID: "a2d02491-90d4-41b4-884d-0959feb366b0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.937929 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.937962 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.937972 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.937982 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.937991 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb92b15e-a854-4505-97e2-37e4a7b821b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.937998 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d02491-90d4-41b4-884d-0959feb366b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.938006 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.938015 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.938023 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a70df8-1617-448d-9495-5aa55d8b97fb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:25 crc kubenswrapper[4939]: I0318 16:04:25.938031 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50baf265-a6d8-445d-aed6-853781644d9e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.140776 4939 scope.go:117] "RemoveContainer" containerID="c14895b390882b2c1923b6999a196557393e0c01f808b080026c9ef7f7e1d1fd" Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.142451 4939 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.142530 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts podName:8269f4a0-d0d4-4620-9c3e-885d453b7109 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:28.142494593 +0000 UTC m=+1632.741682214 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts") pod "root-account-create-update-2dr5q" (UID: "8269f4a0-d0d4-4620-9c3e-885d453b7109") : configmap "openstack-scripts" not found Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.149115 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.159144 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0223f3cb-46bb-4bb1-aa44-b6b259a559f5" path="/var/lib/kubelet/pods/0223f3cb-46bb-4bb1-aa44-b6b259a559f5/volumes" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.159700 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.161026 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834aca75-038f-4aed-8d55-bb1924b96934" path="/var/lib/kubelet/pods/834aca75-038f-4aed-8d55-bb1924b96934/volumes" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.163459 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="961b9311-60b6-40e9-839a-a40bb6859bb3" path="/var/lib/kubelet/pods/961b9311-60b6-40e9-839a-a40bb6859bb3/volumes" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.165407 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09" path="/var/lib/kubelet/pods/bcf5f8fb-13d6-4c9b-a1e8-5ac3e558be09/volumes" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.231906 4939 scope.go:117] "RemoveContainer" containerID="c8f94483b5054ecb13ba070dc607cd5aefe23c9ba71716aa8ae0e310aed70d7c" Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.236853 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.236978 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.237225 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.239830 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.242749 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245084 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-scripts\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245164 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data\") pod \"8c83b398-2fa8-4862-a2fe-6f66e3200216\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245183 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-combined-ca-bundle\") pod \"8c83b398-2fa8-4862-a2fe-6f66e3200216\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245252 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c83b398-2fa8-4862-a2fe-6f66e3200216-logs\") pod \"8c83b398-2fa8-4862-a2fe-6f66e3200216\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245294 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-sg-core-conf-yaml\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.245313 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.245352 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="1c2e6985-9642-41e2-8b6f-174c96e86281" containerName="nova-cell1-conductor-conductor" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x5vr\" 
(UniqueName: \"kubernetes.io/projected/1b6d7c3f-1c09-4bbd-8de1-df304376c198-kube-api-access-4x5vr\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245758 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-ceilometer-tls-certs\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245780 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-combined-ca-bundle\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245808 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-log-httpd\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245834 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-585cb\" (UniqueName: \"kubernetes.io/projected/8c83b398-2fa8-4862-a2fe-6f66e3200216-kube-api-access-585cb\") pod \"8c83b398-2fa8-4862-a2fe-6f66e3200216\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245878 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-config-data\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245901 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data-custom\") pod \"8c83b398-2fa8-4862-a2fe-6f66e3200216\" (UID: \"8c83b398-2fa8-4862-a2fe-6f66e3200216\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.245939 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-run-httpd\") pod \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\" (UID: \"1b6d7c3f-1c09-4bbd-8de1-df304376c198\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.265858 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6d7c3f-1c09-4bbd-8de1-df304376c198-kube-api-access-4x5vr" (OuterVolumeSpecName: "kube-api-access-4x5vr") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "kube-api-access-4x5vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.269687 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c83b398-2fa8-4862-a2fe-6f66e3200216-logs" (OuterVolumeSpecName: "logs") pod "8c83b398-2fa8-4862-a2fe-6f66e3200216" (UID: "8c83b398-2fa8-4862-a2fe-6f66e3200216"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.271205 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:04:26 crc kubenswrapper[4939]: E0318 16:04:26.271318 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d5203f87-b63b-45f5-95e3-c536406909e5" containerName="nova-scheduler-scheduler" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.271877 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.281346 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.284804 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c83b398-2fa8-4862-a2fe-6f66e3200216-kube-api-access-585cb" (OuterVolumeSpecName: "kube-api-access-585cb") pod "8c83b398-2fa8-4862-a2fe-6f66e3200216" (UID: "8c83b398-2fa8-4862-a2fe-6f66e3200216"). InnerVolumeSpecName "kube-api-access-585cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.288668 4939 scope.go:117] "RemoveContainer" containerID="141032193e0acc0c48dcf5467fc8003161e9b27e1dbd08d3df9c676f4ba3145c" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.316655 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.332719 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.336227 4939 scope.go:117] "RemoveContainer" containerID="4607f5748ea35230c9f8cb3745bf14422c3e81e916ca3f8b80878128c7ed0bca" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.349748 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8c83b398-2fa8-4862-a2fe-6f66e3200216" (UID: "8c83b398-2fa8-4862-a2fe-6f66e3200216"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.349794 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-scripts" (OuterVolumeSpecName: "scripts") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.349990 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.355753 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.364271 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q26g9\" (UniqueName: \"kubernetes.io/projected/86474b5e-6fc8-4810-a083-699878062ade-kube-api-access-q26g9\") pod \"86474b5e-6fc8-4810-a083-699878062ade\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.364322 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data\") pod \"86474b5e-6fc8-4810-a083-699878062ade\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.364403 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-combined-ca-bundle\") pod \"86474b5e-6fc8-4810-a083-699878062ade\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.364450 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86474b5e-6fc8-4810-a083-699878062ade-logs\") pod \"86474b5e-6fc8-4810-a083-699878062ade\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.364613 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data-custom\") pod \"86474b5e-6fc8-4810-a083-699878062ade\" (UID: \"86474b5e-6fc8-4810-a083-699878062ade\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.365014 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.365028 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.365040 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.365052 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c83b398-2fa8-4862-a2fe-6f66e3200216-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.365062 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x5vr\" (UniqueName: \"kubernetes.io/projected/1b6d7c3f-1c09-4bbd-8de1-df304376c198-kube-api-access-4x5vr\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.365074 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b6d7c3f-1c09-4bbd-8de1-df304376c198-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.365084 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-585cb\" (UniqueName: \"kubernetes.io/projected/8c83b398-2fa8-4862-a2fe-6f66e3200216-kube-api-access-585cb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.376666 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86474b5e-6fc8-4810-a083-699878062ade-logs" (OuterVolumeSpecName: "logs") pod "86474b5e-6fc8-4810-a083-699878062ade" (UID: "86474b5e-6fc8-4810-a083-699878062ade"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.377106 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86474b5e-6fc8-4810-a083-699878062ade" (UID: "86474b5e-6fc8-4810-a083-699878062ade"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.379639 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86474b5e-6fc8-4810-a083-699878062ade-kube-api-access-q26g9" (OuterVolumeSpecName: "kube-api-access-q26g9") pod "86474b5e-6fc8-4810-a083-699878062ade" (UID: "86474b5e-6fc8-4810-a083-699878062ade"). InnerVolumeSpecName "kube-api-access-q26g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.389862 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.391546 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.407647 4939 scope.go:117] "RemoveContainer" containerID="ce5cff00a735d1a1ef2cc4c115a9ac8265f9778ce6a2da6ac270a33f9cc7daf5" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.422579 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-409c-account-create-update-m2tw5"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.439382 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_18740c60-7bc8-4daa-a426-1aa624b7ac8a/ovn-northd/0.log" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.439480 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.465930 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kolla-config\") pod \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466006 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h28v9\" (UniqueName: \"kubernetes.io/projected/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kube-api-access-h28v9\") pod \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466033 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-combined-ca-bundle\") pod \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466061 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-memcached-tls-certs\") pod \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466128 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts\") pod \"8269f4a0-d0d4-4620-9c3e-885d453b7109\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466228 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-config-data\") pod \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\" (UID: \"658fa1ab-1e7e-42d2-947e-6c74215e15f0\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466254 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dgg4\" (UniqueName: \"kubernetes.io/projected/8269f4a0-d0d4-4620-9c3e-885d453b7109-kube-api-access-8dgg4\") pod \"8269f4a0-d0d4-4620-9c3e-885d453b7109\" (UID: \"8269f4a0-d0d4-4620-9c3e-885d453b7109\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466667 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466691 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q26g9\" (UniqueName: \"kubernetes.io/projected/86474b5e-6fc8-4810-a083-699878062ade-kube-api-access-q26g9\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466705 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.466716 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/86474b5e-6fc8-4810-a083-699878062ade-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.468200 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "658fa1ab-1e7e-42d2-947e-6c74215e15f0" (UID: "658fa1ab-1e7e-42d2-947e-6c74215e15f0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.468871 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8269f4a0-d0d4-4620-9c3e-885d453b7109" (UID: "8269f4a0-d0d4-4620-9c3e-885d453b7109"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.469353 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-config-data" (OuterVolumeSpecName: "config-data") pod "658fa1ab-1e7e-42d2-947e-6c74215e15f0" (UID: "658fa1ab-1e7e-42d2-947e-6c74215e15f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.471302 4939 scope.go:117] "RemoveContainer" containerID="aba281f036d2b5e4f340ee24dde5ab36bf1f763f7ca8377787fe8cdc8c7f40da" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.476935 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c83b398-2fa8-4862-a2fe-6f66e3200216" (UID: "8c83b398-2fa8-4862-a2fe-6f66e3200216"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.481331 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8269f4a0-d0d4-4620-9c3e-885d453b7109-kube-api-access-8dgg4" (OuterVolumeSpecName: "kube-api-access-8dgg4") pod "8269f4a0-d0d4-4620-9c3e-885d453b7109" (UID: "8269f4a0-d0d4-4620-9c3e-885d453b7109"). InnerVolumeSpecName "kube-api-access-8dgg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.491016 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-409c-account-create-update-m2tw5"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.507657 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.509562 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kube-api-access-h28v9" (OuterVolumeSpecName: "kube-api-access-h28v9") pod "658fa1ab-1e7e-42d2-947e-6c74215e15f0" (UID: "658fa1ab-1e7e-42d2-947e-6c74215e15f0"). InnerVolumeSpecName "kube-api-access-h28v9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.568346 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-rundir\") pod \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.568476 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-combined-ca-bundle\") pod \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.568575 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-config\") pod \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.568681 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-metrics-certs-tls-certs\") pod \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.568704 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-northd-tls-certs\") pod \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.568906 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "18740c60-7bc8-4daa-a426-1aa624b7ac8a" (UID: "18740c60-7bc8-4daa-a426-1aa624b7ac8a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.569080 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62f5k\" (UniqueName: \"kubernetes.io/projected/18740c60-7bc8-4daa-a426-1aa624b7ac8a-kube-api-access-62f5k\") pod \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.569112 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-scripts\") pod \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\" (UID: \"18740c60-7bc8-4daa-a426-1aa624b7ac8a\") " Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.569291 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-config" (OuterVolumeSpecName: "config") pod "18740c60-7bc8-4daa-a426-1aa624b7ac8a" (UID: "18740c60-7bc8-4daa-a426-1aa624b7ac8a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.569879 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-scripts" (OuterVolumeSpecName: "scripts") pod "18740c60-7bc8-4daa-a426-1aa624b7ac8a" (UID: "18740c60-7bc8-4daa-a426-1aa624b7ac8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570023 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570085 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dgg4\" (UniqueName: \"kubernetes.io/projected/8269f4a0-d0d4-4620-9c3e-885d453b7109-kube-api-access-8dgg4\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570098 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570107 4939 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570117 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570127 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h28v9\" (UniqueName: \"kubernetes.io/projected/658fa1ab-1e7e-42d2-947e-6c74215e15f0-kube-api-access-h28v9\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570136 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8269f4a0-d0d4-4620-9c3e-885d453b7109-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.570144 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.581396 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.586698 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.590097 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18740c60-7bc8-4daa-a426-1aa624b7ac8a-kube-api-access-62f5k" (OuterVolumeSpecName: "kube-api-access-62f5k") pod "18740c60-7bc8-4daa-a426-1aa624b7ac8a" (UID: "18740c60-7bc8-4daa-a426-1aa624b7ac8a"). InnerVolumeSpecName "kube-api-access-62f5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.594285 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data" (OuterVolumeSpecName: "config-data") pod "8c83b398-2fa8-4862-a2fe-6f66e3200216" (UID: "8c83b398-2fa8-4862-a2fe-6f66e3200216"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.594411 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.596740 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data" (OuterVolumeSpecName: "config-data") pod "86474b5e-6fc8-4810-a083-699878062ade" (UID: "86474b5e-6fc8-4810-a083-699878062ade"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.635595 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-ztzzs"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.646413 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8faa-account-create-update-ztzzs"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.648074 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86474b5e-6fc8-4810-a083-699878062ade" (UID: "86474b5e-6fc8-4810-a083-699878062ade"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.668616 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-74465b498-l8mz2"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.673723 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62f5k\" (UniqueName: \"kubernetes.io/projected/18740c60-7bc8-4daa-a426-1aa624b7ac8a-kube-api-access-62f5k\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.673756 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18740c60-7bc8-4daa-a426-1aa624b7ac8a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.673765 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c83b398-2fa8-4862-a2fe-6f66e3200216-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.673774 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.673785 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86474b5e-6fc8-4810-a083-699878062ade-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.676717 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.681680 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-74465b498-l8mz2"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.686712 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "658fa1ab-1e7e-42d2-947e-6c74215e15f0" (UID: "658fa1ab-1e7e-42d2-947e-6c74215e15f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.686933 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "658fa1ab-1e7e-42d2-947e-6c74215e15f0" (UID: "658fa1ab-1e7e-42d2-947e-6c74215e15f0"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.695176 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.711472 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.725827 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18740c60-7bc8-4daa-a426-1aa624b7ac8a" (UID: "18740c60-7bc8-4daa-a426-1aa624b7ac8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.735101 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-576956754b-kspq2"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.737612 4939 scope.go:117] "RemoveContainer" containerID="f026dc909ac22d1539817ec6b0e34f51819c81cb961582e5bb0f18556dfcbf46" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.752344 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "18740c60-7bc8-4daa-a426-1aa624b7ac8a" (UID: "18740c60-7bc8-4daa-a426-1aa624b7ac8a"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.755011 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7bb7666d55-9qg76" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.175:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.755112 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7bb7666d55-9qg76" podUID="8de1bfe9-c6f0-46c0-bd41-318b139b0f41" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.175:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.779423 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.779456 4939 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.779469 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.779483 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.779494 4939 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/658fa1ab-1e7e-42d2-947e-6c74215e15f0-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.780105 4939 scope.go:117] "RemoveContainer" containerID="6e356ce75b4202de8978925650d9357e6b5b88d4f5afbb33cbd95353ff874608" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.780237 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-576956754b-kspq2"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.785927 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "18740c60-7bc8-4daa-a426-1aa624b7ac8a" (UID: "18740c60-7bc8-4daa-a426-1aa624b7ac8a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.792579 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.793211 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-config-data" (OuterVolumeSpecName: "config-data") pod "1b6d7c3f-1c09-4bbd-8de1-df304376c198" (UID: "1b6d7c3f-1c09-4bbd-8de1-df304376c198"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.800138 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b6d7c3f-1c09-4bbd-8de1-df304376c198","Type":"ContainerDied","Data":"8272265b72ee4d4872c95fe5c0e90555bf8c90f06e68e3b6dd6431d7c44b6235"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.800330 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.810179 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.817396 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_18740c60-7bc8-4daa-a426-1aa624b7ac8a/ovn-northd/0.log" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.817586 4939 generic.go:334] "Generic (PLEG): container finished" podID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerID="0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" exitCode=139 Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.817749 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.817811 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.817813 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18740c60-7bc8-4daa-a426-1aa624b7ac8a","Type":"ContainerDied","Data":"0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.817847 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18740c60-7bc8-4daa-a426-1aa624b7ac8a","Type":"ContainerDied","Data":"cd17bd8a7f0793d03168519bbe0564c8c7b33342cf0477f35594b7efc8057f3b"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.820416 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" event={"ID":"8c83b398-2fa8-4862-a2fe-6f66e3200216","Type":"ContainerDied","Data":"f78e7cc886f047f7aedc44acf77b3d8c4ca77f8eb5d449ce7bb2f8e9d7d8e229"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.820539 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6659dc68fd-4444w" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.825088 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.827185 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.848873 4939 scope.go:117] "RemoveContainer" containerID="a06a089fed14337403eec84f45e3b078fba61e58eeaa6ebe0c2a5e8ebeea031d" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.849129 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" event={"ID":"86474b5e-6fc8-4810-a083-699878062ade","Type":"ContainerDied","Data":"8587f5770eb542c99226063dfc98d2ea099eeea1e39c26e3cdce054d48eb7293"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.849209 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5b4578f6d7-lcqvz" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.857274 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"658fa1ab-1e7e-42d2-947e-6c74215e15f0","Type":"ContainerDied","Data":"94be137c43eea6a3392fa2ceb82cc16b626706fa6e74299e1194f94d3c664388"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.857377 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.868223 4939 generic.go:334] "Generic (PLEG): container finished" podID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerID="684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad" exitCode=1 Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.868292 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2dr5q" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.868576 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dr5q" event={"ID":"8269f4a0-d0d4-4620-9c3e-885d453b7109","Type":"ContainerDied","Data":"684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.868618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dr5q" event={"ID":"8269f4a0-d0d4-4620-9c3e-885d453b7109","Type":"ContainerDied","Data":"42d22f84975155ae48997983b773b266b1c39289bc54f3c2d88bba19fa1064ce"} Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.878815 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fdf2-account-create-update-g58kg" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.880632 4939 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18740c60-7bc8-4daa-a426-1aa624b7ac8a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.880653 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.880662 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b6d7c3f-1c09-4bbd-8de1-df304376c198-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.908404 4939 scope.go:117] "RemoveContainer" containerID="9935f89912bbc9ccc732826d2efbd0df5765843a5c1ca0671ee97d191011142c" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.926311 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ljqrx"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.949002 4939 scope.go:117] "RemoveContainer" containerID="fe8d4154643530aaa76761b4600cb2e025a90e8cdab5159d3a196d0d9c425f73" Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.952473 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ljqrx"] Mar 18 16:04:26 crc kubenswrapper[4939]: I0318 16:04:26.984385 4939 scope.go:117] "RemoveContainer" containerID="d1edf6d434d6afaad7a360d12012a796e03fa6b5275394e5c67c30d4f75cf0c3" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.038563 4939 scope.go:117] "RemoveContainer" containerID="ee0d7673467ab34937d02099e23c0b13b6599f673de277c77448b47b6c8d53d7" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.041917 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.050929 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.059609 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6659dc68fd-4444w"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.067332 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6659dc68fd-4444w"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.076987 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/root-account-create-update-2dr5q"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.081022 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2dr5q"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.096474 4939 scope.go:117] "RemoveContainer" containerID="c4b1ae5bcdd7929e16516d94f1d8e93e3c240e0dba2fbbe3ab7b4b2d344bbbb5" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.111895 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fdf2-account-create-update-g58kg"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.115740 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fdf2-account-create-update-g58kg"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.127933 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.161693 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.194187 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0377ebe-7993-4c09-aaa0-908628a881c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.194223 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfjqp\" (UniqueName: \"kubernetes.io/projected/c0377ebe-7993-4c09-aaa0-908628a881c4-kube-api-access-gfjqp\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.202443 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5b4578f6d7-lcqvz"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.236067 4939 scope.go:117] "RemoveContainer" containerID="f3079f7fe5149bacaed20469d6bc740ba824c2bdc67fa400c6cfad0cecdc52c1" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.247963 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5b4578f6d7-lcqvz"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.255550 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.285275 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.342663 4939 scope.go:117] "RemoveContainer" containerID="bcaf1ad5a30e3e16337fdbd45932541ca349149870b4da32747001a8591240e1" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.377848 4939 scope.go:117] "RemoveContainer" containerID="cefcf8484139882612cbf2acdccd98b003d26c82bc17372d2b00824a4b7ba550" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.426849 4939 scope.go:117] "RemoveContainer" containerID="8f7b8af072015fc83aa897993785d0ba6ece0ebbc142762b52dcd98324367fa6" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.458480 4939 scope.go:117] "RemoveContainer" containerID="c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.525437 4939 scope.go:117] "RemoveContainer" containerID="0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.549332 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-56c996c794-vrkm4" podUID="878180f2-988b-4d66-aaf0-3429900f5e77" containerName="keystone-api" 
probeResult="failure" output="Get \"https://10.217.0.157:5000/v3\": read tcp 10.217.0.2:43702->10.217.0.157:5000: read: connection reset by peer" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.550948 4939 scope.go:117] "RemoveContainer" containerID="c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e" Mar 18 16:04:27 crc kubenswrapper[4939]: E0318 16:04:27.551999 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e\": container with ID starting with c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e not found: ID does not exist" containerID="c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.552072 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e"} err="failed to get container status \"c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e\": rpc error: code = NotFound desc = could not find container \"c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e\": container with ID starting with c1cfd38d4740203f7790ca32d068d1f2b9170d3d4693eca8993ded58cba09b2e not found: ID does not exist" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.552093 4939 scope.go:117] "RemoveContainer" containerID="0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" Mar 18 16:04:27 crc kubenswrapper[4939]: E0318 16:04:27.554975 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad\": container with ID starting with 0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad not found: ID does not exist" containerID="0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.555008 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad"} err="failed to get container status \"0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad\": rpc error: code = NotFound desc = could not find container \"0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad\": container with ID starting with 0c119ff70e5aee57c518e17bc85be02c9eaf4b611d067425c25d663d217a44ad not found: ID does not exist" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.555125 4939 scope.go:117] "RemoveContainer" containerID="e02a12f9caf7aff3ab019570b585d66908b0f4cfe93a1f0a694c9e99cffe1bfe" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.590453 4939 scope.go:117] "RemoveContainer" containerID="e96d96e17fe2569029cec18b64c447837c38db0a15c1372e48c2e74d08a2fd50" Mar 18 16:04:27 crc kubenswrapper[4939]: E0318 16:04:27.812735 4939 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 16:04:27 crc kubenswrapper[4939]: E0318 16:04:27.812804 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data podName:26f60b5c-7d32-4fea-b3ca-a8132f3ed026 nodeName:}" failed. No retries permitted until 2026-03-18 16:04:35.812782567 +0000 UTC m=+1640.411970238 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data") pod "rabbitmq-server-0" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026") : configmap "rabbitmq-config-data" not found Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.820930 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.827448 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.834981 4939 scope.go:117] "RemoveContainer" containerID="50ab95e7c021b4416c51b15d476a07a0ef51bd8ab440f9053107700d4fa85ea2" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.868967 4939 scope.go:117] "RemoveContainer" containerID="944a706537fc625973ddcc25ce21c69f44e3b1c0a43d6217239bde407bac3b36" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.902267 4939 scope.go:117] "RemoveContainer" containerID="7ef584cd49ae7f589ecbcd4e4179803996e249f944e8b538430057de2525fe26" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914206 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-tls\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914279 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn9t5\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-kube-api-access-fn9t5\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-confd\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914346 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-plugins-conf\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914372 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914449 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-confd\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914473 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-erlang-cookie\") pod 
\"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914518 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjst\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-kube-api-access-7xjst\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914572 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-server-conf\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914597 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-erlang-cookie\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914643 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-plugins\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914668 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-tls\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914692 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d850ac81-a29e-4e93-9fab-72b6325de52e-pod-info\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914716 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914766 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914789 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-plugins\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914812 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data\") pod 
\"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914845 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-erlang-cookie-secret\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914871 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d850ac81-a29e-4e93-9fab-72b6325de52e-erlang-cookie-secret\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914903 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-pod-info\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914928 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-plugins-conf\") pod \"d850ac81-a29e-4e93-9fab-72b6325de52e\" (UID: \"d850ac81-a29e-4e93-9fab-72b6325de52e\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.914953 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-server-conf\") pod \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\" (UID: \"26f60b5c-7d32-4fea-b3ca-a8132f3ed026\") " Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.915772 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.920231 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.920275 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.920937 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-kube-api-access-fn9t5" (OuterVolumeSpecName: "kube-api-access-fn9t5") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "kube-api-access-fn9t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.920981 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.921622 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.922200 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.923129 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.923582 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.926924 4939 generic.go:334] "Generic (PLEG): container finished" podID="878180f2-988b-4d66-aaf0-3429900f5e77" containerID="add0a3162f0fb4bc567ed074ad66216e38747696a6b5e808eb5932b5e0024e79" exitCode=0 Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.927002 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c996c794-vrkm4" event={"ID":"878180f2-988b-4d66-aaf0-3429900f5e77","Type":"ContainerDied","Data":"add0a3162f0fb4bc567ed074ad66216e38747696a6b5e808eb5932b5e0024e79"} Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.928534 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.929027 4939 generic.go:334] "Generic (PLEG): container finished" podID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerID="1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514" exitCode=0 Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.929187 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26f60b5c-7d32-4fea-b3ca-a8132f3ed026","Type":"ContainerDied","Data":"1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514"} Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.929282 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26f60b5c-7d32-4fea-b3ca-a8132f3ed026","Type":"ContainerDied","Data":"74b89c6b5b5ee7263a5ce777c34ae6ab461fca0d1cbf245329845c5513760fd1"} Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.929409 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.932088 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-kube-api-access-7xjst" (OuterVolumeSpecName: "kube-api-access-7xjst") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "kube-api-access-7xjst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.936870 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.945701 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d850ac81-a29e-4e93-9fab-72b6325de52e-pod-info" (OuterVolumeSpecName: "pod-info") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.946167 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d850ac81-a29e-4e93-9fab-72b6325de52e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.955777 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.957054 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-pod-info" (OuterVolumeSpecName: "pod-info") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.961163 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data" (OuterVolumeSpecName: "config-data") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.965551 4939 scope.go:117] "RemoveContainer" containerID="684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.968781 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data" (OuterVolumeSpecName: "config-data") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.974199 4939 generic.go:334] "Generic (PLEG): container finished" podID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerID="a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f" exitCode=0 Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.974264 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d850ac81-a29e-4e93-9fab-72b6325de52e","Type":"ContainerDied","Data":"a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f"} Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.974297 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d850ac81-a29e-4e93-9fab-72b6325de52e","Type":"ContainerDied","Data":"758c253bcb3f90ef736746699cb1d3004b2714782bae77fd585422668a088008"} Mar 18 16:04:27 crc kubenswrapper[4939]: I0318 16:04:27.974382 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.012149 4939 scope.go:117] "RemoveContainer" containerID="a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.012683 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-server-conf" (OuterVolumeSpecName: "server-conf") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017280 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017310 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017319 4939 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d850ac81-a29e-4e93-9fab-72b6325de52e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017343 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017355 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017364 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017372 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017380 4939 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017387 4939 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d850ac81-a29e-4e93-9fab-72b6325de52e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017395 4939 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017402 4939 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017410 4939 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017417 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017426 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn9t5\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-kube-api-access-fn9t5\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017433 4939 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017446 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017455 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017463 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjst\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-kube-api-access-7xjst\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.017471 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.020225 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-server-conf" (OuterVolumeSpecName: "server-conf") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.042758 4939 scope.go:117] "RemoveContainer" containerID="684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.043260 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad\": container with ID starting with 684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad not found: ID does not exist" containerID="684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.043297 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad"} err="failed to get container status \"684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad\": rpc error: code = NotFound desc = could not find container \"684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad\": container with ID starting with 684fc65c3e49ff28d794acdaa0ba17218d02fd647483a3aeb26986c149d166ad not found: ID does not exist" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.043325 4939 scope.go:117] "RemoveContainer" containerID="a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.044017 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4\": container with ID starting with a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4 not found: ID does not exist" containerID="a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.044065 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4"} err="failed to get container status \"a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4\": rpc error: code = NotFound desc = could not find container \"a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4\": container with ID starting with a07c2ff79c43cd01e435c4e285600a34a2ff0327d325a0a598412a6f604882d4 not found: ID does not exist" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.044099 4939 scope.go:117] "RemoveContainer" containerID="1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.050472 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.059244 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.080225 4939 scope.go:117] "RemoveContainer" containerID="7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.080492 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.088613 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "26f60b5c-7d32-4fea-b3ca-a8132f3ed026" (UID: "26f60b5c-7d32-4fea-b3ca-a8132f3ed026"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.118759 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26f60b5c-7d32-4fea-b3ca-a8132f3ed026-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.118797 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.118807 4939 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d850ac81-a29e-4e93-9fab-72b6325de52e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.118815 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.128694 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d850ac81-a29e-4e93-9fab-72b6325de52e" (UID: "d850ac81-a29e-4e93-9fab-72b6325de52e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.138317 4939 scope.go:117] "RemoveContainer" containerID="1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.138855 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514\": container with ID starting with 1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514 not found: ID does not exist" containerID="1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.138892 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514"} err="failed to get container status \"1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514\": rpc error: code = NotFound desc = could not find container \"1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514\": container with ID starting with 1591c4feb42bdc7b9516f032780589383d56ed16f997bb5dcfa2fb4f9ba03514 not found: ID does not exist" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.138916 4939 scope.go:117] "RemoveContainer" containerID="7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.140524 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52\": container with ID starting with 7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52 not found: ID does not exist" containerID="7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.140547 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52"} err="failed to get container status \"7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52\": rpc error: code = NotFound desc = could not find container \"7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52\": container with ID starting with 7b4d651c80675b763287821ff87960c4ff0de35e5d3a687d39d7afbc78078d52 not found: ID does not exist" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.140559 4939 scope.go:117] "RemoveContainer" containerID="a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.148404 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" path="/var/lib/kubelet/pods/18740c60-7bc8-4daa-a426-1aa624b7ac8a/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.150675 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" path="/var/lib/kubelet/pods/1b6d7c3f-1c09-4bbd-8de1-df304376c198/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.151445 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" path="/var/lib/kubelet/pods/42a70df8-1617-448d-9495-5aa55d8b97fb/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.152581 4939 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="50baf265-a6d8-445d-aed6-853781644d9e" path="/var/lib/kubelet/pods/50baf265-a6d8-445d-aed6-853781644d9e/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.153386 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b567b0-935e-46f6-8cf7-3c8a9040bad4" path="/var/lib/kubelet/pods/51b567b0-935e-46f6-8cf7-3c8a9040bad4/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.153989 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" path="/var/lib/kubelet/pods/5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.160897 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658fa1ab-1e7e-42d2-947e-6c74215e15f0" path="/var/lib/kubelet/pods/658fa1ab-1e7e-42d2-947e-6c74215e15f0/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.161996 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77b48752-eea3-4627-8da2-737f8bd7b36a" path="/var/lib/kubelet/pods/77b48752-eea3-4627-8da2-737f8bd7b36a/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.162336 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" path="/var/lib/kubelet/pods/8269f4a0-d0d4-4620-9c3e-885d453b7109/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.163625 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86474b5e-6fc8-4810-a083-699878062ade" path="/var/lib/kubelet/pods/86474b5e-6fc8-4810-a083-699878062ade/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.164762 4939 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 18 16:04:28 crc kubenswrapper[4939]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-18T16:04:21Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 18 16:04:28 crc kubenswrapper[4939]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Mar 18 16:04:28 crc kubenswrapper[4939]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-mp2sj" message=< Mar 18 16:04:28 crc kubenswrapper[4939]: Exiting ovn-controller (1) [FAILED] Mar 18 16:04:28 crc kubenswrapper[4939]: Killing ovn-controller (1) [ OK ] Mar 18 16:04:28 crc kubenswrapper[4939]: Killing ovn-controller (1) with SIGKILL [ OK ] Mar 18 16:04:28 crc kubenswrapper[4939]: 2026-03-18T16:04:21Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 18 16:04:28 crc kubenswrapper[4939]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Mar 18 16:04:28 crc kubenswrapper[4939]: > Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.164998 4939 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 18 16:04:28 crc kubenswrapper[4939]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-18T16:04:21Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 18 16:04:28 crc kubenswrapper[4939]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Mar 18 16:04:28 crc kubenswrapper[4939]: > pod="openstack/ovn-controller-mp2sj" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" containerID="cri-o://cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.165033 4939 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ovn-controller-mp2sj" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" containerID="cri-o://cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" gracePeriod=22 Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.166771 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" path="/var/lib/kubelet/pods/8c83b398-2fa8-4862-a2fe-6f66e3200216/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.167450 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" path="/var/lib/kubelet/pods/a2d02491-90d4-41b4-884d-0959feb366b0/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.170176 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc847c2-3903-44d4-aa4d-0a7e16709041" path="/var/lib/kubelet/pods/abc847c2-3903-44d4-aa4d-0a7e16709041/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.170486 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0377ebe-7993-4c09-aaa0-908628a881c4" path="/var/lib/kubelet/pods/c0377ebe-7993-4c09-aaa0-908628a881c4/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.170927 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" path="/var/lib/kubelet/pods/cb92b15e-a854-4505-97e2-37e4a7b821b4/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.171709 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" path="/var/lib/kubelet/pods/df7cba1f-8d56-47c9-8016-3184a1374386/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.172996 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" path="/var/lib/kubelet/pods/e71cb7a9-1ab5-4596-901f-314dcfae2bc4/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.173715 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a5f60f-451f-45ca-ad9d-62dc13bccf66" path="/var/lib/kubelet/pods/f7a5f60f-451f-45ca-ad9d-62dc13bccf66/volumes" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.200896 4939 scope.go:117] "RemoveContainer" containerID="55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.219914 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-config-data\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.219965 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-internal-tls-certs\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.220033 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-combined-ca-bundle\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 
16:04:28.220086 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-public-tls-certs\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.220148 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-credential-keys\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.220200 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-fernet-keys\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.221198 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-scripts\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.221238 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72m89\" (UniqueName: \"kubernetes.io/projected/878180f2-988b-4d66-aaf0-3429900f5e77-kube-api-access-72m89\") pod \"878180f2-988b-4d66-aaf0-3429900f5e77\" (UID: \"878180f2-988b-4d66-aaf0-3429900f5e77\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.225124 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d850ac81-a29e-4e93-9fab-72b6325de52e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.239045 4939 scope.go:117] "RemoveContainer" containerID="a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.239368 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f\": container with ID starting with a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f not found: ID does not exist" containerID="a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.239396 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f"} err="failed to get container status \"a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f\": rpc error: code = NotFound desc = could not find container \"a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f\": container with ID starting with a64283d8252b2f0ceeabc0e8deb6f69f1fd818b92c0cf94b6f3cd244a1ef686f not found: ID does not exist" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.239414 4939 scope.go:117] "RemoveContainer" containerID="55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.241629 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429\": container with ID starting with 55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429 not found: ID does not exist" containerID="55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.241666 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429"} err="failed to get container status \"55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429\": rpc error: code = NotFound desc = could not find container \"55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429\": container with ID starting with 55123179fa803eedb64823b90554900b75a543a7ebe1174f8c4ec3d88f450429 not found: ID does not exist" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.249248 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.257686 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/878180f2-988b-4d66-aaf0-3429900f5e77-kube-api-access-72m89" (OuterVolumeSpecName: "kube-api-access-72m89") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "kube-api-access-72m89". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.269740 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-scripts" (OuterVolumeSpecName: "scripts") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.272040 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.282392 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.287395 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.304618 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.306792 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.313738 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.331356 4939 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.331393 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.331405 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72m89\" (UniqueName: \"kubernetes.io/projected/878180f2-988b-4d66-aaf0-3429900f5e77-kube-api-access-72m89\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.331413 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.331422 4939 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.331818 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-config-data" (OuterVolumeSpecName: "config-data") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.379814 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.387616 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "878180f2-988b-4d66-aaf0-3429900f5e77" (UID: "878180f2-988b-4d66-aaf0-3429900f5e77"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.432625 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.432663 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.432675 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/878180f2-988b-4d66-aaf0-3429900f5e77-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.497914 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187 is running failed: container process not found" containerID="cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.498331 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187 is running failed: container process not found" containerID="cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.498711 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187 is running failed: container process not found" containerID="cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.498746 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-mp2sj" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.533850 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.534251 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" 
containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.534603 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.534633 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.541806 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.543001 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.544855 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:28 crc kubenswrapper[4939]: E0318 16:04:28.544896 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.748566 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mp2sj_1d3941f5-14fb-4ed6-a715-d4b99cb0961c/ovn-controller/0.log" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.748654 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mp2sj" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838011 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-scripts\") pod \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838120 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-log-ovn\") pod \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838198 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run-ovn\") pod \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838230 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-ovn-controller-tls-certs\") pod \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838264 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdnw5\" (UniqueName: \"kubernetes.io/projected/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-kube-api-access-sdnw5\") pod \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838300 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-combined-ca-bundle\") pod \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838355 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run\") pod \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\" (UID: \"1d3941f5-14fb-4ed6-a715-d4b99cb0961c\") " Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838748 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run" (OuterVolumeSpecName: "var-run") pod "1d3941f5-14fb-4ed6-a715-d4b99cb0961c" (UID: "1d3941f5-14fb-4ed6-a715-d4b99cb0961c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838810 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1d3941f5-14fb-4ed6-a715-d4b99cb0961c" (UID: "1d3941f5-14fb-4ed6-a715-d4b99cb0961c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.838833 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1d3941f5-14fb-4ed6-a715-d4b99cb0961c" (UID: "1d3941f5-14fb-4ed6-a715-d4b99cb0961c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.839235 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-scripts" (OuterVolumeSpecName: "scripts") pod "1d3941f5-14fb-4ed6-a715-d4b99cb0961c" (UID: "1d3941f5-14fb-4ed6-a715-d4b99cb0961c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.848424 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-kube-api-access-sdnw5" (OuterVolumeSpecName: "kube-api-access-sdnw5") pod "1d3941f5-14fb-4ed6-a715-d4b99cb0961c" (UID: "1d3941f5-14fb-4ed6-a715-d4b99cb0961c"). InnerVolumeSpecName "kube-api-access-sdnw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.864909 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d3941f5-14fb-4ed6-a715-d4b99cb0961c" (UID: "1d3941f5-14fb-4ed6-a715-d4b99cb0961c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.905627 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "1d3941f5-14fb-4ed6-a715-d4b99cb0961c" (UID: "1d3941f5-14fb-4ed6-a715-d4b99cb0961c"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.914487 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.939824 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.939869 4939 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.939882 4939 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.939894 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.939907 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdnw5\" (UniqueName: \"kubernetes.io/projected/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-kube-api-access-sdnw5\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.939916 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:28 crc kubenswrapper[4939]: I0318 16:04:28.939927 4939 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d3941f5-14fb-4ed6-a715-d4b99cb0961c-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.025446 4939 generic.go:334] "Generic (PLEG): container finished" podID="22b02f38-8ae3-4a43-8df6-370521328921" containerID="4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb" exitCode=0 Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.025512 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.025535 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22b02f38-8ae3-4a43-8df6-370521328921","Type":"ContainerDied","Data":"4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb"} Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.025592 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22b02f38-8ae3-4a43-8df6-370521328921","Type":"ContainerDied","Data":"17fe9c4329dc0500a1476d7e6d0a87e8b93a83625230903d3cac31c71742ebfa"} Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.025612 4939 scope.go:117] "RemoveContainer" containerID="4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.032183 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mp2sj_1d3941f5-14fb-4ed6-a715-d4b99cb0961c/ovn-controller/0.log" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.032241 4939 generic.go:334] "Generic (PLEG): container finished" podID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerID="cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" exitCode=137 Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.032272 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mp2sj" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.032305 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj" event={"ID":"1d3941f5-14fb-4ed6-a715-d4b99cb0961c","Type":"ContainerDied","Data":"cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187"} Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.032384 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mp2sj" event={"ID":"1d3941f5-14fb-4ed6-a715-d4b99cb0961c","Type":"ContainerDied","Data":"dd0d06407d5abc3d9dcf06ed9d0fdbf472198b87c616523ad72d863e2e02762b"} Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.036673 4939 generic.go:334] "Generic (PLEG): container finished" podID="d5203f87-b63b-45f5-95e3-c536406909e5" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" exitCode=0 Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.036762 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5203f87-b63b-45f5-95e3-c536406909e5","Type":"ContainerDied","Data":"3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b"} Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040179 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-combined-ca-bundle\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040228 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040370 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcxfw\" (UniqueName: 
\"kubernetes.io/projected/22b02f38-8ae3-4a43-8df6-370521328921-kube-api-access-vcxfw\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040431 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-config-data-default\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040448 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-operator-scripts\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040474 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22b02f38-8ae3-4a43-8df6-370521328921-config-data-generated\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040578 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-galera-tls-certs\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.040599 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-kolla-config\") pod \"22b02f38-8ae3-4a43-8df6-370521328921\" (UID: \"22b02f38-8ae3-4a43-8df6-370521328921\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.042359 4939 generic.go:334] "Generic (PLEG): container finished" podID="1c2e6985-9642-41e2-8b6f-174c96e86281" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" exitCode=0 Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.042517 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c2e6985-9642-41e2-8b6f-174c96e86281","Type":"ContainerDied","Data":"1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a"} Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.044823 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.046145 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.047383 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22b02f38-8ae3-4a43-8df6-370521328921-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.047487 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.050464 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b02f38-8ae3-4a43-8df6-370521328921-kube-api-access-vcxfw" (OuterVolumeSpecName: "kube-api-access-vcxfw") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "kube-api-access-vcxfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.050819 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c996c794-vrkm4" event={"ID":"878180f2-988b-4d66-aaf0-3429900f5e77","Type":"ContainerDied","Data":"449ec1eecbccac7f81aed70980a52ef6a60a1c555743f4e771718d4a392cead8"} Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.053616 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56c996c794-vrkm4" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.061890 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.062893 4939 scope.go:117] "RemoveContainer" containerID="4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.065305 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.091431 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mp2sj"] Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.110425 4939 scope.go:117] "RemoveContainer" containerID="4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb" Mar 18 16:04:29 crc kubenswrapper[4939]: E0318 16:04:29.112620 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb\": container with ID starting with 4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb not found: ID does not exist" containerID="4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.112669 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb"} err="failed to get container status \"4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb\": rpc error: code = NotFound desc = could not find container \"4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb\": container with ID starting with 4b58a3f34be233cf8f03c520f4bebe6a993b5cc6b9fd714659b8fdfacdbab4cb not found: ID does not exist" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.112704 4939 scope.go:117] "RemoveContainer" containerID="4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7" Mar 18 16:04:29 crc kubenswrapper[4939]: E0318 16:04:29.113047 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7\": container with ID starting with 4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7 not found: ID does not exist" containerID="4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.113079 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7"} err="failed to get container status \"4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7\": rpc error: code = NotFound desc = could not find container \"4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7\": container with ID starting with 4baf0f2012ebef81da7989826b4f328c535da0166e70b55c75c18c3026dd3aa7 not found: ID does not exist" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.113104 4939 scope.go:117] "RemoveContainer" containerID="cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.115158 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mp2sj"] Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.124137 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "22b02f38-8ae3-4a43-8df6-370521328921" (UID: "22b02f38-8ae3-4a43-8df6-370521328921"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.124611 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56c996c794-vrkm4"] Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.140926 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-56c996c794-vrkm4"] Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142002 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142048 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcxfw\" (UniqueName: \"kubernetes.io/projected/22b02f38-8ae3-4a43-8df6-370521328921-kube-api-access-vcxfw\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142058 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142069 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142077 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22b02f38-8ae3-4a43-8df6-370521328921-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142085 4939 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142095 4939 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22b02f38-8ae3-4a43-8df6-370521328921-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.142103 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b02f38-8ae3-4a43-8df6-370521328921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.146094 4939 scope.go:117] "RemoveContainer" containerID="cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" Mar 18 16:04:29 crc kubenswrapper[4939]: E0318 16:04:29.146530 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187\": container with ID starting with cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187 not found: ID does not exist" containerID="cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.147217 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187"} err="failed to get container status \"cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187\": rpc error: code = 
NotFound desc = could not find container \"cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187\": container with ID starting with cef41584d17b2ebd8af614dbe4e2e6fbe9ff6fbb9580068144ff9f061005f187 not found: ID does not exist" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.147244 4939 scope.go:117] "RemoveContainer" containerID="add0a3162f0fb4bc567ed074ad66216e38747696a6b5e808eb5932b5e0024e79" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.214254 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.235247 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.247026 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.294372 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.349735 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjb69\" (UniqueName: \"kubernetes.io/projected/1c2e6985-9642-41e2-8b6f-174c96e86281-kube-api-access-bjb69\") pod \"1c2e6985-9642-41e2-8b6f-174c96e86281\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.355418 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2e6985-9642-41e2-8b6f-174c96e86281-kube-api-access-bjb69" (OuterVolumeSpecName: "kube-api-access-bjb69") pod "1c2e6985-9642-41e2-8b6f-174c96e86281" (UID: "1c2e6985-9642-41e2-8b6f-174c96e86281"). InnerVolumeSpecName "kube-api-access-bjb69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.417568 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.434640 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.451165 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-combined-ca-bundle\") pod \"d5203f87-b63b-45f5-95e3-c536406909e5\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.451308 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-combined-ca-bundle\") pod \"1c2e6985-9642-41e2-8b6f-174c96e86281\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.451379 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-config-data\") pod \"d5203f87-b63b-45f5-95e3-c536406909e5\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.451424 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-config-data\") pod \"1c2e6985-9642-41e2-8b6f-174c96e86281\" (UID: \"1c2e6985-9642-41e2-8b6f-174c96e86281\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.451455 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22ttw\" (UniqueName: \"kubernetes.io/projected/d5203f87-b63b-45f5-95e3-c536406909e5-kube-api-access-22ttw\") pod \"d5203f87-b63b-45f5-95e3-c536406909e5\" (UID: \"d5203f87-b63b-45f5-95e3-c536406909e5\") " Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.451760 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjb69\" (UniqueName: \"kubernetes.io/projected/1c2e6985-9642-41e2-8b6f-174c96e86281-kube-api-access-bjb69\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.456041 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5203f87-b63b-45f5-95e3-c536406909e5-kube-api-access-22ttw" (OuterVolumeSpecName: "kube-api-access-22ttw") pod "d5203f87-b63b-45f5-95e3-c536406909e5" (UID: "d5203f87-b63b-45f5-95e3-c536406909e5"). InnerVolumeSpecName "kube-api-access-22ttw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.475765 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-config-data" (OuterVolumeSpecName: "config-data") pod "1c2e6985-9642-41e2-8b6f-174c96e86281" (UID: "1c2e6985-9642-41e2-8b6f-174c96e86281"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.475993 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5203f87-b63b-45f5-95e3-c536406909e5" (UID: "d5203f87-b63b-45f5-95e3-c536406909e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.487206 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c2e6985-9642-41e2-8b6f-174c96e86281" (UID: "1c2e6985-9642-41e2-8b6f-174c96e86281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.492338 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-config-data" (OuterVolumeSpecName: "config-data") pod "d5203f87-b63b-45f5-95e3-c536406909e5" (UID: "d5203f87-b63b-45f5-95e3-c536406909e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.558318 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.558354 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.558363 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2e6985-9642-41e2-8b6f-174c96e86281-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.558372 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22ttw\" (UniqueName: \"kubernetes.io/projected/d5203f87-b63b-45f5-95e3-c536406909e5-kube-api-access-22ttw\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:29 crc kubenswrapper[4939]: I0318 16:04:29.558383 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5203f87-b63b-45f5-95e3-c536406909e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.062669 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c2e6985-9642-41e2-8b6f-174c96e86281","Type":"ContainerDied","Data":"16459dd59964fcc95cb07e9c30bfc0b5edaaa956338ec60ee1fd29f25e3033b3"} Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.062743 4939 scope.go:117] "RemoveContainer" containerID="1e15a5f931fc89a84716f5b2c41190b5a8279e5905befc6f55a0dc358b27b59a" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.062899 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.085161 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5203f87-b63b-45f5-95e3-c536406909e5","Type":"ContainerDied","Data":"f1f40ddab2f339abefe47bad5936e2cbdc133521e8c923299b22d6ab90193ebc"} Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.085280 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.094711 4939 scope.go:117] "RemoveContainer" containerID="3e070327878c8550317e365a55c78b0c2c3b85a0e97e7cafafa527795e01960b" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.123555 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.141773 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" path="/var/lib/kubelet/pods/1d3941f5-14fb-4ed6-a715-d4b99cb0961c/volumes" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.142818 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b02f38-8ae3-4a43-8df6-370521328921" path="/var/lib/kubelet/pods/22b02f38-8ae3-4a43-8df6-370521328921/volumes" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.143653 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" path="/var/lib/kubelet/pods/26f60b5c-7d32-4fea-b3ca-a8132f3ed026/volumes" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.144899 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="878180f2-988b-4d66-aaf0-3429900f5e77" path="/var/lib/kubelet/pods/878180f2-988b-4d66-aaf0-3429900f5e77/volumes" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.145610 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d850ac81-a29e-4e93-9fab-72b6325de52e" path="/var/lib/kubelet/pods/d850ac81-a29e-4e93-9fab-72b6325de52e/volumes" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.146635 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.146673 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.146688 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.256523 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.183:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:04:30 crc kubenswrapper[4939]: I0318 16:04:30.536296 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5d7467855-ml675" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.162:9696/\": dial tcp 10.217.0.162:9696: connect: connection refused" Mar 18 16:04:32 crc kubenswrapper[4939]: I0318 16:04:32.154207 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c2e6985-9642-41e2-8b6f-174c96e86281" 
path="/var/lib/kubelet/pods/1c2e6985-9642-41e2-8b6f-174c96e86281/volumes" Mar 18 16:04:32 crc kubenswrapper[4939]: I0318 16:04:32.155310 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5203f87-b63b-45f5-95e3-c536406909e5" path="/var/lib/kubelet/pods/d5203f87-b63b-45f5-95e3-c536406909e5/volumes" Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.528805 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.529452 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.529807 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.529835 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.530003 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.531233 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.533427 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:33 crc kubenswrapper[4939]: E0318 16:04:33.533604 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.677689 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d7467855-ml675" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.770116 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-internal-tls-certs\") pod \"ecde231d-a07e-4f59-81bb-fc4608e906ea\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.770192 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-combined-ca-bundle\") pod \"ecde231d-a07e-4f59-81bb-fc4608e906ea\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.770246 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wncrh\" (UniqueName: \"kubernetes.io/projected/ecde231d-a07e-4f59-81bb-fc4608e906ea-kube-api-access-wncrh\") pod \"ecde231d-a07e-4f59-81bb-fc4608e906ea\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.770281 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-public-tls-certs\") pod \"ecde231d-a07e-4f59-81bb-fc4608e906ea\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.770311 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-config\") pod \"ecde231d-a07e-4f59-81bb-fc4608e906ea\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.770329 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-ovndb-tls-certs\") pod \"ecde231d-a07e-4f59-81bb-fc4608e906ea\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.770360 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-httpd-config\") pod \"ecde231d-a07e-4f59-81bb-fc4608e906ea\" (UID: \"ecde231d-a07e-4f59-81bb-fc4608e906ea\") " Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.775580 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecde231d-a07e-4f59-81bb-fc4608e906ea-kube-api-access-wncrh" (OuterVolumeSpecName: "kube-api-access-wncrh") pod "ecde231d-a07e-4f59-81bb-fc4608e906ea" (UID: "ecde231d-a07e-4f59-81bb-fc4608e906ea"). InnerVolumeSpecName "kube-api-access-wncrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.776742 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ecde231d-a07e-4f59-81bb-fc4608e906ea" (UID: "ecde231d-a07e-4f59-81bb-fc4608e906ea"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.810167 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-config" (OuterVolumeSpecName: "config") pod "ecde231d-a07e-4f59-81bb-fc4608e906ea" (UID: "ecde231d-a07e-4f59-81bb-fc4608e906ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.822245 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ecde231d-a07e-4f59-81bb-fc4608e906ea" (UID: "ecde231d-a07e-4f59-81bb-fc4608e906ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.826256 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ecde231d-a07e-4f59-81bb-fc4608e906ea" (UID: "ecde231d-a07e-4f59-81bb-fc4608e906ea"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.828398 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecde231d-a07e-4f59-81bb-fc4608e906ea" (UID: "ecde231d-a07e-4f59-81bb-fc4608e906ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.838713 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ecde231d-a07e-4f59-81bb-fc4608e906ea" (UID: "ecde231d-a07e-4f59-81bb-fc4608e906ea"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.878551 4939 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.878612 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.878628 4939 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.878640 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.878648 4939 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.878657 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde231d-a07e-4f59-81bb-fc4608e906ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:35 crc kubenswrapper[4939]: I0318 16:04:35.878686 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wncrh\" (UniqueName: \"kubernetes.io/projected/ecde231d-a07e-4f59-81bb-fc4608e906ea-kube-api-access-wncrh\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.167325 4939 generic.go:334] "Generic (PLEG): container finished" podID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerID="57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2" exitCode=0 Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.167369 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d7467855-ml675" event={"ID":"ecde231d-a07e-4f59-81bb-fc4608e906ea","Type":"ContainerDied","Data":"57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2"} Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.167403 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d7467855-ml675" event={"ID":"ecde231d-a07e-4f59-81bb-fc4608e906ea","Type":"ContainerDied","Data":"4be05dd2b01488168768f6e64a76d246c025e9debcc6432323ddeee41a5cbd61"} Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.167424 4939 scope.go:117] "RemoveContainer" containerID="e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa" Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.167489 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d7467855-ml675" Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.196667 4939 scope.go:117] "RemoveContainer" containerID="57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2" Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.215836 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d7467855-ml675"] Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.221721 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d7467855-ml675"] Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.227296 4939 scope.go:117] "RemoveContainer" containerID="e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa" Mar 18 16:04:36 crc kubenswrapper[4939]: E0318 16:04:36.228946 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa\": container with ID starting with e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa not found: ID does not exist" containerID="e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa" Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.228994 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa"} err="failed to get container status \"e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa\": rpc error: code = NotFound desc = could not find container \"e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa\": container with ID starting with e1e363d2a65e145f5344081962982246c1a76ac9d8b3e7601530d819f26f6efa not found: ID does not exist" Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.229024 4939 scope.go:117] "RemoveContainer" containerID="57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2" Mar 18 16:04:36 crc kubenswrapper[4939]: E0318 16:04:36.229430 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2\": container with ID starting with 57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2 not found: ID does not exist" containerID="57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2" Mar 18 16:04:36 crc kubenswrapper[4939]: I0318 16:04:36.229559 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2"} err="failed to get container status \"57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2\": rpc error: code = NotFound desc = could not find container \"57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2\": container with ID starting with 57164b458989d4d8244ddafc97ba7252d707155b53164da55ab4e329d7fc49b2 not found: ID does not exist" Mar 18 16:04:38 crc kubenswrapper[4939]: I0318 16:04:38.141497 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" path="/var/lib/kubelet/pods/ecde231d-a07e-4f59-81bb-fc4608e906ea/volumes" Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.528711 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.529066 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.529419 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.529498 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.530583 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.531836 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.533052 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:38 crc kubenswrapper[4939]: E0318 16:04:38.533085 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd" Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.528116 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container 
process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.528391 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.528857 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.528887 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.531164 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.532629 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.534308 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:43 crc kubenswrapper[4939]: E0318 16:04:43.534365 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd" Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.529042 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" 
containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.529879 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.530342 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.530371 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.530394 4939 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.531749 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.532881 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 16:04:48 crc kubenswrapper[4939]: E0318 16:04:48.532927 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-56pdq" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.314141 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-56pdq_52f68996-05bc-4432-ac98-c730b09c6288/ovs-vswitchd/0.log" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.330354 4939 generic.go:334] "Generic (PLEG): container finished" podID="52f68996-05bc-4432-ac98-c730b09c6288" 
containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" exitCode=137 Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.330411 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-56pdq" event={"ID":"52f68996-05bc-4432-ac98-c730b09c6288","Type":"ContainerDied","Data":"8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c"} Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.337210 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerID="e9186d27f8cbffc2e014f48debc8242aa5db203078eb8960a71ebcd03018d88d" exitCode=137 Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.337276 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"e9186d27f8cbffc2e014f48debc8242aa5db203078eb8960a71ebcd03018d88d"} Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.337305 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ee0f94f0-d475-4921-9d83-357a8e436f33","Type":"ContainerDied","Data":"1cf432016526d7d1c3c49d62eaf38e5832ce82056b8358cb26670f44db469f5d"} Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.337315 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf432016526d7d1c3c49d62eaf38e5832ce82056b8358cb26670f44db469f5d" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.340407 4939 generic.go:334] "Generic (PLEG): container finished" podID="f757e65c-c660-4614-bb43-38b9beb092e9" containerID="abca7abb8af59dd80a723a07cc0e9834ccc7d6891529372d20c592078a41f166" exitCode=137 Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.340451 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f757e65c-c660-4614-bb43-38b9beb092e9","Type":"ContainerDied","Data":"abca7abb8af59dd80a723a07cc0e9834ccc7d6891529372d20c592078a41f166"} Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.357392 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.493177 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499011 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ee0f94f0-d475-4921-9d83-357a8e436f33\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499053 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ksvb\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-kube-api-access-6ksvb\") pod \"ee0f94f0-d475-4921-9d83-357a8e436f33\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499126 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-lock\") pod \"ee0f94f0-d475-4921-9d83-357a8e436f33\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499150 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0f94f0-d475-4921-9d83-357a8e436f33-combined-ca-bundle\") pod \"ee0f94f0-d475-4921-9d83-357a8e436f33\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499204 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") pod \"ee0f94f0-d475-4921-9d83-357a8e436f33\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499245 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-cache\") pod \"ee0f94f0-d475-4921-9d83-357a8e436f33\" (UID: \"ee0f94f0-d475-4921-9d83-357a8e436f33\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499790 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-lock" (OuterVolumeSpecName: "lock") pod "ee0f94f0-d475-4921-9d83-357a8e436f33" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.499838 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-cache" (OuterVolumeSpecName: "cache") pod "ee0f94f0-d475-4921-9d83-357a8e436f33" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.511360 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ee0f94f0-d475-4921-9d83-357a8e436f33" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.511418 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-kube-api-access-6ksvb" (OuterVolumeSpecName: "kube-api-access-6ksvb") pod "ee0f94f0-d475-4921-9d83-357a8e436f33" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33"). InnerVolumeSpecName "kube-api-access-6ksvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.511550 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "swift") pod "ee0f94f0-d475-4921-9d83-357a8e436f33" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.601585 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-scripts\") pod \"f757e65c-c660-4614-bb43-38b9beb092e9\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.601687 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f757e65c-c660-4614-bb43-38b9beb092e9-etc-machine-id\") pod \"f757e65c-c660-4614-bb43-38b9beb092e9\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.601736 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-combined-ca-bundle\") pod \"f757e65c-c660-4614-bb43-38b9beb092e9\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.601781 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data\") pod \"f757e65c-c660-4614-bb43-38b9beb092e9\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.601802 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data-custom\") pod \"f757e65c-c660-4614-bb43-38b9beb092e9\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.601859 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f75kh\" (UniqueName: \"kubernetes.io/projected/f757e65c-c660-4614-bb43-38b9beb092e9-kube-api-access-f75kh\") pod \"f757e65c-c660-4614-bb43-38b9beb092e9\" (UID: \"f757e65c-c660-4614-bb43-38b9beb092e9\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.602108 4939 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-lock\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.602119 4939 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-etc-swift\") on node \"crc\" DevicePath 
\"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.602128 4939 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ee0f94f0-d475-4921-9d83-357a8e436f33-cache\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.602146 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.602154 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ksvb\" (UniqueName: \"kubernetes.io/projected/ee0f94f0-d475-4921-9d83-357a8e436f33-kube-api-access-6ksvb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.602650 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f757e65c-c660-4614-bb43-38b9beb092e9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f757e65c-c660-4614-bb43-38b9beb092e9" (UID: "f757e65c-c660-4614-bb43-38b9beb092e9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.605974 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f757e65c-c660-4614-bb43-38b9beb092e9" (UID: "f757e65c-c660-4614-bb43-38b9beb092e9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.607238 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f757e65c-c660-4614-bb43-38b9beb092e9-kube-api-access-f75kh" (OuterVolumeSpecName: "kube-api-access-f75kh") pod "f757e65c-c660-4614-bb43-38b9beb092e9" (UID: "f757e65c-c660-4614-bb43-38b9beb092e9"). InnerVolumeSpecName "kube-api-access-f75kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.615604 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-56pdq_52f68996-05bc-4432-ac98-c730b09c6288/ovs-vswitchd/0.log" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.616791 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-56pdq" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.618456 4939 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.620751 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-scripts" (OuterVolumeSpecName: "scripts") pod "f757e65c-c660-4614-bb43-38b9beb092e9" (UID: "f757e65c-c660-4614-bb43-38b9beb092e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.658684 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f757e65c-c660-4614-bb43-38b9beb092e9" (UID: "f757e65c-c660-4614-bb43-38b9beb092e9"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.683294 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data" (OuterVolumeSpecName: "config-data") pod "f757e65c-c660-4614-bb43-38b9beb092e9" (UID: "f757e65c-c660-4614-bb43-38b9beb092e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.703923 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.703954 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.703965 4939 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.703974 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f75kh\" (UniqueName: \"kubernetes.io/projected/f757e65c-c660-4614-bb43-38b9beb092e9-kube-api-access-f75kh\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.703983 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.703991 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f757e65c-c660-4614-bb43-38b9beb092e9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.703999 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f757e65c-c660-4614-bb43-38b9beb092e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.771061 4939 scope.go:117] "RemoveContainer" containerID="10155fbad3a8bff2710db0c1b47a04c483e40e091c2f57c64500ef9c874cb014" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.780883 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0f94f0-d475-4921-9d83-357a8e436f33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee0f94f0-d475-4921-9d83-357a8e436f33" (UID: "ee0f94f0-d475-4921-9d83-357a8e436f33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.805388 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f68996-05bc-4432-ac98-c730b09c6288-scripts\") pod \"52f68996-05bc-4432-ac98-c730b09c6288\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.805855 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-etc-ovs\") pod \"52f68996-05bc-4432-ac98-c730b09c6288\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.805949 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-run\") pod \"52f68996-05bc-4432-ac98-c730b09c6288\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806082 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm5vn\" (UniqueName: \"kubernetes.io/projected/52f68996-05bc-4432-ac98-c730b09c6288-kube-api-access-nm5vn\") pod \"52f68996-05bc-4432-ac98-c730b09c6288\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806191 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-log\") pod \"52f68996-05bc-4432-ac98-c730b09c6288\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806275 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-lib\") pod \"52f68996-05bc-4432-ac98-c730b09c6288\" (UID: \"52f68996-05bc-4432-ac98-c730b09c6288\") " Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.805944 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "52f68996-05bc-4432-ac98-c730b09c6288" (UID: "52f68996-05bc-4432-ac98-c730b09c6288"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.805972 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-run" (OuterVolumeSpecName: "var-run") pod "52f68996-05bc-4432-ac98-c730b09c6288" (UID: "52f68996-05bc-4432-ac98-c730b09c6288"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806292 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-log" (OuterVolumeSpecName: "var-log") pod "52f68996-05bc-4432-ac98-c730b09c6288" (UID: "52f68996-05bc-4432-ac98-c730b09c6288"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806435 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-lib" (OuterVolumeSpecName: "var-lib") pod "52f68996-05bc-4432-ac98-c730b09c6288" (UID: "52f68996-05bc-4432-ac98-c730b09c6288"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806730 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f68996-05bc-4432-ac98-c730b09c6288-scripts" (OuterVolumeSpecName: "scripts") pod "52f68996-05bc-4432-ac98-c730b09c6288" (UID: "52f68996-05bc-4432-ac98-c730b09c6288"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806860 4939 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806924 4939 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.806979 4939 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.807116 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0f94f0-d475-4921-9d83-357a8e436f33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.807180 4939 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/52f68996-05bc-4432-ac98-c730b09c6288-var-lib\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.808444 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f68996-05bc-4432-ac98-c730b09c6288-kube-api-access-nm5vn" (OuterVolumeSpecName: "kube-api-access-nm5vn") pod "52f68996-05bc-4432-ac98-c730b09c6288" (UID: "52f68996-05bc-4432-ac98-c730b09c6288"). InnerVolumeSpecName "kube-api-access-nm5vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.911342 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm5vn\" (UniqueName: \"kubernetes.io/projected/52f68996-05bc-4432-ac98-c730b09c6288-kube-api-access-nm5vn\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:50 crc kubenswrapper[4939]: I0318 16:04:50.911400 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f68996-05bc-4432-ac98-c730b09c6288-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.356107 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-56pdq_52f68996-05bc-4432-ac98-c730b09c6288/ovs-vswitchd/0.log" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.357163 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-56pdq" event={"ID":"52f68996-05bc-4432-ac98-c730b09c6288","Type":"ContainerDied","Data":"488381bd6e03e3e2e4c448211b41f2111b278f38e3ca8a61d769d6a6cf95a0f6"} Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.357226 4939 scope.go:117] "RemoveContainer" containerID="8e6194fe7eab02868746bb5b1080f2dc599f2c24a14d578c14da6eb7f7709b2c" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.357340 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-56pdq" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.362278 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.362271 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f757e65c-c660-4614-bb43-38b9beb092e9","Type":"ContainerDied","Data":"52fb719105824bb08514ff4a0a0b03b169d0157166d7cbf6420c05cfbd2025c6"} Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.362321 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.397664 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-56pdq"] Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.407567 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-56pdq"] Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.413487 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.423556 4939 scope.go:117] "RemoveContainer" containerID="de241d56b9f2ff39d7f2599967d4ad1fdb398b5cc997be0e3e1f04193a90376a" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.429304 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.435485 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.441067 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.448614 4939 scope.go:117] "RemoveContainer" containerID="d8c4a29ac08df81c707c9e1ea61ddab446599268aa2b11f172231c3581a36abf" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.477904 4939 scope.go:117] "RemoveContainer" containerID="c87e18d10fc916c7b05bb350ccbb835b08683349a3ae8b07119f660954350f76" Mar 18 16:04:51 crc kubenswrapper[4939]: I0318 16:04:51.498520 4939 scope.go:117] "RemoveContainer" containerID="abca7abb8af59dd80a723a07cc0e9834ccc7d6891529372d20c592078a41f166" Mar 18 16:04:52 crc kubenswrapper[4939]: I0318 16:04:52.148108 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f68996-05bc-4432-ac98-c730b09c6288" path="/var/lib/kubelet/pods/52f68996-05bc-4432-ac98-c730b09c6288/volumes" Mar 18 16:04:52 crc kubenswrapper[4939]: I0318 16:04:52.150092 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" path="/var/lib/kubelet/pods/ee0f94f0-d475-4921-9d83-357a8e436f33/volumes" Mar 18 16:04:52 crc kubenswrapper[4939]: I0318 16:04:52.154205 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" path="/var/lib/kubelet/pods/f757e65c-c660-4614-bb43-38b9beb092e9/volumes" Mar 18 16:04:53 crc kubenswrapper[4939]: I0318 16:04:53.687923 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:04:53 crc kubenswrapper[4939]: I0318 16:04:53.687991 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:05:23 crc kubenswrapper[4939]: I0318 16:05:23.687813 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 18 16:05:23 crc kubenswrapper[4939]: I0318 16:05:23.688290 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:05:23 crc kubenswrapper[4939]: I0318 16:05:23.688356 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:05:23 crc kubenswrapper[4939]: I0318 16:05:23.689825 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:05:23 crc kubenswrapper[4939]: I0318 16:05:23.689902 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" gracePeriod=600 Mar 18 16:05:24 crc kubenswrapper[4939]: E0318 16:05:24.376700 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:05:24 crc kubenswrapper[4939]: I0318 16:05:24.649301 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" exitCode=0 Mar 18 16:05:24 crc kubenswrapper[4939]: I0318 16:05:24.649361 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd"} Mar 18 16:05:24 crc kubenswrapper[4939]: I0318 16:05:24.649406 4939 scope.go:117] "RemoveContainer" containerID="4fff46d36c1bddb51dfc726a35d19e477c47083cb08f0289fbf97c6d0f0baa61" Mar 18 16:05:24 crc kubenswrapper[4939]: I0318 16:05:24.650191 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:05:24 crc kubenswrapper[4939]: E0318 16:05:24.650745 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:05:38 crc kubenswrapper[4939]: I0318 16:05:38.133947 4939 scope.go:117] "RemoveContainer" 
containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:05:38 crc kubenswrapper[4939]: E0318 16:05:38.134822 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.132433 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:05:51 crc kubenswrapper[4939]: E0318 16:05:51.133293 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.536762 4939 scope.go:117] "RemoveContainer" containerID="e42031844990c3cfe18c36afa3752df80cd853511882f8b3d46df4d8caa9ab69" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.585195 4939 scope.go:117] "RemoveContainer" containerID="5a028acbdf9be78d30ad8dac81f37e1a2ea94ee44f9dbb1a9a9f4685f2f7d586" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.606827 4939 scope.go:117] "RemoveContainer" containerID="678d60bf8a444beedc86a2d299dd661f7fae4101a43640068f26d3d469a7b2aa" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.634249 4939 scope.go:117] "RemoveContainer" containerID="4f93c5de887966d79cfc98a6c6b42da2198b27db1996aa7a8e956fa8ad27c205" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.661488 4939 scope.go:117] "RemoveContainer" containerID="e0e971a7aa95c06696e32b2e72dbdcc749d30e39e71cbf719fef8099c56f0502" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.698415 4939 scope.go:117] "RemoveContainer" containerID="2fb46578d3eb85ceda48ac95f0b4ee5515773eab56143cabd3f30c0f08b2042e" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.722325 4939 scope.go:117] "RemoveContainer" containerID="79931f9e3df7c056e097bbb2e2a50a13aac8f8b97f4469add924017b9d7aecda" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.746457 4939 scope.go:117] "RemoveContainer" containerID="a657786e9cd27e7663fd8b3a78dd98fccda900df892a6912244ea73650607bf3" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.771178 4939 scope.go:117] "RemoveContainer" containerID="d54c7d302d6af9bea4dac8164b0ce249c3aa366a90bdd01e9a0627f62c76b69d" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.794688 4939 scope.go:117] "RemoveContainer" containerID="5c740b8f48083a205dbc78e36790fcc31d2e6d6d2bd6b7443972daeffc991a8d" Mar 18 16:05:51 crc kubenswrapper[4939]: I0318 16:05:51.812552 4939 scope.go:117] "RemoveContainer" containerID="91a292f6fe7a26c5c29f8f381a1c70bc1a4b7445389fcb1c53cbb76f807c045d" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.151345 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564166-vfvwz"] Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.151892 4939 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerName="mariadb-account-create-update" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.151907 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerName="mariadb-account-create-update" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.151920 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-metadata" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.151928 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-metadata" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.151949 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-expirer" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.151957 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-expirer" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.151975 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-server" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.151983 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-server" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.151997 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152005 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152019 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-log" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152028 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-log" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152038 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b02f38-8ae3-4a43-8df6-370521328921" containerName="mysql-bootstrap" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152046 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b02f38-8ae3-4a43-8df6-370521328921" containerName="mysql-bootstrap" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152060 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerName="setup-container" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152068 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerName="setup-container" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152078 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="probe" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152085 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="probe" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152101 4939 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-notification-agent" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152109 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-notification-agent" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152119 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-server" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152128 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-server" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152139 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-reaper" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152146 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-reaper" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152157 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5203f87-b63b-45f5-95e3-c536406909e5" containerName="nova-scheduler-scheduler" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152165 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5203f87-b63b-45f5-95e3-c536406909e5" containerName="nova-scheduler-scheduler" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152178 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-httpd" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152186 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-httpd" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152197 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="sg-core" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152206 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="sg-core" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152220 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-replicator" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152228 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-replicator" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152242 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="ovn-northd" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152250 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="ovn-northd" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152261 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerName="rabbitmq" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152270 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerName="rabbitmq" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152278 
4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-replicator" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152286 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-replicator" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152299 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2e6985-9642-41e2-8b6f-174c96e86281" containerName="nova-cell1-conductor-conductor" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152308 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2e6985-9642-41e2-8b6f-174c96e86281" containerName="nova-cell1-conductor-conductor" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152318 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-log" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152325 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-log" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152339 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-replicator" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152347 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-replicator" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152357 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-httpd" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152364 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-httpd" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152378 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b02f38-8ae3-4a43-8df6-370521328921" containerName="galera" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152387 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b02f38-8ae3-4a43-8df6-370521328921" containerName="galera" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152398 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="proxy-httpd" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152406 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="proxy-httpd" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152420 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-log" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152428 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-log" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152443 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="rsync" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152451 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="rsync" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152460 
4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener-log" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152470 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener-log" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152480 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api-log" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152488 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api-log" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152524 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-auditor" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152533 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-auditor" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152547 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-updater" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152555 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-updater" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152566 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152574 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152588 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152596 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152607 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152614 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152626 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerName="rabbitmq" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152634 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerName="rabbitmq" Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152643 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker-log" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152651 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker-log" 
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152661 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152669 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152685 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server-init"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152693 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server-init"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152704 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-auditor"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152711 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-auditor"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152723 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-server"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152731 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-server"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152743 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152751 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152761 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-central-agent"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152770 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-central-agent"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152782 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="swift-recon-cron"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152790 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="swift-recon-cron"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152799 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="878180f2-988b-4d66-aaf0-3429900f5e77" containerName="keystone-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152807 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="878180f2-988b-4d66-aaf0-3429900f5e77" containerName="keystone-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152816 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="openstack-network-exporter"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152824 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="openstack-network-exporter"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152836 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="cinder-scheduler"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152844 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="cinder-scheduler"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152854 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152862 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152879 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152913 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152923 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658fa1ab-1e7e-42d2-947e-6c74215e15f0" containerName="memcached"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152946 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="658fa1ab-1e7e-42d2-947e-6c74215e15f0" containerName="memcached"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152955 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152963 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152975 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.152983 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.152996 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerName="setup-container"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153005 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerName="setup-container"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.153019 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153028 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.153037 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153045 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.153059 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-auditor"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153068 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-auditor"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.153086 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153099 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.153124 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b567b0-935e-46f6-8cf7-3c8a9040bad4" containerName="kube-state-metrics"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153135 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b567b0-935e-46f6-8cf7-3c8a9040bad4" containerName="kube-state-metrics"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.153153 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-updater"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153166 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-updater"
Mar 18 16:06:00 crc kubenswrapper[4939]: E0318 16:06:00.153176 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-httpd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153187 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-httpd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153430 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3941f5-14fb-4ed6-a715-d4b99cb0961c" containerName="ovn-controller"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153452 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerName="mariadb-account-create-update"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153465 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-replicator"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153476 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153488 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153523 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-updater"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153534 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-notification-agent"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153544 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-auditor"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153559 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="openstack-network-exporter"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153575 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d850ac81-a29e-4e93-9fab-72b6325de52e" containerName="rabbitmq"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153583 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-reaper"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153595 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153606 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-server"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153637 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="rsync"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153650 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153660 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-updater"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153671 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="sg-core"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153684 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-httpd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153696 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-httpd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153711 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-auditor"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153724 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-replicator"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153734 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovsdb-server"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153744 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="container-server"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153752 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecde231d-a07e-4f59-81bb-fc4608e906ea" containerName="neutron-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153763 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="object-expirer"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153772 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce6c3d1-94f3-43ac-ba9b-5052ffa977f2" containerName="cinder-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153785 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153796 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="50baf265-a6d8-445d-aed6-853781644d9e" containerName="glance-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153814 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153832 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f68996-05bc-4432-ac98-c730b09c6288" containerName="ovs-vswitchd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153844 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="probe"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153856 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f60b5c-7d32-4fea-b3ca-a8132f3ed026" containerName="rabbitmq"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153866 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153886 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b567b0-935e-46f6-8cf7-3c8a9040bad4" containerName="kube-state-metrics"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153898 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-replicator"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153912 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-auditor"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153921 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="878180f2-988b-4d66-aaf0-3429900f5e77" containerName="keystone-api"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153929 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7cba1f-8d56-47c9-8016-3184a1374386" containerName="placement-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153941 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="account-server"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153950 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d02491-90d4-41b4-884d-0959feb366b0" containerName="nova-api-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153962 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-httpd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153970 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f757e65c-c660-4614-bb43-38b9beb092e9" containerName="cinder-scheduler"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153981 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="658fa1ab-1e7e-42d2-947e-6c74215e15f0" containerName="memcached"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.153994 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="proxy-httpd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154001 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0f94f0-d475-4921-9d83-357a8e436f33" containerName="swift-recon-cron"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154014 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5203f87-b63b-45f5-95e3-c536406909e5" containerName="nova-scheduler-scheduler"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154026 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="86474b5e-6fc8-4810-a083-699878062ade" containerName="barbican-worker"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154035 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c83b398-2fa8-4862-a2fe-6f66e3200216" containerName="barbican-keystone-listener"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154045 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b02f38-8ae3-4a43-8df6-370521328921" containerName="galera"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154055 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6d7c3f-1c09-4bbd-8de1-df304376c198" containerName="ceilometer-central-agent"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154068 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c2e6985-9642-41e2-8b6f-174c96e86281" containerName="nova-cell1-conductor-conductor"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154079 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71cb7a9-1ab5-4596-901f-314dcfae2bc4" containerName="nova-metadata-metadata"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154091 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb92b15e-a854-4505-97e2-37e4a7b821b4" containerName="glance-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154105 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="18740c60-7bc8-4daa-a426-1aa624b7ac8a" containerName="ovn-northd"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154118 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="42a70df8-1617-448d-9495-5aa55d8b97fb" containerName="barbican-api-log"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.154705 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-vfvwz"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.158237 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.158267 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.159344 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-vfvwz"]
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.159527 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf2wb\" (UniqueName: \"kubernetes.io/projected/3905c518-e10e-4e70-b6f7-c83165de3204-kube-api-access-jf2wb\") pod \"auto-csr-approver-29564166-vfvwz\" (UID: \"3905c518-e10e-4e70-b6f7-c83165de3204\") " pod="openshift-infra/auto-csr-approver-29564166-vfvwz"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.160814 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.260575 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf2wb\" (UniqueName: \"kubernetes.io/projected/3905c518-e10e-4e70-b6f7-c83165de3204-kube-api-access-jf2wb\") pod \"auto-csr-approver-29564166-vfvwz\" (UID: \"3905c518-e10e-4e70-b6f7-c83165de3204\") " pod="openshift-infra/auto-csr-approver-29564166-vfvwz"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.281750 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf2wb\" (UniqueName: \"kubernetes.io/projected/3905c518-e10e-4e70-b6f7-c83165de3204-kube-api-access-jf2wb\") pod \"auto-csr-approver-29564166-vfvwz\" (UID: \"3905c518-e10e-4e70-b6f7-c83165de3204\") " pod="openshift-infra/auto-csr-approver-29564166-vfvwz"
Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.481797 4939 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-vfvwz" Mar 18 16:06:00 crc kubenswrapper[4939]: I0318 16:06:00.902977 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-vfvwz"] Mar 18 16:06:01 crc kubenswrapper[4939]: I0318 16:06:01.038398 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-vfvwz" event={"ID":"3905c518-e10e-4e70-b6f7-c83165de3204","Type":"ContainerStarted","Data":"dbeebe3358d5c2917a539a9c02516d424173e060deb618e1c0750360ce18b8a1"} Mar 18 16:06:02 crc kubenswrapper[4939]: I0318 16:06:02.133242 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:06:02 crc kubenswrapper[4939]: E0318 16:06:02.133765 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:06:03 crc kubenswrapper[4939]: I0318 16:06:03.055338 4939 generic.go:334] "Generic (PLEG): container finished" podID="3905c518-e10e-4e70-b6f7-c83165de3204" containerID="dcd6ef28fe896422064f4f9596a864968af451ec67660b6e712ff3edb94f6d41" exitCode=0 Mar 18 16:06:03 crc kubenswrapper[4939]: I0318 16:06:03.055390 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-vfvwz" event={"ID":"3905c518-e10e-4e70-b6f7-c83165de3204","Type":"ContainerDied","Data":"dcd6ef28fe896422064f4f9596a864968af451ec67660b6e712ff3edb94f6d41"} Mar 18 16:06:04 crc kubenswrapper[4939]: I0318 16:06:04.312596 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-vfvwz" Mar 18 16:06:04 crc kubenswrapper[4939]: I0318 16:06:04.419417 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf2wb\" (UniqueName: \"kubernetes.io/projected/3905c518-e10e-4e70-b6f7-c83165de3204-kube-api-access-jf2wb\") pod \"3905c518-e10e-4e70-b6f7-c83165de3204\" (UID: \"3905c518-e10e-4e70-b6f7-c83165de3204\") " Mar 18 16:06:04 crc kubenswrapper[4939]: I0318 16:06:04.423855 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3905c518-e10e-4e70-b6f7-c83165de3204-kube-api-access-jf2wb" (OuterVolumeSpecName: "kube-api-access-jf2wb") pod "3905c518-e10e-4e70-b6f7-c83165de3204" (UID: "3905c518-e10e-4e70-b6f7-c83165de3204"). InnerVolumeSpecName "kube-api-access-jf2wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:06:04 crc kubenswrapper[4939]: I0318 16:06:04.521951 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf2wb\" (UniqueName: \"kubernetes.io/projected/3905c518-e10e-4e70-b6f7-c83165de3204-kube-api-access-jf2wb\") on node \"crc\" DevicePath \"\"" Mar 18 16:06:05 crc kubenswrapper[4939]: I0318 16:06:05.079714 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-vfvwz" event={"ID":"3905c518-e10e-4e70-b6f7-c83165de3204","Type":"ContainerDied","Data":"dbeebe3358d5c2917a539a9c02516d424173e060deb618e1c0750360ce18b8a1"} Mar 18 16:06:05 crc kubenswrapper[4939]: I0318 16:06:05.079763 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbeebe3358d5c2917a539a9c02516d424173e060deb618e1c0750360ce18b8a1" Mar 18 16:06:05 crc kubenswrapper[4939]: I0318 16:06:05.079806 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-vfvwz" Mar 18 16:06:05 crc kubenswrapper[4939]: I0318 16:06:05.395766 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-k2bl4"] Mar 18 16:06:05 crc kubenswrapper[4939]: I0318 16:06:05.401994 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-k2bl4"] Mar 18 16:06:06 crc kubenswrapper[4939]: I0318 16:06:06.141385 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f623300-216b-4e06-88bc-9e7443e5bd62" path="/var/lib/kubelet/pods/9f623300-216b-4e06-88bc-9e7443e5bd62/volumes" Mar 18 16:06:15 crc kubenswrapper[4939]: I0318 16:06:15.133884 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:06:15 crc kubenswrapper[4939]: E0318 16:06:15.134744 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:06:28 crc kubenswrapper[4939]: I0318 16:06:28.133776 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:06:28 crc kubenswrapper[4939]: E0318 16:06:28.134790 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:06:41 crc kubenswrapper[4939]: I0318 16:06:41.133680 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:06:41 crc kubenswrapper[4939]: E0318 16:06:41.134398 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:06:51 crc kubenswrapper[4939]: I0318 16:06:51.958983 4939 scope.go:117] "RemoveContainer" containerID="3cc0ce0b489cf01df6bd48ac72f09591afcecee614c2b58e8147daf7f614c2d1" Mar 18 16:06:51 crc kubenswrapper[4939]: I0318 16:06:51.999748 4939 scope.go:117] "RemoveContainer" containerID="66593bb867c511d8f535e9c33ce158533f67eeac1a915004bfc396a9f8eaf965" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.017953 4939 scope.go:117] "RemoveContainer" containerID="e9186d27f8cbffc2e014f48debc8242aa5db203078eb8960a71ebcd03018d88d" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.040178 4939 scope.go:117] "RemoveContainer" containerID="49a4e678ca0aa2dc3a78eb3ed7a1fd937781bddd10e5dd6b8eefa87915967bf0" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.059624 4939 scope.go:117] "RemoveContainer" containerID="eb4726099480d94bfcadfcc8ac5e8c7e0a22e0445d91b54b8bd947be6096f473" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.076656 4939 scope.go:117] "RemoveContainer" containerID="3a58bac812ff61d927ca864d82d746ab03ccf821635ee32714d7731a006b6367" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.091907 4939 scope.go:117] "RemoveContainer" containerID="ece5c49283a682d6e2a0125d0eab1ffd911ea301004194e9df8bdab25de3e56c" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.107690 4939 scope.go:117] "RemoveContainer" containerID="e1e5c6f6321fc70ca2c3f675371229a1d6d737697c3f271c25173d6a19cad769" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.151322 4939 scope.go:117] "RemoveContainer" containerID="6a1240c4afc05c1dc8d7f58258c3a1656a859c201e52d3ef0f5ba0aaf40645bd" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.176051 4939 scope.go:117] "RemoveContainer" containerID="c84b03121f75f4cf62c653101d35986e013f48bafadd206d506406493b8e9c90" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.195607 4939 scope.go:117] "RemoveContainer" containerID="d2f402963aa9055e0082d68e353fb1235dab137baf36c27cc3d912fdc911ff21" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.219419 4939 scope.go:117] "RemoveContainer" containerID="e5f58622c9811a365c2a3af21b7ebad75c3439fdc7d6e01b133e9a79046b8726" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.243543 4939 scope.go:117] "RemoveContainer" containerID="5c713ae91799f792040bf3d961ee93903e389af2820041b641974aeae0138dc0" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.258783 4939 scope.go:117] "RemoveContainer" containerID="c4f05427786ea0127178a8ea6b647e8626d443308882a093c4c79509731e6abf" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.291402 4939 scope.go:117] "RemoveContainer" containerID="57c5515f8a5b3530a17121b168d8678bb05de6d5ca4a2d702308291f1fcd2d80" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.311921 4939 scope.go:117] "RemoveContainer" containerID="4e7870f3e02bd7c6cf97984679ad07a4746c1bbcf22555811321ad8627e7b595" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.339086 4939 scope.go:117] "RemoveContainer" containerID="43de3851a996ae5fb148b392f668b55f5e52a20759062bbd85dc2119439767c6" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.355279 4939 scope.go:117] "RemoveContainer" containerID="f258992948b744aaf9d67e3c6a706143ae30356b748f5de908345ee552ac4c49" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.374922 4939 scope.go:117] "RemoveContainer" 
containerID="81945648d6ccfe19dbfc6da6c8d0e335483a55dcd4b1766b29052a5ed772cad1" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.394330 4939 scope.go:117] "RemoveContainer" containerID="8577f19335a709c20a6140281f93e100fa6126302c44206e5de996f40595a70b" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.412002 4939 scope.go:117] "RemoveContainer" containerID="ee5d5f5a29d0ff5499b96b3e87728787466fe48d24d60fe72721d0ff3f3c1528" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.433411 4939 scope.go:117] "RemoveContainer" containerID="17b37072545ad8d250412a3ed598381f9883f45412fd6cc5eb64b9e9b471819c" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.449053 4939 scope.go:117] "RemoveContainer" containerID="d04e69f4c87e17dbd5c2a71159726ed1fbba2fdd619b794a36df627a482cb171" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.465832 4939 scope.go:117] "RemoveContainer" containerID="f363212ee4f25a7395df3d0d667028ad88acc708868cd9d1bc2c2e84543c3dda" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.484894 4939 scope.go:117] "RemoveContainer" containerID="6104ab35790b188976b122aece4eff76014389afebabf2beccd24e6059569113" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.504325 4939 scope.go:117] "RemoveContainer" containerID="41b415a6382e1cdc649dd7cd5f4fb88187f2b722a477e519dd3fb62d57dedb80" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.520761 4939 scope.go:117] "RemoveContainer" containerID="6516dfe8b15001572d08172fcd22e95cd68f78a081a555cea105c5a94f42e2b7" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.541905 4939 scope.go:117] "RemoveContainer" containerID="39da98db91664b5c7f2fea44fad6849e5c9b51f8e829793975aadf713831a221" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.571677 4939 scope.go:117] "RemoveContainer" containerID="18de2267ef895322ef9b35671a21963af69c81f3a74b753b0ee5412e9a974231" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.638598 4939 scope.go:117] "RemoveContainer" containerID="907270ca149a5b20bb690bdfd59ee524001eefc89ea51cbcea166053f5e651ac" Mar 18 16:06:52 crc kubenswrapper[4939]: I0318 16:06:52.661586 4939 scope.go:117] "RemoveContainer" containerID="e808f3043f3658457d8c56cf00be5a6d8cb36f7c5acd13d041c82f6da663fc56" Mar 18 16:06:56 crc kubenswrapper[4939]: I0318 16:06:56.136402 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:06:56 crc kubenswrapper[4939]: E0318 16:06:56.136883 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:07:11 crc kubenswrapper[4939]: I0318 16:07:11.133131 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:07:11 crc kubenswrapper[4939]: E0318 16:07:11.133747 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" 
podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:07:25 crc kubenswrapper[4939]: I0318 16:07:25.133458 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:07:25 crc kubenswrapper[4939]: E0318 16:07:25.134174 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:07:37 crc kubenswrapper[4939]: I0318 16:07:37.133251 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:07:37 crc kubenswrapper[4939]: E0318 16:07:37.133969 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:07:51 crc kubenswrapper[4939]: I0318 16:07:51.133154 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:07:51 crc kubenswrapper[4939]: E0318 16:07:51.134025 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:07:52 crc kubenswrapper[4939]: I0318 16:07:52.950968 4939 scope.go:117] "RemoveContainer" containerID="54f847d820c4484409bda5c3b12fbb96a6bda9d2d952f1964c605f5a84ef467a" Mar 18 16:07:52 crc kubenswrapper[4939]: I0318 16:07:52.970404 4939 scope.go:117] "RemoveContainer" containerID="84f3f6c9cdd2155af79eaa3c0c011995f8be1962da95118acf91db1cd01c6852" Mar 18 16:07:53 crc kubenswrapper[4939]: I0318 16:07:53.003222 4939 scope.go:117] "RemoveContainer" containerID="209907d68aeaadea75ff4baa4dcf0cf4485c21079c579735529d0faeecba1e43" Mar 18 16:07:53 crc kubenswrapper[4939]: I0318 16:07:53.027252 4939 scope.go:117] "RemoveContainer" containerID="d1bbd88198e7c27fc2ae103dd4491725bc76a76487770c0862669d3280d0117a" Mar 18 16:07:53 crc kubenswrapper[4939]: I0318 16:07:53.054709 4939 scope.go:117] "RemoveContainer" containerID="fb26664c79478bc866dc4d04f4eafcb037e12fdf6ea1349c4fae4d7c7ec04ee6" Mar 18 16:07:53 crc kubenswrapper[4939]: I0318 16:07:53.117396 4939 scope.go:117] "RemoveContainer" containerID="08f1488f1748f3e79870960db845efd33610723a7e30d14a11fe46d3bd8c9767" Mar 18 16:07:53 crc kubenswrapper[4939]: I0318 16:07:53.156682 4939 scope.go:117] "RemoveContainer" containerID="703e1a3bacf7c889b4eaa030926ad758deed0401540052152182a6de21a167bc" Mar 18 16:07:53 crc kubenswrapper[4939]: I0318 16:07:53.193429 4939 scope.go:117] "RemoveContainer" containerID="6b8e2499c7541a0f9cf78895b03b9389a427404109be620d47865829c7a89f85" Mar 18 16:07:53 crc kubenswrapper[4939]: I0318 
16:07:53.247693 4939 scope.go:117] "RemoveContainer" containerID="02c119c7d1b12600535f978f2ef210454ef30c1c70e312e01eea669577780dc3" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.166815 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564168-b6rd4"] Mar 18 16:08:00 crc kubenswrapper[4939]: E0318 16:08:00.182338 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerName="mariadb-account-create-update" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.182378 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerName="mariadb-account-create-update" Mar 18 16:08:00 crc kubenswrapper[4939]: E0318 16:08:00.182438 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3905c518-e10e-4e70-b6f7-c83165de3204" containerName="oc" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.182448 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3905c518-e10e-4e70-b6f7-c83165de3204" containerName="oc" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.184640 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8269f4a0-d0d4-4620-9c3e-885d453b7109" containerName="mariadb-account-create-update" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.184707 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3905c518-e10e-4e70-b6f7-c83165de3204" containerName="oc" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.185256 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-b6rd4"] Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.185344 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-b6rd4" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.191809 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.191981 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.192099 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.334139 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlz49\" (UniqueName: \"kubernetes.io/projected/fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136-kube-api-access-qlz49\") pod \"auto-csr-approver-29564168-b6rd4\" (UID: \"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136\") " pod="openshift-infra/auto-csr-approver-29564168-b6rd4" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.435451 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlz49\" (UniqueName: \"kubernetes.io/projected/fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136-kube-api-access-qlz49\") pod \"auto-csr-approver-29564168-b6rd4\" (UID: \"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136\") " pod="openshift-infra/auto-csr-approver-29564168-b6rd4" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.472893 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlz49\" (UniqueName: \"kubernetes.io/projected/fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136-kube-api-access-qlz49\") pod \"auto-csr-approver-29564168-b6rd4\" (UID: \"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136\") " pod="openshift-infra/auto-csr-approver-29564168-b6rd4" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.503012 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-b6rd4" Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.933289 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-b6rd4"] Mar 18 16:08:00 crc kubenswrapper[4939]: I0318 16:08:00.939213 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:08:01 crc kubenswrapper[4939]: I0318 16:08:01.103298 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-b6rd4" event={"ID":"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136","Type":"ContainerStarted","Data":"f98392425838c08ff05405ac577dfba11841f7549c039fbe20bc9f0ecc6befa0"} Mar 18 16:08:03 crc kubenswrapper[4939]: I0318 16:08:03.121248 4939 generic.go:334] "Generic (PLEG): container finished" podID="fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136" containerID="37ddddb18f6e38bc19d18b8fa4143aec1c152c4a4a029d551a965aef21fb6822" exitCode=0 Mar 18 16:08:03 crc kubenswrapper[4939]: I0318 16:08:03.121297 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-b6rd4" event={"ID":"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136","Type":"ContainerDied","Data":"37ddddb18f6e38bc19d18b8fa4143aec1c152c4a4a029d551a965aef21fb6822"} Mar 18 16:08:04 crc kubenswrapper[4939]: I0318 16:08:04.379181 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-b6rd4" Mar 18 16:08:04 crc kubenswrapper[4939]: I0318 16:08:04.392831 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlz49\" (UniqueName: \"kubernetes.io/projected/fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136-kube-api-access-qlz49\") pod \"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136\" (UID: \"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136\") " Mar 18 16:08:04 crc kubenswrapper[4939]: I0318 16:08:04.397439 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136-kube-api-access-qlz49" (OuterVolumeSpecName: "kube-api-access-qlz49") pod "fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136" (UID: "fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136"). InnerVolumeSpecName "kube-api-access-qlz49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:08:04 crc kubenswrapper[4939]: I0318 16:08:04.494413 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlz49\" (UniqueName: \"kubernetes.io/projected/fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136-kube-api-access-qlz49\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:05 crc kubenswrapper[4939]: I0318 16:08:05.132587 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:08:05 crc kubenswrapper[4939]: E0318 16:08:05.132905 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:08:05 crc kubenswrapper[4939]: I0318 16:08:05.135853 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-b6rd4" event={"ID":"fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136","Type":"ContainerDied","Data":"f98392425838c08ff05405ac577dfba11841f7549c039fbe20bc9f0ecc6befa0"} Mar 18 16:08:05 crc kubenswrapper[4939]: I0318 16:08:05.135900 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f98392425838c08ff05405ac577dfba11841f7549c039fbe20bc9f0ecc6befa0" Mar 18 16:08:05 crc kubenswrapper[4939]: I0318 16:08:05.135916 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-b6rd4" Mar 18 16:08:05 crc kubenswrapper[4939]: I0318 16:08:05.456017 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-cz5hj"] Mar 18 16:08:05 crc kubenswrapper[4939]: I0318 16:08:05.464575 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-cz5hj"] Mar 18 16:08:06 crc kubenswrapper[4939]: I0318 16:08:06.142055 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a582e2-2e9d-4499-8c53-04b7d5de9704" path="/var/lib/kubelet/pods/f4a582e2-2e9d-4499-8c53-04b7d5de9704/volumes" Mar 18 16:08:18 crc kubenswrapper[4939]: I0318 16:08:18.133764 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:08:18 crc kubenswrapper[4939]: E0318 16:08:18.134377 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:08:30 crc kubenswrapper[4939]: I0318 16:08:30.134545 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:08:30 crc kubenswrapper[4939]: E0318 16:08:30.135736 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:08:43 crc kubenswrapper[4939]: I0318 16:08:43.133341 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:08:43 crc kubenswrapper[4939]: E0318 16:08:43.134829 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:08:53 crc kubenswrapper[4939]: I0318 16:08:53.418944 4939 scope.go:117] "RemoveContainer" containerID="ffa557caefa00ed451e073426ac3014e675a404548733cc936435b5abb184d29" Mar 18 16:08:53 crc kubenswrapper[4939]: I0318 16:08:53.464113 4939 scope.go:117] "RemoveContainer" containerID="3a6be7d31af70d20f3d1ed115c27e44245c15f8fcf4f832e8ac84984eb9faac3" Mar 18 16:08:53 crc kubenswrapper[4939]: I0318 16:08:53.506094 4939 scope.go:117] "RemoveContainer" containerID="c8b177f35e38719eb0605090d30ecb234f6c72120376163fb7c98a38321a3d5c" Mar 18 16:08:53 crc kubenswrapper[4939]: I0318 16:08:53.538630 4939 scope.go:117] "RemoveContainer" containerID="807f71a2590b251e7c5b9dfc7934283a4894cf19375dc95bfe6ab5ab6f25a1f4" Mar 18 16:08:53 crc kubenswrapper[4939]: I0318 16:08:53.577100 4939 scope.go:117] "RemoveContainer" 
containerID="71000f98adb8392199a87db579844540c2369c38e83f05ef7e6212b75f62b671" Mar 18 16:08:56 crc kubenswrapper[4939]: I0318 16:08:56.136056 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:08:56 crc kubenswrapper[4939]: E0318 16:08:56.136487 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:09:09 crc kubenswrapper[4939]: I0318 16:09:09.134300 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:09:09 crc kubenswrapper[4939]: E0318 16:09:09.136380 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:09:24 crc kubenswrapper[4939]: I0318 16:09:24.134383 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:09:24 crc kubenswrapper[4939]: E0318 16:09:24.136118 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:09:36 crc kubenswrapper[4939]: I0318 16:09:36.136970 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:09:36 crc kubenswrapper[4939]: E0318 16:09:36.137697 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.710448 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-knvb7"] Mar 18 16:09:45 crc kubenswrapper[4939]: E0318 16:09:45.711087 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136" containerName="oc" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.711105 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136" containerName="oc" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.711301 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136" containerName="oc" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 
16:09:45.713695 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.719568 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knvb7"] Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.834325 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-catalog-content\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.834387 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk29c\" (UniqueName: \"kubernetes.io/projected/54dd23f1-b1a8-4632-8f2f-55570fc67c11-kube-api-access-fk29c\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.834415 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-utilities\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.935629 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-catalog-content\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.935708 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk29c\" (UniqueName: \"kubernetes.io/projected/54dd23f1-b1a8-4632-8f2f-55570fc67c11-kube-api-access-fk29c\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.935746 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-utilities\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.936181 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-catalog-content\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.936213 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-utilities\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:45 crc kubenswrapper[4939]: I0318 16:09:45.955586 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk29c\" (UniqueName: \"kubernetes.io/projected/54dd23f1-b1a8-4632-8f2f-55570fc67c11-kube-api-access-fk29c\") pod \"redhat-operators-knvb7\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:46 crc kubenswrapper[4939]: I0318 16:09:46.037252 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:09:46 crc kubenswrapper[4939]: I0318 16:09:46.500962 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knvb7"] Mar 18 16:09:46 crc kubenswrapper[4939]: I0318 16:09:46.997411 4939 generic.go:334] "Generic (PLEG): container finished" podID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerID="ac8d968e4ce2fdb80208533ea3d0738a0bb51e9d6c244e3798b7f7afa27174bb" exitCode=0 Mar 18 16:09:46 crc kubenswrapper[4939]: I0318 16:09:46.997514 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knvb7" event={"ID":"54dd23f1-b1a8-4632-8f2f-55570fc67c11","Type":"ContainerDied","Data":"ac8d968e4ce2fdb80208533ea3d0738a0bb51e9d6c244e3798b7f7afa27174bb"} Mar 18 16:09:46 crc kubenswrapper[4939]: I0318 16:09:46.998768 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knvb7" event={"ID":"54dd23f1-b1a8-4632-8f2f-55570fc67c11","Type":"ContainerStarted","Data":"803efb2f69248456f9170dcd0843f8ec48463d9590ba0a7590304f10fe2ded99"} Mar 18 16:09:48 crc kubenswrapper[4939]: I0318 16:09:48.133692 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:09:48 crc kubenswrapper[4939]: E0318 16:09:48.133928 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:09:53 crc kubenswrapper[4939]: I0318 16:09:53.667708 4939 scope.go:117] "RemoveContainer" containerID="758d17fe92530594fb55767b69458f7e5b9641b4227fd7a174da37c39963bf7e" Mar 18 16:09:58 crc kubenswrapper[4939]: I0318 16:09:58.097244 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knvb7" event={"ID":"54dd23f1-b1a8-4632-8f2f-55570fc67c11","Type":"ContainerStarted","Data":"42d05ec913c930e9d878d313bf8537f212dab0bcec329a1d9e43d7b0aa7ddb88"} Mar 18 16:09:59 crc kubenswrapper[4939]: I0318 16:09:59.108789 4939 generic.go:334] "Generic (PLEG): container finished" podID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerID="42d05ec913c930e9d878d313bf8537f212dab0bcec329a1d9e43d7b0aa7ddb88" exitCode=0 Mar 18 16:09:59 crc kubenswrapper[4939]: I0318 16:09:59.109033 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knvb7" event={"ID":"54dd23f1-b1a8-4632-8f2f-55570fc67c11","Type":"ContainerDied","Data":"42d05ec913c930e9d878d313bf8537f212dab0bcec329a1d9e43d7b0aa7ddb88"} Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.145711 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564170-6kzgx"] Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 
16:10:00.146552 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.155127 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-6kzgx"] Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.156471 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.156831 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.160899 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.294024 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8hjn\" (UniqueName: \"kubernetes.io/projected/8047f050-4538-46d5-8edf-1e21291f00f6-kube-api-access-l8hjn\") pod \"auto-csr-approver-29564170-6kzgx\" (UID: \"8047f050-4538-46d5-8edf-1e21291f00f6\") " pod="openshift-infra/auto-csr-approver-29564170-6kzgx" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.396015 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8hjn\" (UniqueName: \"kubernetes.io/projected/8047f050-4538-46d5-8edf-1e21291f00f6-kube-api-access-l8hjn\") pod \"auto-csr-approver-29564170-6kzgx\" (UID: \"8047f050-4538-46d5-8edf-1e21291f00f6\") " pod="openshift-infra/auto-csr-approver-29564170-6kzgx" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.425765 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8hjn\" (UniqueName: \"kubernetes.io/projected/8047f050-4538-46d5-8edf-1e21291f00f6-kube-api-access-l8hjn\") pod \"auto-csr-approver-29564170-6kzgx\" (UID: \"8047f050-4538-46d5-8edf-1e21291f00f6\") " pod="openshift-infra/auto-csr-approver-29564170-6kzgx" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.484458 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" Mar 18 16:10:00 crc kubenswrapper[4939]: I0318 16:10:00.944273 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-6kzgx"] Mar 18 16:10:01 crc kubenswrapper[4939]: I0318 16:10:01.132553 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" event={"ID":"8047f050-4538-46d5-8edf-1e21291f00f6","Type":"ContainerStarted","Data":"c4ea4395794e92834ce2f8872fc351ce4505c7c3c034c941eb159657fc4904ed"} Mar 18 16:10:01 crc kubenswrapper[4939]: I0318 16:10:01.136702 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knvb7" event={"ID":"54dd23f1-b1a8-4632-8f2f-55570fc67c11","Type":"ContainerStarted","Data":"52b35ec40b059307b749088c82e80ab02b8eb198b2f2974eefb2f0a45dc989ed"} Mar 18 16:10:01 crc kubenswrapper[4939]: I0318 16:10:01.164474 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-knvb7" podStartSLOduration=3.141477079 podStartE2EDuration="16.16445144s" podCreationTimestamp="2026-03-18 16:09:45 +0000 UTC" firstStartedPulling="2026-03-18 16:09:46.99931979 +0000 UTC m=+1951.598507411" lastFinishedPulling="2026-03-18 16:10:00.022294151 +0000 UTC m=+1964.621481772" observedRunningTime="2026-03-18 16:10:01.15646273 +0000 UTC m=+1965.755650351" watchObservedRunningTime="2026-03-18 16:10:01.16445144 +0000 UTC m=+1965.763639061" Mar 18 16:10:02 crc kubenswrapper[4939]: I0318 16:10:02.133451 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:10:02 crc kubenswrapper[4939]: E0318 16:10:02.133704 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:10:06 crc kubenswrapper[4939]: I0318 16:10:06.045697 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:10:06 crc kubenswrapper[4939]: I0318 16:10:06.046316 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:10:07 crc kubenswrapper[4939]: I0318 16:10:07.103804 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knvb7" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="registry-server" probeResult="failure" output=< Mar 18 16:10:07 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 16:10:07 crc kubenswrapper[4939]: > Mar 18 16:10:07 crc kubenswrapper[4939]: I0318 16:10:07.186229 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" event={"ID":"8047f050-4538-46d5-8edf-1e21291f00f6","Type":"ContainerStarted","Data":"e526954ada3cbb3868f69a563ae33b84ea3f32455dcbfcf18a76ce58ec48c5c4"} Mar 18 16:10:09 crc kubenswrapper[4939]: I0318 16:10:09.216448 4939 generic.go:334] "Generic (PLEG): container finished" podID="8047f050-4538-46d5-8edf-1e21291f00f6" 
containerID="e526954ada3cbb3868f69a563ae33b84ea3f32455dcbfcf18a76ce58ec48c5c4" exitCode=0 Mar 18 16:10:09 crc kubenswrapper[4939]: I0318 16:10:09.216693 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" event={"ID":"8047f050-4538-46d5-8edf-1e21291f00f6","Type":"ContainerDied","Data":"e526954ada3cbb3868f69a563ae33b84ea3f32455dcbfcf18a76ce58ec48c5c4"} Mar 18 16:10:10 crc kubenswrapper[4939]: I0318 16:10:10.507299 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" Mar 18 16:10:10 crc kubenswrapper[4939]: I0318 16:10:10.607579 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8hjn\" (UniqueName: \"kubernetes.io/projected/8047f050-4538-46d5-8edf-1e21291f00f6-kube-api-access-l8hjn\") pod \"8047f050-4538-46d5-8edf-1e21291f00f6\" (UID: \"8047f050-4538-46d5-8edf-1e21291f00f6\") " Mar 18 16:10:10 crc kubenswrapper[4939]: I0318 16:10:10.613739 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8047f050-4538-46d5-8edf-1e21291f00f6-kube-api-access-l8hjn" (OuterVolumeSpecName: "kube-api-access-l8hjn") pod "8047f050-4538-46d5-8edf-1e21291f00f6" (UID: "8047f050-4538-46d5-8edf-1e21291f00f6"). InnerVolumeSpecName "kube-api-access-l8hjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:10 crc kubenswrapper[4939]: I0318 16:10:10.709440 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8hjn\" (UniqueName: \"kubernetes.io/projected/8047f050-4538-46d5-8edf-1e21291f00f6-kube-api-access-l8hjn\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:11 crc kubenswrapper[4939]: I0318 16:10:11.232398 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" event={"ID":"8047f050-4538-46d5-8edf-1e21291f00f6","Type":"ContainerDied","Data":"c4ea4395794e92834ce2f8872fc351ce4505c7c3c034c941eb159657fc4904ed"} Mar 18 16:10:11 crc kubenswrapper[4939]: I0318 16:10:11.232451 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4ea4395794e92834ce2f8872fc351ce4505c7c3c034c941eb159657fc4904ed" Mar 18 16:10:11 crc kubenswrapper[4939]: I0318 16:10:11.232469 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-6kzgx" Mar 18 16:10:11 crc kubenswrapper[4939]: I0318 16:10:11.573816 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-8ldlb"] Mar 18 16:10:11 crc kubenswrapper[4939]: I0318 16:10:11.579485 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-8ldlb"] Mar 18 16:10:12 crc kubenswrapper[4939]: I0318 16:10:12.140630 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c790a937-6999-4aef-adb0-d7ff057c7e03" path="/var/lib/kubelet/pods/c790a937-6999-4aef-adb0-d7ff057c7e03/volumes" Mar 18 16:10:16 crc kubenswrapper[4939]: I0318 16:10:16.166489 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:10:16 crc kubenswrapper[4939]: I0318 16:10:16.211935 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:10:17 crc kubenswrapper[4939]: I0318 16:10:17.005636 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knvb7"] Mar 18 16:10:17 crc kubenswrapper[4939]: I0318 16:10:17.046464 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r86xf"] Mar 18 16:10:17 crc kubenswrapper[4939]: I0318 16:10:17.046806 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r86xf" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="registry-server" containerID="cri-o://aac46ba8f8dbb47c1ec077b7f7afe2a19dd05cbe13287723809b110361994f0d" gracePeriod=2 Mar 18 16:10:17 crc kubenswrapper[4939]: I0318 16:10:17.132806 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:10:17 crc kubenswrapper[4939]: E0318 16:10:17.133043 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:10:17 crc kubenswrapper[4939]: I0318 16:10:17.308739 4939 generic.go:334] "Generic (PLEG): container finished" podID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerID="aac46ba8f8dbb47c1ec077b7f7afe2a19dd05cbe13287723809b110361994f0d" exitCode=0 Mar 18 16:10:17 crc kubenswrapper[4939]: I0318 16:10:17.309484 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r86xf" event={"ID":"da34fc0b-1bbf-40ea-aa12-536963bcac3a","Type":"ContainerDied","Data":"aac46ba8f8dbb47c1ec077b7f7afe2a19dd05cbe13287723809b110361994f0d"} Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.405172 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.557175 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-utilities\") pod \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.557220 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6qms\" (UniqueName: \"kubernetes.io/projected/da34fc0b-1bbf-40ea-aa12-536963bcac3a-kube-api-access-t6qms\") pod \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.557262 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-catalog-content\") pod \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\" (UID: \"da34fc0b-1bbf-40ea-aa12-536963bcac3a\") " Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.557799 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-utilities" (OuterVolumeSpecName: "utilities") pod "da34fc0b-1bbf-40ea-aa12-536963bcac3a" (UID: "da34fc0b-1bbf-40ea-aa12-536963bcac3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.576357 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da34fc0b-1bbf-40ea-aa12-536963bcac3a-kube-api-access-t6qms" (OuterVolumeSpecName: "kube-api-access-t6qms") pod "da34fc0b-1bbf-40ea-aa12-536963bcac3a" (UID: "da34fc0b-1bbf-40ea-aa12-536963bcac3a"). InnerVolumeSpecName "kube-api-access-t6qms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.659433 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.659482 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6qms\" (UniqueName: \"kubernetes.io/projected/da34fc0b-1bbf-40ea-aa12-536963bcac3a-kube-api-access-t6qms\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.674154 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da34fc0b-1bbf-40ea-aa12-536963bcac3a" (UID: "da34fc0b-1bbf-40ea-aa12-536963bcac3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:10:18 crc kubenswrapper[4939]: I0318 16:10:18.760457 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da34fc0b-1bbf-40ea-aa12-536963bcac3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:19 crc kubenswrapper[4939]: I0318 16:10:19.326024 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r86xf" event={"ID":"da34fc0b-1bbf-40ea-aa12-536963bcac3a","Type":"ContainerDied","Data":"b25b784dba148374c46c7dfe9ddedc09c48a138a24383360ea3b3c6d05e12df3"} Mar 18 16:10:19 crc kubenswrapper[4939]: I0318 16:10:19.326391 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r86xf" Mar 18 16:10:19 crc kubenswrapper[4939]: I0318 16:10:19.326425 4939 scope.go:117] "RemoveContainer" containerID="aac46ba8f8dbb47c1ec077b7f7afe2a19dd05cbe13287723809b110361994f0d" Mar 18 16:10:19 crc kubenswrapper[4939]: I0318 16:10:19.361450 4939 scope.go:117] "RemoveContainer" containerID="2958f486e59c88a81b55051d15a9fdeaf1fb6b4f4b2635c9ad6b3647671e82eb" Mar 18 16:10:19 crc kubenswrapper[4939]: I0318 16:10:19.362256 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r86xf"] Mar 18 16:10:19 crc kubenswrapper[4939]: I0318 16:10:19.370059 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r86xf"] Mar 18 16:10:19 crc kubenswrapper[4939]: I0318 16:10:19.403728 4939 scope.go:117] "RemoveContainer" containerID="fa3f687c31e512fe3b23789960ec8fa000d2d83b2140a9a744d079de5cabdbae" Mar 18 16:10:20 crc kubenswrapper[4939]: I0318 16:10:20.141055 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" path="/var/lib/kubelet/pods/da34fc0b-1bbf-40ea-aa12-536963bcac3a/volumes" Mar 18 16:10:31 crc kubenswrapper[4939]: I0318 16:10:31.133622 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:10:31 crc kubenswrapper[4939]: I0318 16:10:31.404436 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"61d99ce942c1cf400a71b7c7d09eda17e0a71591a320fa898cb74dcff5232a22"} Mar 18 16:10:57 crc kubenswrapper[4939]: I0318 16:10:57.455716 4939 scope.go:117] "RemoveContainer" containerID="3d6d83ff55d826e4e02a854aa36ac39355d6aaebfc2ea2f89b42123e1185987e" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.150361 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564172-7tqck"] Mar 18 16:12:00 crc kubenswrapper[4939]: E0318 16:12:00.151242 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="registry-server" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.151261 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="registry-server" Mar 18 16:12:00 crc kubenswrapper[4939]: E0318 16:12:00.151289 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="extract-content" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.151297 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="extract-content" Mar 18 16:12:00 crc kubenswrapper[4939]: E0318 16:12:00.151311 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="extract-utilities" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.151319 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="extract-utilities" Mar 18 16:12:00 crc kubenswrapper[4939]: E0318 16:12:00.151333 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8047f050-4538-46d5-8edf-1e21291f00f6" containerName="oc" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.151340 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8047f050-4538-46d5-8edf-1e21291f00f6" containerName="oc" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.151544 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="da34fc0b-1bbf-40ea-aa12-536963bcac3a" containerName="registry-server" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.151559 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8047f050-4538-46d5-8edf-1e21291f00f6" containerName="oc" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.152157 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-7tqck" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.156117 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.156430 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.156730 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.167405 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-7tqck"] Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.258273 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhncs\" (UniqueName: \"kubernetes.io/projected/b37bd540-d47b-4bb3-8a9d-ec7e8e234354-kube-api-access-dhncs\") pod \"auto-csr-approver-29564172-7tqck\" (UID: \"b37bd540-d47b-4bb3-8a9d-ec7e8e234354\") " pod="openshift-infra/auto-csr-approver-29564172-7tqck" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.359937 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhncs\" (UniqueName: \"kubernetes.io/projected/b37bd540-d47b-4bb3-8a9d-ec7e8e234354-kube-api-access-dhncs\") pod \"auto-csr-approver-29564172-7tqck\" (UID: \"b37bd540-d47b-4bb3-8a9d-ec7e8e234354\") " pod="openshift-infra/auto-csr-approver-29564172-7tqck" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.382973 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhncs\" (UniqueName: \"kubernetes.io/projected/b37bd540-d47b-4bb3-8a9d-ec7e8e234354-kube-api-access-dhncs\") pod \"auto-csr-approver-29564172-7tqck\" (UID: \"b37bd540-d47b-4bb3-8a9d-ec7e8e234354\") " pod="openshift-infra/auto-csr-approver-29564172-7tqck" Mar 18 16:12:00 crc kubenswrapper[4939]: I0318 16:12:00.475283 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-7tqck" Mar 18 16:12:01 crc kubenswrapper[4939]: I0318 16:12:01.031408 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-7tqck"] Mar 18 16:12:01 crc kubenswrapper[4939]: I0318 16:12:01.609007 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-7tqck" event={"ID":"b37bd540-d47b-4bb3-8a9d-ec7e8e234354","Type":"ContainerStarted","Data":"26b424d8d8c376807c0c99622ce8eea9b67c44fde6fdc7ac6b1839607de38460"} Mar 18 16:12:02 crc kubenswrapper[4939]: I0318 16:12:02.618908 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-7tqck" event={"ID":"b37bd540-d47b-4bb3-8a9d-ec7e8e234354","Type":"ContainerStarted","Data":"53c9df5db830cbac1dc2767ba4d01f9886c8d0977d8b633ead00a8b418623d49"} Mar 18 16:12:02 crc kubenswrapper[4939]: I0318 16:12:02.642356 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564172-7tqck" podStartSLOduration=1.5029350620000002 podStartE2EDuration="2.642312508s" podCreationTimestamp="2026-03-18 16:12:00 +0000 UTC" firstStartedPulling="2026-03-18 16:12:01.038231562 +0000 UTC m=+2085.637419183" lastFinishedPulling="2026-03-18 16:12:02.177608978 +0000 UTC m=+2086.776796629" observedRunningTime="2026-03-18 16:12:02.639390241 +0000 UTC m=+2087.238577862" watchObservedRunningTime="2026-03-18 16:12:02.642312508 +0000 UTC m=+2087.241500129" Mar 18 16:12:03 crc kubenswrapper[4939]: I0318 16:12:03.627468 4939 generic.go:334] "Generic (PLEG): container finished" podID="b37bd540-d47b-4bb3-8a9d-ec7e8e234354" containerID="53c9df5db830cbac1dc2767ba4d01f9886c8d0977d8b633ead00a8b418623d49" exitCode=0 Mar 18 16:12:03 crc kubenswrapper[4939]: I0318 16:12:03.627534 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-7tqck" event={"ID":"b37bd540-d47b-4bb3-8a9d-ec7e8e234354","Type":"ContainerDied","Data":"53c9df5db830cbac1dc2767ba4d01f9886c8d0977d8b633ead00a8b418623d49"} Mar 18 16:12:04 crc kubenswrapper[4939]: I0318 16:12:04.967232 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-7tqck" Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.057162 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhncs\" (UniqueName: \"kubernetes.io/projected/b37bd540-d47b-4bb3-8a9d-ec7e8e234354-kube-api-access-dhncs\") pod \"b37bd540-d47b-4bb3-8a9d-ec7e8e234354\" (UID: \"b37bd540-d47b-4bb3-8a9d-ec7e8e234354\") " Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.063178 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37bd540-d47b-4bb3-8a9d-ec7e8e234354-kube-api-access-dhncs" (OuterVolumeSpecName: "kube-api-access-dhncs") pod "b37bd540-d47b-4bb3-8a9d-ec7e8e234354" (UID: "b37bd540-d47b-4bb3-8a9d-ec7e8e234354"). InnerVolumeSpecName "kube-api-access-dhncs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.158574 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhncs\" (UniqueName: \"kubernetes.io/projected/b37bd540-d47b-4bb3-8a9d-ec7e8e234354-kube-api-access-dhncs\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.641360 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-7tqck" event={"ID":"b37bd540-d47b-4bb3-8a9d-ec7e8e234354","Type":"ContainerDied","Data":"26b424d8d8c376807c0c99622ce8eea9b67c44fde6fdc7ac6b1839607de38460"} Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.641400 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26b424d8d8c376807c0c99622ce8eea9b67c44fde6fdc7ac6b1839607de38460" Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.641412 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-7tqck" Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.700431 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-vfvwz"] Mar 18 16:12:05 crc kubenswrapper[4939]: I0318 16:12:05.705363 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-vfvwz"] Mar 18 16:12:06 crc kubenswrapper[4939]: I0318 16:12:06.158236 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3905c518-e10e-4e70-b6f7-c83165de3204" path="/var/lib/kubelet/pods/3905c518-e10e-4e70-b6f7-c83165de3204/volumes" Mar 18 16:12:53 crc kubenswrapper[4939]: I0318 16:12:53.687282 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:12:53 crc kubenswrapper[4939]: I0318 16:12:53.687828 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:12:57 crc kubenswrapper[4939]: I0318 16:12:57.557231 4939 scope.go:117] "RemoveContainer" containerID="dcd6ef28fe896422064f4f9596a864968af451ec67660b6e712ff3edb94f6d41" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.160882 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h5ht5"] Mar 18 16:13:02 crc kubenswrapper[4939]: E0318 16:13:02.161603 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37bd540-d47b-4bb3-8a9d-ec7e8e234354" containerName="oc" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.161618 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37bd540-d47b-4bb3-8a9d-ec7e8e234354" containerName="oc" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.161799 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37bd540-d47b-4bb3-8a9d-ec7e8e234354" containerName="oc" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.162957 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.171789 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-catalog-content\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.171929 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7dw\" (UniqueName: \"kubernetes.io/projected/7ec77332-8c93-43a7-b631-46816c985304-kube-api-access-4d7dw\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.171967 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-utilities\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.182388 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5ht5"] Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.273440 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-catalog-content\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.273570 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7dw\" (UniqueName: \"kubernetes.io/projected/7ec77332-8c93-43a7-b631-46816c985304-kube-api-access-4d7dw\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.273599 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-utilities\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.274153 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-utilities\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.274554 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-catalog-content\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.295406 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4d7dw\" (UniqueName: \"kubernetes.io/projected/7ec77332-8c93-43a7-b631-46816c985304-kube-api-access-4d7dw\") pod \"redhat-marketplace-h5ht5\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.483906 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:02 crc kubenswrapper[4939]: I0318 16:13:02.913474 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5ht5"] Mar 18 16:13:03 crc kubenswrapper[4939]: I0318 16:13:03.026230 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5ht5" event={"ID":"7ec77332-8c93-43a7-b631-46816c985304","Type":"ContainerStarted","Data":"829cba934c4066cf0c2c3ab1a1d2799a1f899ca0680f3b42b8a4e06ec9e77cd8"} Mar 18 16:13:04 crc kubenswrapper[4939]: I0318 16:13:04.035732 4939 generic.go:334] "Generic (PLEG): container finished" podID="7ec77332-8c93-43a7-b631-46816c985304" containerID="6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2" exitCode=0 Mar 18 16:13:04 crc kubenswrapper[4939]: I0318 16:13:04.035805 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5ht5" event={"ID":"7ec77332-8c93-43a7-b631-46816c985304","Type":"ContainerDied","Data":"6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2"} Mar 18 16:13:04 crc kubenswrapper[4939]: I0318 16:13:04.038135 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:13:05 crc kubenswrapper[4939]: I0318 16:13:05.046605 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5ht5" event={"ID":"7ec77332-8c93-43a7-b631-46816c985304","Type":"ContainerStarted","Data":"0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073"} Mar 18 16:13:06 crc kubenswrapper[4939]: I0318 16:13:06.055348 4939 generic.go:334] "Generic (PLEG): container finished" podID="7ec77332-8c93-43a7-b631-46816c985304" containerID="0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073" exitCode=0 Mar 18 16:13:06 crc kubenswrapper[4939]: I0318 16:13:06.055370 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5ht5" event={"ID":"7ec77332-8c93-43a7-b631-46816c985304","Type":"ContainerDied","Data":"0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073"} Mar 18 16:13:07 crc kubenswrapper[4939]: I0318 16:13:07.066024 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5ht5" event={"ID":"7ec77332-8c93-43a7-b631-46816c985304","Type":"ContainerStarted","Data":"66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7"} Mar 18 16:13:07 crc kubenswrapper[4939]: I0318 16:13:07.089916 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h5ht5" podStartSLOduration=2.651817945 podStartE2EDuration="5.089889905s" podCreationTimestamp="2026-03-18 16:13:02 +0000 UTC" firstStartedPulling="2026-03-18 16:13:04.03785069 +0000 UTC m=+2148.637038321" lastFinishedPulling="2026-03-18 16:13:06.47592266 +0000 UTC m=+2151.075110281" observedRunningTime="2026-03-18 16:13:07.087073085 +0000 UTC m=+2151.686260706" watchObservedRunningTime="2026-03-18 16:13:07.089889905 +0000 UTC 
m=+2151.689077526" Mar 18 16:13:12 crc kubenswrapper[4939]: I0318 16:13:12.484669 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:12 crc kubenswrapper[4939]: I0318 16:13:12.485162 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:12 crc kubenswrapper[4939]: I0318 16:13:12.529939 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:13 crc kubenswrapper[4939]: I0318 16:13:13.141990 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:13 crc kubenswrapper[4939]: I0318 16:13:13.181485 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5ht5"] Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.115968 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h5ht5" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="registry-server" containerID="cri-o://66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7" gracePeriod=2 Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.523573 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.663320 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d7dw\" (UniqueName: \"kubernetes.io/projected/7ec77332-8c93-43a7-b631-46816c985304-kube-api-access-4d7dw\") pod \"7ec77332-8c93-43a7-b631-46816c985304\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.663387 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-catalog-content\") pod \"7ec77332-8c93-43a7-b631-46816c985304\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.663425 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-utilities\") pod \"7ec77332-8c93-43a7-b631-46816c985304\" (UID: \"7ec77332-8c93-43a7-b631-46816c985304\") " Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.664568 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-utilities" (OuterVolumeSpecName: "utilities") pod "7ec77332-8c93-43a7-b631-46816c985304" (UID: "7ec77332-8c93-43a7-b631-46816c985304"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.669100 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec77332-8c93-43a7-b631-46816c985304-kube-api-access-4d7dw" (OuterVolumeSpecName: "kube-api-access-4d7dw") pod "7ec77332-8c93-43a7-b631-46816c985304" (UID: "7ec77332-8c93-43a7-b631-46816c985304"). InnerVolumeSpecName "kube-api-access-4d7dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.764950 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d7dw\" (UniqueName: \"kubernetes.io/projected/7ec77332-8c93-43a7-b631-46816c985304-kube-api-access-4d7dw\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.764996 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.811689 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ec77332-8c93-43a7-b631-46816c985304" (UID: "7ec77332-8c93-43a7-b631-46816c985304"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4939]: I0318 16:13:15.866328 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec77332-8c93-43a7-b631-46816c985304-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.126390 4939 generic.go:334] "Generic (PLEG): container finished" podID="7ec77332-8c93-43a7-b631-46816c985304" containerID="66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7" exitCode=0 Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.126440 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5ht5" event={"ID":"7ec77332-8c93-43a7-b631-46816c985304","Type":"ContainerDied","Data":"66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7"} Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.126470 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5ht5" event={"ID":"7ec77332-8c93-43a7-b631-46816c985304","Type":"ContainerDied","Data":"829cba934c4066cf0c2c3ab1a1d2799a1f899ca0680f3b42b8a4e06ec9e77cd8"} Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.126491 4939 scope.go:117] "RemoveContainer" containerID="66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.126648 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5ht5" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.145129 4939 scope.go:117] "RemoveContainer" containerID="0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.160215 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5ht5"] Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.169002 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5ht5"] Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.171619 4939 scope.go:117] "RemoveContainer" containerID="6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.198775 4939 scope.go:117] "RemoveContainer" containerID="66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7" Mar 18 16:13:16 crc kubenswrapper[4939]: E0318 16:13:16.199183 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7\": container with ID starting with 66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7 not found: ID does not exist" containerID="66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.199241 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7"} err="failed to get container status \"66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7\": rpc error: code = NotFound desc = could not find container \"66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7\": container with ID starting with 66f7a782048c692144da339ff3f49b4e5952f17dd264d9ff0c0701275eb8d3a7 not found: ID does not exist" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.199269 4939 scope.go:117] "RemoveContainer" containerID="0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073" Mar 18 16:13:16 crc kubenswrapper[4939]: E0318 16:13:16.199524 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073\": container with ID starting with 0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073 not found: ID does not exist" containerID="0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.199575 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073"} err="failed to get container status \"0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073\": rpc error: code = NotFound desc = could not find container \"0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073\": container with ID starting with 0f2e00b05714febcea1838c5cf90af3179737788cd23031b2d69cc4e7b88f073 not found: ID does not exist" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.199608 4939 scope.go:117] "RemoveContainer" containerID="6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2" Mar 18 16:13:16 crc kubenswrapper[4939]: E0318 16:13:16.199889 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2\": container with ID starting with 6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2 not found: ID does not exist" containerID="6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2" Mar 18 16:13:16 crc kubenswrapper[4939]: I0318 16:13:16.199913 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2"} err="failed to get container status \"6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2\": rpc error: code = NotFound desc = could not find container \"6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2\": container with ID starting with 6ce7f415b85eb87713ee8dad1a3d2b6c95bff25a869d99742d136195aa1092e2 not found: ID does not exist" Mar 18 16:13:18 crc kubenswrapper[4939]: I0318 16:13:18.148431 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec77332-8c93-43a7-b631-46816c985304" path="/var/lib/kubelet/pods/7ec77332-8c93-43a7-b631-46816c985304/volumes" Mar 18 16:13:23 crc kubenswrapper[4939]: I0318 16:13:23.686970 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:13:23 crc kubenswrapper[4939]: I0318 16:13:23.687262 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:13:53 crc kubenswrapper[4939]: I0318 16:13:53.687209 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:13:53 crc kubenswrapper[4939]: I0318 16:13:53.687789 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:13:53 crc kubenswrapper[4939]: I0318 16:13:53.687838 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:13:53 crc kubenswrapper[4939]: I0318 16:13:53.688470 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61d99ce942c1cf400a71b7c7d09eda17e0a71591a320fa898cb74dcff5232a22"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:13:53 crc kubenswrapper[4939]: I0318 16:13:53.688546 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" 
podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://61d99ce942c1cf400a71b7c7d09eda17e0a71591a320fa898cb74dcff5232a22" gracePeriod=600 Mar 18 16:13:54 crc kubenswrapper[4939]: I0318 16:13:54.377207 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="61d99ce942c1cf400a71b7c7d09eda17e0a71591a320fa898cb74dcff5232a22" exitCode=0 Mar 18 16:13:54 crc kubenswrapper[4939]: I0318 16:13:54.377252 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"61d99ce942c1cf400a71b7c7d09eda17e0a71591a320fa898cb74dcff5232a22"} Mar 18 16:13:54 crc kubenswrapper[4939]: I0318 16:13:54.377480 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc"} Mar 18 16:13:54 crc kubenswrapper[4939]: I0318 16:13:54.377523 4939 scope.go:117] "RemoveContainer" containerID="53612fd971cd880b7634c8dd05050b7fcc5fb5dc64c1b72a747e8b8a64ee0dfd" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.145023 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564174-9sc8m"] Mar 18 16:14:00 crc kubenswrapper[4939]: E0318 16:14:00.145861 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="registry-server" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.145874 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="registry-server" Mar 18 16:14:00 crc kubenswrapper[4939]: E0318 16:14:00.145895 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="extract-utilities" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.145901 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="extract-utilities" Mar 18 16:14:00 crc kubenswrapper[4939]: E0318 16:14:00.145913 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="extract-content" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.145918 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="extract-content" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.146039 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec77332-8c93-43a7-b631-46816c985304" containerName="registry-server" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.146573 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-9sc8m" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.150231 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.150244 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.150324 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.153526 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-9sc8m"] Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.311432 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rv4d\" (UniqueName: \"kubernetes.io/projected/8513bca3-396b-4ae6-8934-7ecc842564c6-kube-api-access-6rv4d\") pod \"auto-csr-approver-29564174-9sc8m\" (UID: \"8513bca3-396b-4ae6-8934-7ecc842564c6\") " pod="openshift-infra/auto-csr-approver-29564174-9sc8m" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.413190 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rv4d\" (UniqueName: \"kubernetes.io/projected/8513bca3-396b-4ae6-8934-7ecc842564c6-kube-api-access-6rv4d\") pod \"auto-csr-approver-29564174-9sc8m\" (UID: \"8513bca3-396b-4ae6-8934-7ecc842564c6\") " pod="openshift-infra/auto-csr-approver-29564174-9sc8m" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.435565 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rv4d\" (UniqueName: \"kubernetes.io/projected/8513bca3-396b-4ae6-8934-7ecc842564c6-kube-api-access-6rv4d\") pod \"auto-csr-approver-29564174-9sc8m\" (UID: \"8513bca3-396b-4ae6-8934-7ecc842564c6\") " pod="openshift-infra/auto-csr-approver-29564174-9sc8m" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.464935 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-9sc8m" Mar 18 16:14:00 crc kubenswrapper[4939]: I0318 16:14:00.954308 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-9sc8m"] Mar 18 16:14:01 crc kubenswrapper[4939]: I0318 16:14:01.430884 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-9sc8m" event={"ID":"8513bca3-396b-4ae6-8934-7ecc842564c6","Type":"ContainerStarted","Data":"36c79077904fef312bb58d9cd621358eee643c8f4a633ebdbf0853d9408add46"} Mar 18 16:14:03 crc kubenswrapper[4939]: I0318 16:14:03.448647 4939 generic.go:334] "Generic (PLEG): container finished" podID="8513bca3-396b-4ae6-8934-7ecc842564c6" containerID="769ccf99fd5c35c461f9f242c0301bbf9b7099a1c2ba58815ee7bd233ad8077f" exitCode=0 Mar 18 16:14:03 crc kubenswrapper[4939]: I0318 16:14:03.448750 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-9sc8m" event={"ID":"8513bca3-396b-4ae6-8934-7ecc842564c6","Type":"ContainerDied","Data":"769ccf99fd5c35c461f9f242c0301bbf9b7099a1c2ba58815ee7bd233ad8077f"} Mar 18 16:14:04 crc kubenswrapper[4939]: I0318 16:14:04.698866 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-9sc8m" Mar 18 16:14:04 crc kubenswrapper[4939]: I0318 16:14:04.770831 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rv4d\" (UniqueName: \"kubernetes.io/projected/8513bca3-396b-4ae6-8934-7ecc842564c6-kube-api-access-6rv4d\") pod \"8513bca3-396b-4ae6-8934-7ecc842564c6\" (UID: \"8513bca3-396b-4ae6-8934-7ecc842564c6\") " Mar 18 16:14:04 crc kubenswrapper[4939]: I0318 16:14:04.778089 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8513bca3-396b-4ae6-8934-7ecc842564c6-kube-api-access-6rv4d" (OuterVolumeSpecName: "kube-api-access-6rv4d") pod "8513bca3-396b-4ae6-8934-7ecc842564c6" (UID: "8513bca3-396b-4ae6-8934-7ecc842564c6"). InnerVolumeSpecName "kube-api-access-6rv4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:14:04 crc kubenswrapper[4939]: I0318 16:14:04.872529 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rv4d\" (UniqueName: \"kubernetes.io/projected/8513bca3-396b-4ae6-8934-7ecc842564c6-kube-api-access-6rv4d\") on node \"crc\" DevicePath \"\"" Mar 18 16:14:05 crc kubenswrapper[4939]: I0318 16:14:05.465992 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-9sc8m" event={"ID":"8513bca3-396b-4ae6-8934-7ecc842564c6","Type":"ContainerDied","Data":"36c79077904fef312bb58d9cd621358eee643c8f4a633ebdbf0853d9408add46"} Mar 18 16:14:05 crc kubenswrapper[4939]: I0318 16:14:05.466266 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36c79077904fef312bb58d9cd621358eee643c8f4a633ebdbf0853d9408add46" Mar 18 16:14:05 crc kubenswrapper[4939]: I0318 16:14:05.466038 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-9sc8m" Mar 18 16:14:05 crc kubenswrapper[4939]: I0318 16:14:05.776531 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-b6rd4"] Mar 18 16:14:05 crc kubenswrapper[4939]: I0318 16:14:05.783154 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-b6rd4"] Mar 18 16:14:06 crc kubenswrapper[4939]: I0318 16:14:06.143223 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136" path="/var/lib/kubelet/pods/fc5bfd8b-ffb2-43e7-9bbd-9b08e6f36136/volumes" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.666555 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-glf8j"] Mar 18 16:14:26 crc kubenswrapper[4939]: E0318 16:14:26.667740 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8513bca3-396b-4ae6-8934-7ecc842564c6" containerName="oc" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.667779 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8513bca3-396b-4ae6-8934-7ecc842564c6" containerName="oc" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.668051 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8513bca3-396b-4ae6-8934-7ecc842564c6" containerName="oc" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.669830 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.681467 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-glf8j"] Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.778815 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-catalog-content\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.778899 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfhxm\" (UniqueName: \"kubernetes.io/projected/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-kube-api-access-vfhxm\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.778988 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-utilities\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.880238 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-utilities\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.880295 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-catalog-content\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.880371 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfhxm\" (UniqueName: \"kubernetes.io/projected/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-kube-api-access-vfhxm\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.881169 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-utilities\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.881418 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-catalog-content\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.917980 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vfhxm\" (UniqueName: \"kubernetes.io/projected/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-kube-api-access-vfhxm\") pod \"community-operators-glf8j\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:26 crc kubenswrapper[4939]: I0318 16:14:26.991893 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:27 crc kubenswrapper[4939]: I0318 16:14:27.536428 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-glf8j"] Mar 18 16:14:27 crc kubenswrapper[4939]: I0318 16:14:27.623229 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glf8j" event={"ID":"b98bf8b4-8c88-4a3f-970e-ef297f6d123b","Type":"ContainerStarted","Data":"0890b588a540b9bd682f65dc5f34146ae3976c27196d92fe8031b16bd0803dcc"} Mar 18 16:14:28 crc kubenswrapper[4939]: I0318 16:14:28.631567 4939 generic.go:334] "Generic (PLEG): container finished" podID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerID="6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3" exitCode=0 Mar 18 16:14:28 crc kubenswrapper[4939]: I0318 16:14:28.631639 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glf8j" event={"ID":"b98bf8b4-8c88-4a3f-970e-ef297f6d123b","Type":"ContainerDied","Data":"6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3"} Mar 18 16:14:29 crc kubenswrapper[4939]: I0318 16:14:29.640543 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glf8j" event={"ID":"b98bf8b4-8c88-4a3f-970e-ef297f6d123b","Type":"ContainerStarted","Data":"f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf"} Mar 18 16:14:30 crc kubenswrapper[4939]: I0318 16:14:30.648497 4939 generic.go:334] "Generic (PLEG): container finished" podID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerID="f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf" exitCode=0 Mar 18 16:14:30 crc kubenswrapper[4939]: I0318 16:14:30.648592 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glf8j" event={"ID":"b98bf8b4-8c88-4a3f-970e-ef297f6d123b","Type":"ContainerDied","Data":"f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf"} Mar 18 16:14:31 crc kubenswrapper[4939]: I0318 16:14:31.657348 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glf8j" event={"ID":"b98bf8b4-8c88-4a3f-970e-ef297f6d123b","Type":"ContainerStarted","Data":"152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31"} Mar 18 16:14:31 crc kubenswrapper[4939]: I0318 16:14:31.681278 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-glf8j" podStartSLOduration=2.970150346 podStartE2EDuration="5.681252252s" podCreationTimestamp="2026-03-18 16:14:26 +0000 UTC" firstStartedPulling="2026-03-18 16:14:28.633089547 +0000 UTC m=+2233.232277168" lastFinishedPulling="2026-03-18 16:14:31.344191453 +0000 UTC m=+2235.943379074" observedRunningTime="2026-03-18 16:14:31.678853274 +0000 UTC m=+2236.278040915" watchObservedRunningTime="2026-03-18 16:14:31.681252252 +0000 UTC m=+2236.280439873" Mar 18 16:14:36 crc kubenswrapper[4939]: I0318 16:14:36.993055 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:36 crc kubenswrapper[4939]: I0318 16:14:36.993632 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:37 crc kubenswrapper[4939]: I0318 16:14:37.035195 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:37 crc kubenswrapper[4939]: I0318 16:14:37.733057 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:37 crc kubenswrapper[4939]: I0318 16:14:37.784655 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-glf8j"] Mar 18 16:14:39 crc kubenswrapper[4939]: I0318 16:14:39.706138 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-glf8j" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="registry-server" containerID="cri-o://152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31" gracePeriod=2 Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.136858 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.266667 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfhxm\" (UniqueName: \"kubernetes.io/projected/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-kube-api-access-vfhxm\") pod \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.266743 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-catalog-content\") pod \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.266863 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-utilities\") pod \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\" (UID: \"b98bf8b4-8c88-4a3f-970e-ef297f6d123b\") " Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.267915 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-utilities" (OuterVolumeSpecName: "utilities") pod "b98bf8b4-8c88-4a3f-970e-ef297f6d123b" (UID: "b98bf8b4-8c88-4a3f-970e-ef297f6d123b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.286893 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-kube-api-access-vfhxm" (OuterVolumeSpecName: "kube-api-access-vfhxm") pod "b98bf8b4-8c88-4a3f-970e-ef297f6d123b" (UID: "b98bf8b4-8c88-4a3f-970e-ef297f6d123b"). InnerVolumeSpecName "kube-api-access-vfhxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.370452 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.370494 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfhxm\" (UniqueName: \"kubernetes.io/projected/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-kube-api-access-vfhxm\") on node \"crc\" DevicePath \"\"" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.718453 4939 generic.go:334] "Generic (PLEG): container finished" podID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerID="152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31" exitCode=0 Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.718523 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glf8j" event={"ID":"b98bf8b4-8c88-4a3f-970e-ef297f6d123b","Type":"ContainerDied","Data":"152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31"} Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.718798 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glf8j" event={"ID":"b98bf8b4-8c88-4a3f-970e-ef297f6d123b","Type":"ContainerDied","Data":"0890b588a540b9bd682f65dc5f34146ae3976c27196d92fe8031b16bd0803dcc"} Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.718830 4939 scope.go:117] "RemoveContainer" containerID="152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.718531 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-glf8j" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.737231 4939 scope.go:117] "RemoveContainer" containerID="f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.762460 4939 scope.go:117] "RemoveContainer" containerID="6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.804083 4939 scope.go:117] "RemoveContainer" containerID="152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31" Mar 18 16:14:40 crc kubenswrapper[4939]: E0318 16:14:40.804526 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31\": container with ID starting with 152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31 not found: ID does not exist" containerID="152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.804587 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31"} err="failed to get container status \"152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31\": rpc error: code = NotFound desc = could not find container \"152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31\": container with ID starting with 152dcd80e90b2b41cbdaf71a0f774a511778941ac9d20cc7fd932c3c69c78b31 not found: ID does not exist" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.804620 4939 scope.go:117] "RemoveContainer" containerID="f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf" Mar 18 16:14:40 crc kubenswrapper[4939]: E0318 16:14:40.805314 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf\": container with ID starting with f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf not found: ID does not exist" containerID="f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.805348 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf"} err="failed to get container status \"f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf\": rpc error: code = NotFound desc = could not find container \"f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf\": container with ID starting with f97e6681213d1e0f700cb3876cb62eeaded97456c768fb277b43dee074432fdf not found: ID does not exist" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.805368 4939 scope.go:117] "RemoveContainer" containerID="6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3" Mar 18 16:14:40 crc kubenswrapper[4939]: E0318 16:14:40.805725 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3\": container with ID starting with 6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3 not found: ID does not exist" containerID="6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3" 
Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.805754 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3"} err="failed to get container status \"6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3\": rpc error: code = NotFound desc = could not find container \"6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3\": container with ID starting with 6908ca855e99b7894c83f426a60837673b678fe786a744a0b9bb55725e4e38a3 not found: ID does not exist" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.837322 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b98bf8b4-8c88-4a3f-970e-ef297f6d123b" (UID: "b98bf8b4-8c88-4a3f-970e-ef297f6d123b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:14:40 crc kubenswrapper[4939]: I0318 16:14:40.877573 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b98bf8b4-8c88-4a3f-970e-ef297f6d123b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:14:41 crc kubenswrapper[4939]: I0318 16:14:41.066496 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-glf8j"] Mar 18 16:14:41 crc kubenswrapper[4939]: I0318 16:14:41.072268 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-glf8j"] Mar 18 16:14:42 crc kubenswrapper[4939]: I0318 16:14:42.141533 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" path="/var/lib/kubelet/pods/b98bf8b4-8c88-4a3f-970e-ef297f6d123b/volumes" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.679115 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7z49q"] Mar 18 16:14:47 crc kubenswrapper[4939]: E0318 16:14:47.679702 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="extract-utilities" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.679717 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="extract-utilities" Mar 18 16:14:47 crc kubenswrapper[4939]: E0318 16:14:47.679738 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="extract-content" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.679746 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="extract-content" Mar 18 16:14:47 crc kubenswrapper[4939]: E0318 16:14:47.679759 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="registry-server" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.679765 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="registry-server" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.679912 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b98bf8b4-8c88-4a3f-970e-ef297f6d123b" containerName="registry-server" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.680939 4939 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.692074 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7z49q"] Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.780039 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-utilities\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.780108 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwt78\" (UniqueName: \"kubernetes.io/projected/4e997217-290c-424f-99d8-d5643d83055c-kube-api-access-kwt78\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.780205 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-catalog-content\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.881337 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-utilities\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.881402 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwt78\" (UniqueName: \"kubernetes.io/projected/4e997217-290c-424f-99d8-d5643d83055c-kube-api-access-kwt78\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.881452 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-catalog-content\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.882179 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-catalog-content\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.882195 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-utilities\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:47 crc kubenswrapper[4939]: I0318 16:14:47.901352 
4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwt78\" (UniqueName: \"kubernetes.io/projected/4e997217-290c-424f-99d8-d5643d83055c-kube-api-access-kwt78\") pod \"certified-operators-7z49q\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:48 crc kubenswrapper[4939]: I0318 16:14:48.005348 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:48 crc kubenswrapper[4939]: I0318 16:14:48.483540 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7z49q"] Mar 18 16:14:48 crc kubenswrapper[4939]: I0318 16:14:48.776606 4939 generic.go:334] "Generic (PLEG): container finished" podID="4e997217-290c-424f-99d8-d5643d83055c" containerID="d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e" exitCode=0 Mar 18 16:14:48 crc kubenswrapper[4939]: I0318 16:14:48.776659 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7z49q" event={"ID":"4e997217-290c-424f-99d8-d5643d83055c","Type":"ContainerDied","Data":"d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e"} Mar 18 16:14:48 crc kubenswrapper[4939]: I0318 16:14:48.776911 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7z49q" event={"ID":"4e997217-290c-424f-99d8-d5643d83055c","Type":"ContainerStarted","Data":"e4c7fd65e014967aa569a4273eaae7623647afec175b71480d358643b511eccb"} Mar 18 16:14:50 crc kubenswrapper[4939]: I0318 16:14:50.790817 4939 generic.go:334] "Generic (PLEG): container finished" podID="4e997217-290c-424f-99d8-d5643d83055c" containerID="92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a" exitCode=0 Mar 18 16:14:50 crc kubenswrapper[4939]: I0318 16:14:50.790912 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7z49q" event={"ID":"4e997217-290c-424f-99d8-d5643d83055c","Type":"ContainerDied","Data":"92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a"} Mar 18 16:14:51 crc kubenswrapper[4939]: I0318 16:14:51.889757 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7z49q" event={"ID":"4e997217-290c-424f-99d8-d5643d83055c","Type":"ContainerStarted","Data":"651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631"} Mar 18 16:14:51 crc kubenswrapper[4939]: I0318 16:14:51.929648 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7z49q" podStartSLOduration=2.5096045350000002 podStartE2EDuration="4.929600721s" podCreationTimestamp="2026-03-18 16:14:47 +0000 UTC" firstStartedPulling="2026-03-18 16:14:48.778032331 +0000 UTC m=+2253.377219942" lastFinishedPulling="2026-03-18 16:14:51.198028507 +0000 UTC m=+2255.797216128" observedRunningTime="2026-03-18 16:14:51.923518198 +0000 UTC m=+2256.522705819" watchObservedRunningTime="2026-03-18 16:14:51.929600721 +0000 UTC m=+2256.528788332" Mar 18 16:14:57 crc kubenswrapper[4939]: I0318 16:14:57.647852 4939 scope.go:117] "RemoveContainer" containerID="37ddddb18f6e38bc19d18b8fa4143aec1c152c4a4a029d551a965aef21fb6822" Mar 18 16:14:58 crc kubenswrapper[4939]: I0318 16:14:58.006257 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:58 crc kubenswrapper[4939]: 
I0318 16:14:58.006438 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:58 crc kubenswrapper[4939]: I0318 16:14:58.047964 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:58 crc kubenswrapper[4939]: I0318 16:14:58.986498 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:14:59 crc kubenswrapper[4939]: I0318 16:14:59.044201 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7z49q"] Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.145590 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4"] Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.146448 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.148294 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.149202 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.165545 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4"] Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.250802 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67eacf98-54a3-405f-8c43-e805fdce4a12-config-volume\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.250871 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbk9p\" (UniqueName: \"kubernetes.io/projected/67eacf98-54a3-405f-8c43-e805fdce4a12-kube-api-access-xbk9p\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.251104 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67eacf98-54a3-405f-8c43-e805fdce4a12-secret-volume\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.352256 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67eacf98-54a3-405f-8c43-e805fdce4a12-config-volume\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.352314 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xbk9p\" (UniqueName: \"kubernetes.io/projected/67eacf98-54a3-405f-8c43-e805fdce4a12-kube-api-access-xbk9p\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.352383 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67eacf98-54a3-405f-8c43-e805fdce4a12-secret-volume\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.354701 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67eacf98-54a3-405f-8c43-e805fdce4a12-config-volume\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.358075 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67eacf98-54a3-405f-8c43-e805fdce4a12-secret-volume\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.370159 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbk9p\" (UniqueName: \"kubernetes.io/projected/67eacf98-54a3-405f-8c43-e805fdce4a12-kube-api-access-xbk9p\") pod \"collect-profiles-29564175-bkzb4\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.466331 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.863893 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4"] Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.958979 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" event={"ID":"67eacf98-54a3-405f-8c43-e805fdce4a12","Type":"ContainerStarted","Data":"aa52c38964893516fb2400b51a513f6c80933041ea2521f4de8729f1a7815c33"} Mar 18 16:15:00 crc kubenswrapper[4939]: I0318 16:15:00.959311 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7z49q" podUID="4e997217-290c-424f-99d8-d5643d83055c" containerName="registry-server" containerID="cri-o://651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631" gracePeriod=2 Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.425794 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.494138 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-utilities\") pod \"4e997217-290c-424f-99d8-d5643d83055c\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.494592 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-catalog-content\") pod \"4e997217-290c-424f-99d8-d5643d83055c\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.494733 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwt78\" (UniqueName: \"kubernetes.io/projected/4e997217-290c-424f-99d8-d5643d83055c-kube-api-access-kwt78\") pod \"4e997217-290c-424f-99d8-d5643d83055c\" (UID: \"4e997217-290c-424f-99d8-d5643d83055c\") " Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.495329 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-utilities" (OuterVolumeSpecName: "utilities") pod "4e997217-290c-424f-99d8-d5643d83055c" (UID: "4e997217-290c-424f-99d8-d5643d83055c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.506250 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e997217-290c-424f-99d8-d5643d83055c-kube-api-access-kwt78" (OuterVolumeSpecName: "kube-api-access-kwt78") pod "4e997217-290c-424f-99d8-d5643d83055c" (UID: "4e997217-290c-424f-99d8-d5643d83055c"). InnerVolumeSpecName "kube-api-access-kwt78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.596622 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.596659 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwt78\" (UniqueName: \"kubernetes.io/projected/4e997217-290c-424f-99d8-d5643d83055c-kube-api-access-kwt78\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.853151 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e997217-290c-424f-99d8-d5643d83055c" (UID: "4e997217-290c-424f-99d8-d5643d83055c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.899945 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e997217-290c-424f-99d8-d5643d83055c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.968275 4939 generic.go:334] "Generic (PLEG): container finished" podID="4e997217-290c-424f-99d8-d5643d83055c" containerID="651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631" exitCode=0 Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.968361 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7z49q" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.968360 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7z49q" event={"ID":"4e997217-290c-424f-99d8-d5643d83055c","Type":"ContainerDied","Data":"651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631"} Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.968436 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7z49q" event={"ID":"4e997217-290c-424f-99d8-d5643d83055c","Type":"ContainerDied","Data":"e4c7fd65e014967aa569a4273eaae7623647afec175b71480d358643b511eccb"} Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.968475 4939 scope.go:117] "RemoveContainer" containerID="651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631" Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.971759 4939 generic.go:334] "Generic (PLEG): container finished" podID="67eacf98-54a3-405f-8c43-e805fdce4a12" containerID="24763a9dfc06050bd51c0531da4c66d785c619f72e793717a9240f4c49866986" exitCode=0 Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.971808 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" event={"ID":"67eacf98-54a3-405f-8c43-e805fdce4a12","Type":"ContainerDied","Data":"24763a9dfc06050bd51c0531da4c66d785c619f72e793717a9240f4c49866986"} Mar 18 16:15:01 crc kubenswrapper[4939]: I0318 16:15:01.989712 4939 scope.go:117] "RemoveContainer" containerID="92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a" Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.018572 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7z49q"] Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.021300 4939 scope.go:117] "RemoveContainer" containerID="d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e" Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.024988 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7z49q"] Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.043775 4939 scope.go:117] "RemoveContainer" containerID="651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631" Mar 18 16:15:02 crc kubenswrapper[4939]: E0318 16:15:02.044248 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631\": container with ID starting with 651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631 not found: ID does not exist" containerID="651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631" Mar 18 
16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.044281 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631"} err="failed to get container status \"651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631\": rpc error: code = NotFound desc = could not find container \"651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631\": container with ID starting with 651a3031e81098155c45ff65902e42b4aec406adb289ec508f34847680887631 not found: ID does not exist" Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.044304 4939 scope.go:117] "RemoveContainer" containerID="92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a" Mar 18 16:15:02 crc kubenswrapper[4939]: E0318 16:15:02.044795 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a\": container with ID starting with 92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a not found: ID does not exist" containerID="92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a" Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.044836 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a"} err="failed to get container status \"92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a\": rpc error: code = NotFound desc = could not find container \"92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a\": container with ID starting with 92854ecb659521ceb55535e04782660a878c6b0c71300f328fbc0d1e6ac0a17a not found: ID does not exist" Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.044862 4939 scope.go:117] "RemoveContainer" containerID="d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e" Mar 18 16:15:02 crc kubenswrapper[4939]: E0318 16:15:02.045333 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e\": container with ID starting with d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e not found: ID does not exist" containerID="d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e" Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.045363 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e"} err="failed to get container status \"d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e\": rpc error: code = NotFound desc = could not find container \"d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e\": container with ID starting with d04ceeb5d7caad618eb1eff2990239f02e056a75373484cc446c045265375f6e not found: ID does not exist" Mar 18 16:15:02 crc kubenswrapper[4939]: I0318 16:15:02.141586 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e997217-290c-424f-99d8-d5643d83055c" path="/var/lib/kubelet/pods/4e997217-290c-424f-99d8-d5643d83055c/volumes" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.259023 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.417123 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67eacf98-54a3-405f-8c43-e805fdce4a12-config-volume\") pod \"67eacf98-54a3-405f-8c43-e805fdce4a12\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.417201 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbk9p\" (UniqueName: \"kubernetes.io/projected/67eacf98-54a3-405f-8c43-e805fdce4a12-kube-api-access-xbk9p\") pod \"67eacf98-54a3-405f-8c43-e805fdce4a12\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.417272 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67eacf98-54a3-405f-8c43-e805fdce4a12-secret-volume\") pod \"67eacf98-54a3-405f-8c43-e805fdce4a12\" (UID: \"67eacf98-54a3-405f-8c43-e805fdce4a12\") " Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.417966 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eacf98-54a3-405f-8c43-e805fdce4a12-config-volume" (OuterVolumeSpecName: "config-volume") pod "67eacf98-54a3-405f-8c43-e805fdce4a12" (UID: "67eacf98-54a3-405f-8c43-e805fdce4a12"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.427384 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67eacf98-54a3-405f-8c43-e805fdce4a12-kube-api-access-xbk9p" (OuterVolumeSpecName: "kube-api-access-xbk9p") pod "67eacf98-54a3-405f-8c43-e805fdce4a12" (UID: "67eacf98-54a3-405f-8c43-e805fdce4a12"). InnerVolumeSpecName "kube-api-access-xbk9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.428039 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67eacf98-54a3-405f-8c43-e805fdce4a12-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67eacf98-54a3-405f-8c43-e805fdce4a12" (UID: "67eacf98-54a3-405f-8c43-e805fdce4a12"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.518723 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67eacf98-54a3-405f-8c43-e805fdce4a12-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.518759 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67eacf98-54a3-405f-8c43-e805fdce4a12-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.518772 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbk9p\" (UniqueName: \"kubernetes.io/projected/67eacf98-54a3-405f-8c43-e805fdce4a12-kube-api-access-xbk9p\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.987302 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" event={"ID":"67eacf98-54a3-405f-8c43-e805fdce4a12","Type":"ContainerDied","Data":"aa52c38964893516fb2400b51a513f6c80933041ea2521f4de8729f1a7815c33"} Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.987348 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa52c38964893516fb2400b51a513f6c80933041ea2521f4de8729f1a7815c33" Mar 18 16:15:03 crc kubenswrapper[4939]: I0318 16:15:03.987368 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4" Mar 18 16:15:04 crc kubenswrapper[4939]: I0318 16:15:04.334123 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d"] Mar 18 16:15:04 crc kubenswrapper[4939]: I0318 16:15:04.339495 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-nt62d"] Mar 18 16:15:06 crc kubenswrapper[4939]: I0318 16:15:06.146661 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f6faf0-bf2d-4679-bd53-5a1f529ad2de" path="/var/lib/kubelet/pods/b5f6faf0-bf2d-4679-bd53-5a1f529ad2de/volumes" Mar 18 16:15:53 crc kubenswrapper[4939]: I0318 16:15:53.687324 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:15:53 crc kubenswrapper[4939]: I0318 16:15:53.688026 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:15:57 crc kubenswrapper[4939]: I0318 16:15:57.721233 4939 scope.go:117] "RemoveContainer" containerID="60447d0394151a5ef4832f2882a562382589b6ce445fefd29fb1b412eba7df48" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.151856 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564176-ww67d"] Mar 18 16:16:00 crc kubenswrapper[4939]: E0318 16:16:00.152405 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e997217-290c-424f-99d8-d5643d83055c" 
containerName="extract-utilities" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.152420 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e997217-290c-424f-99d8-d5643d83055c" containerName="extract-utilities" Mar 18 16:16:00 crc kubenswrapper[4939]: E0318 16:16:00.152442 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e997217-290c-424f-99d8-d5643d83055c" containerName="registry-server" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.152450 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e997217-290c-424f-99d8-d5643d83055c" containerName="registry-server" Mar 18 16:16:00 crc kubenswrapper[4939]: E0318 16:16:00.152465 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67eacf98-54a3-405f-8c43-e805fdce4a12" containerName="collect-profiles" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.152473 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="67eacf98-54a3-405f-8c43-e805fdce4a12" containerName="collect-profiles" Mar 18 16:16:00 crc kubenswrapper[4939]: E0318 16:16:00.152488 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e997217-290c-424f-99d8-d5643d83055c" containerName="extract-content" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.152495 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e997217-290c-424f-99d8-d5643d83055c" containerName="extract-content" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.152699 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="67eacf98-54a3-405f-8c43-e805fdce4a12" containerName="collect-profiles" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.152719 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e997217-290c-424f-99d8-d5643d83055c" containerName="registry-server" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.153239 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-ww67d" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.155975 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.156577 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.156885 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.166224 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-ww67d"] Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.225691 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbr4c\" (UniqueName: \"kubernetes.io/projected/4226eb5d-14ba-49d3-abd4-fa787717fa97-kube-api-access-wbr4c\") pod \"auto-csr-approver-29564176-ww67d\" (UID: \"4226eb5d-14ba-49d3-abd4-fa787717fa97\") " pod="openshift-infra/auto-csr-approver-29564176-ww67d" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.328180 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbr4c\" (UniqueName: \"kubernetes.io/projected/4226eb5d-14ba-49d3-abd4-fa787717fa97-kube-api-access-wbr4c\") pod \"auto-csr-approver-29564176-ww67d\" (UID: \"4226eb5d-14ba-49d3-abd4-fa787717fa97\") " pod="openshift-infra/auto-csr-approver-29564176-ww67d" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.348203 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbr4c\" (UniqueName: \"kubernetes.io/projected/4226eb5d-14ba-49d3-abd4-fa787717fa97-kube-api-access-wbr4c\") pod \"auto-csr-approver-29564176-ww67d\" (UID: \"4226eb5d-14ba-49d3-abd4-fa787717fa97\") " pod="openshift-infra/auto-csr-approver-29564176-ww67d" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.511917 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-ww67d" Mar 18 16:16:00 crc kubenswrapper[4939]: I0318 16:16:00.938903 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-ww67d"] Mar 18 16:16:01 crc kubenswrapper[4939]: I0318 16:16:01.395213 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-ww67d" event={"ID":"4226eb5d-14ba-49d3-abd4-fa787717fa97","Type":"ContainerStarted","Data":"3c518dc217a4600b3188e0b34a0d8ece044f6fe05d30fc86e83defbb13a78549"} Mar 18 16:16:02 crc kubenswrapper[4939]: I0318 16:16:02.407726 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-ww67d" event={"ID":"4226eb5d-14ba-49d3-abd4-fa787717fa97","Type":"ContainerStarted","Data":"c39aecf374608dc9e27df109e7390662f562c867014e45c96585b3e610417d0d"} Mar 18 16:16:02 crc kubenswrapper[4939]: I0318 16:16:02.430317 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564176-ww67d" podStartSLOduration=1.361217218 podStartE2EDuration="2.430295433s" podCreationTimestamp="2026-03-18 16:16:00 +0000 UTC" firstStartedPulling="2026-03-18 16:16:00.943210764 +0000 UTC m=+2325.542398385" lastFinishedPulling="2026-03-18 16:16:02.012288979 +0000 UTC m=+2326.611476600" observedRunningTime="2026-03-18 16:16:02.424959821 +0000 UTC m=+2327.024147452" watchObservedRunningTime="2026-03-18 16:16:02.430295433 +0000 UTC m=+2327.029483054" Mar 18 16:16:03 crc kubenswrapper[4939]: I0318 16:16:03.418759 4939 generic.go:334] "Generic (PLEG): container finished" podID="4226eb5d-14ba-49d3-abd4-fa787717fa97" containerID="c39aecf374608dc9e27df109e7390662f562c867014e45c96585b3e610417d0d" exitCode=0 Mar 18 16:16:03 crc kubenswrapper[4939]: I0318 16:16:03.418839 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-ww67d" event={"ID":"4226eb5d-14ba-49d3-abd4-fa787717fa97","Type":"ContainerDied","Data":"c39aecf374608dc9e27df109e7390662f562c867014e45c96585b3e610417d0d"} Mar 18 16:16:04 crc kubenswrapper[4939]: I0318 16:16:04.700601 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-ww67d" Mar 18 16:16:04 crc kubenswrapper[4939]: I0318 16:16:04.789475 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbr4c\" (UniqueName: \"kubernetes.io/projected/4226eb5d-14ba-49d3-abd4-fa787717fa97-kube-api-access-wbr4c\") pod \"4226eb5d-14ba-49d3-abd4-fa787717fa97\" (UID: \"4226eb5d-14ba-49d3-abd4-fa787717fa97\") " Mar 18 16:16:04 crc kubenswrapper[4939]: I0318 16:16:04.796160 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4226eb5d-14ba-49d3-abd4-fa787717fa97-kube-api-access-wbr4c" (OuterVolumeSpecName: "kube-api-access-wbr4c") pod "4226eb5d-14ba-49d3-abd4-fa787717fa97" (UID: "4226eb5d-14ba-49d3-abd4-fa787717fa97"). InnerVolumeSpecName "kube-api-access-wbr4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:16:04 crc kubenswrapper[4939]: I0318 16:16:04.891380 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbr4c\" (UniqueName: \"kubernetes.io/projected/4226eb5d-14ba-49d3-abd4-fa787717fa97-kube-api-access-wbr4c\") on node \"crc\" DevicePath \"\"" Mar 18 16:16:05 crc kubenswrapper[4939]: I0318 16:16:05.435106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-ww67d" event={"ID":"4226eb5d-14ba-49d3-abd4-fa787717fa97","Type":"ContainerDied","Data":"3c518dc217a4600b3188e0b34a0d8ece044f6fe05d30fc86e83defbb13a78549"} Mar 18 16:16:05 crc kubenswrapper[4939]: I0318 16:16:05.435147 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-ww67d" Mar 18 16:16:05 crc kubenswrapper[4939]: I0318 16:16:05.435153 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c518dc217a4600b3188e0b34a0d8ece044f6fe05d30fc86e83defbb13a78549" Mar 18 16:16:05 crc kubenswrapper[4939]: I0318 16:16:05.483120 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-6kzgx"] Mar 18 16:16:05 crc kubenswrapper[4939]: I0318 16:16:05.488415 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-6kzgx"] Mar 18 16:16:06 crc kubenswrapper[4939]: I0318 16:16:06.142066 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8047f050-4538-46d5-8edf-1e21291f00f6" path="/var/lib/kubelet/pods/8047f050-4538-46d5-8edf-1e21291f00f6/volumes" Mar 18 16:16:23 crc kubenswrapper[4939]: I0318 16:16:23.686799 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:16:23 crc kubenswrapper[4939]: I0318 16:16:23.687265 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:16:53 crc kubenswrapper[4939]: I0318 16:16:53.687535 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:16:53 crc kubenswrapper[4939]: I0318 16:16:53.688079 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:16:53 crc kubenswrapper[4939]: I0318 16:16:53.688125 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:16:53 crc kubenswrapper[4939]: I0318 16:16:53.688616 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:16:53 crc kubenswrapper[4939]: I0318 16:16:53.688673 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" gracePeriod=600 Mar 18 16:16:53 crc kubenswrapper[4939]: E0318 16:16:53.831180 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:16:54 crc kubenswrapper[4939]: I0318 16:16:54.774267 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" exitCode=0 Mar 18 16:16:54 crc kubenswrapper[4939]: I0318 16:16:54.774346 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc"} Mar 18 16:16:54 crc kubenswrapper[4939]: I0318 16:16:54.774404 4939 scope.go:117] "RemoveContainer" containerID="61d99ce942c1cf400a71b7c7d09eda17e0a71591a320fa898cb74dcff5232a22" Mar 18 16:16:54 crc kubenswrapper[4939]: I0318 16:16:54.775886 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:16:54 crc kubenswrapper[4939]: E0318 16:16:54.776295 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:16:57 crc kubenswrapper[4939]: I0318 16:16:57.773769 4939 scope.go:117] "RemoveContainer" containerID="e526954ada3cbb3868f69a563ae33b84ea3f32455dcbfcf18a76ce58ec48c5c4" Mar 18 16:17:09 crc kubenswrapper[4939]: I0318 16:17:09.133030 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:17:09 crc kubenswrapper[4939]: E0318 16:17:09.135186 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:17:22 crc kubenswrapper[4939]: I0318 16:17:22.134041 4939 scope.go:117] "RemoveContainer" 
containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:17:22 crc kubenswrapper[4939]: E0318 16:17:22.134949 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:17:36 crc kubenswrapper[4939]: I0318 16:17:36.137796 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:17:36 crc kubenswrapper[4939]: E0318 16:17:36.138445 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:17:48 crc kubenswrapper[4939]: I0318 16:17:48.133747 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:17:48 crc kubenswrapper[4939]: E0318 16:17:48.134388 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:17:59 crc kubenswrapper[4939]: I0318 16:17:59.132863 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:17:59 crc kubenswrapper[4939]: E0318 16:17:59.133568 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.141939 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564178-hdwxb"] Mar 18 16:18:00 crc kubenswrapper[4939]: E0318 16:18:00.142184 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4226eb5d-14ba-49d3-abd4-fa787717fa97" containerName="oc" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.142196 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4226eb5d-14ba-49d3-abd4-fa787717fa97" containerName="oc" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.142329 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4226eb5d-14ba-49d3-abd4-fa787717fa97" containerName="oc" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.142863 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-hdwxb" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.145419 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.145289 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.149734 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.153899 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-hdwxb"] Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.292667 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfwp\" (UniqueName: \"kubernetes.io/projected/e204b240-c727-4ef2-89ca-2b866aedfd0d-kube-api-access-mpfwp\") pod \"auto-csr-approver-29564178-hdwxb\" (UID: \"e204b240-c727-4ef2-89ca-2b866aedfd0d\") " pod="openshift-infra/auto-csr-approver-29564178-hdwxb" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.394150 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfwp\" (UniqueName: \"kubernetes.io/projected/e204b240-c727-4ef2-89ca-2b866aedfd0d-kube-api-access-mpfwp\") pod \"auto-csr-approver-29564178-hdwxb\" (UID: \"e204b240-c727-4ef2-89ca-2b866aedfd0d\") " pod="openshift-infra/auto-csr-approver-29564178-hdwxb" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.413756 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfwp\" (UniqueName: \"kubernetes.io/projected/e204b240-c727-4ef2-89ca-2b866aedfd0d-kube-api-access-mpfwp\") pod \"auto-csr-approver-29564178-hdwxb\" (UID: \"e204b240-c727-4ef2-89ca-2b866aedfd0d\") " pod="openshift-infra/auto-csr-approver-29564178-hdwxb" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.461964 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-hdwxb" Mar 18 16:18:00 crc kubenswrapper[4939]: I0318 16:18:00.893998 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-hdwxb"] Mar 18 16:18:01 crc kubenswrapper[4939]: I0318 16:18:01.239832 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-hdwxb" event={"ID":"e204b240-c727-4ef2-89ca-2b866aedfd0d","Type":"ContainerStarted","Data":"93c4ed1776adaff49313b73aaaf065ed8188add81fda8571f1dfb84004cf0792"} Mar 18 16:18:03 crc kubenswrapper[4939]: I0318 16:18:03.253746 4939 generic.go:334] "Generic (PLEG): container finished" podID="e204b240-c727-4ef2-89ca-2b866aedfd0d" containerID="069d796b29f7e20a3a97e2701e5f63be13ea78ef11e598d21c2d90f7be91d4fd" exitCode=0 Mar 18 16:18:03 crc kubenswrapper[4939]: I0318 16:18:03.253811 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-hdwxb" event={"ID":"e204b240-c727-4ef2-89ca-2b866aedfd0d","Type":"ContainerDied","Data":"069d796b29f7e20a3a97e2701e5f63be13ea78ef11e598d21c2d90f7be91d4fd"} Mar 18 16:18:04 crc kubenswrapper[4939]: I0318 16:18:04.521153 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-hdwxb" Mar 18 16:18:04 crc kubenswrapper[4939]: I0318 16:18:04.653111 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpfwp\" (UniqueName: \"kubernetes.io/projected/e204b240-c727-4ef2-89ca-2b866aedfd0d-kube-api-access-mpfwp\") pod \"e204b240-c727-4ef2-89ca-2b866aedfd0d\" (UID: \"e204b240-c727-4ef2-89ca-2b866aedfd0d\") " Mar 18 16:18:04 crc kubenswrapper[4939]: I0318 16:18:04.659488 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e204b240-c727-4ef2-89ca-2b866aedfd0d-kube-api-access-mpfwp" (OuterVolumeSpecName: "kube-api-access-mpfwp") pod "e204b240-c727-4ef2-89ca-2b866aedfd0d" (UID: "e204b240-c727-4ef2-89ca-2b866aedfd0d"). InnerVolumeSpecName "kube-api-access-mpfwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:18:04 crc kubenswrapper[4939]: I0318 16:18:04.754877 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpfwp\" (UniqueName: \"kubernetes.io/projected/e204b240-c727-4ef2-89ca-2b866aedfd0d-kube-api-access-mpfwp\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:05 crc kubenswrapper[4939]: I0318 16:18:05.272187 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-hdwxb" event={"ID":"e204b240-c727-4ef2-89ca-2b866aedfd0d","Type":"ContainerDied","Data":"93c4ed1776adaff49313b73aaaf065ed8188add81fda8571f1dfb84004cf0792"} Mar 18 16:18:05 crc kubenswrapper[4939]: I0318 16:18:05.272256 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93c4ed1776adaff49313b73aaaf065ed8188add81fda8571f1dfb84004cf0792" Mar 18 16:18:05 crc kubenswrapper[4939]: I0318 16:18:05.272264 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-hdwxb" Mar 18 16:18:05 crc kubenswrapper[4939]: I0318 16:18:05.589271 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-7tqck"] Mar 18 16:18:05 crc kubenswrapper[4939]: I0318 16:18:05.595873 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-7tqck"] Mar 18 16:18:06 crc kubenswrapper[4939]: I0318 16:18:06.141706 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37bd540-d47b-4bb3-8a9d-ec7e8e234354" path="/var/lib/kubelet/pods/b37bd540-d47b-4bb3-8a9d-ec7e8e234354/volumes" Mar 18 16:18:13 crc kubenswrapper[4939]: I0318 16:18:13.133316 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:18:13 crc kubenswrapper[4939]: E0318 16:18:13.133936 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:18:24 crc kubenswrapper[4939]: I0318 16:18:24.133110 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:18:24 crc kubenswrapper[4939]: E0318 16:18:24.134178 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:18:39 crc kubenswrapper[4939]: I0318 16:18:39.132648 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:18:39 crc kubenswrapper[4939]: E0318 16:18:39.133435 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:18:53 crc kubenswrapper[4939]: I0318 16:18:53.133942 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:18:53 crc kubenswrapper[4939]: E0318 16:18:53.134920 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:18:57 crc kubenswrapper[4939]: I0318 16:18:57.864637 4939 scope.go:117] "RemoveContainer" containerID="53c9df5db830cbac1dc2767ba4d01f9886c8d0977d8b633ead00a8b418623d49" Mar 18 
16:19:08 crc kubenswrapper[4939]: I0318 16:19:08.133775 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:19:08 crc kubenswrapper[4939]: E0318 16:19:08.134575 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:19:20 crc kubenswrapper[4939]: I0318 16:19:20.133862 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:19:20 crc kubenswrapper[4939]: E0318 16:19:20.134750 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:19:31 crc kubenswrapper[4939]: I0318 16:19:31.133810 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:19:31 crc kubenswrapper[4939]: E0318 16:19:31.134592 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:19:43 crc kubenswrapper[4939]: I0318 16:19:43.133750 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:19:43 crc kubenswrapper[4939]: E0318 16:19:43.134231 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:19:54 crc kubenswrapper[4939]: I0318 16:19:54.133608 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:19:54 crc kubenswrapper[4939]: E0318 16:19:54.134307 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.806817 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7ktrn"] Mar 18 16:19:58 crc 
kubenswrapper[4939]: E0318 16:19:58.807393 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e204b240-c727-4ef2-89ca-2b866aedfd0d" containerName="oc" Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.807410 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e204b240-c727-4ef2-89ca-2b866aedfd0d" containerName="oc" Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.807607 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e204b240-c727-4ef2-89ca-2b866aedfd0d" containerName="oc" Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.808782 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.822008 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7ktrn"] Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.949580 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-catalog-content\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.954696 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-utilities\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:58 crc kubenswrapper[4939]: I0318 16:19:58.954826 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbgc\" (UniqueName: \"kubernetes.io/projected/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-kube-api-access-xbbgc\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.056709 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-catalog-content\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.056775 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-utilities\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.056809 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbgc\" (UniqueName: \"kubernetes.io/projected/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-kube-api-access-xbbgc\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.057407 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-catalog-content\") 
pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.057458 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-utilities\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.080207 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbgc\" (UniqueName: \"kubernetes.io/projected/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-kube-api-access-xbbgc\") pod \"redhat-operators-7ktrn\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") " pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.127983 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:19:59 crc kubenswrapper[4939]: I0318 16:19:59.589552 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7ktrn"] Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.106175 4939 generic.go:334] "Generic (PLEG): container finished" podID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerID="eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237" exitCode=0 Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.106238 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ktrn" event={"ID":"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b","Type":"ContainerDied","Data":"eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237"} Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.106322 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ktrn" event={"ID":"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b","Type":"ContainerStarted","Data":"9e8b20101fa462f65408b3aedf8835a3a356ed350c5b73f67319c022b9831cc4"} Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.108689 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.160808 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564180-qzsm2"] Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.161732 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-qzsm2" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.164167 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.164894 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.165262 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.169846 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggpn\" (UniqueName: \"kubernetes.io/projected/a3e3650f-50cc-4e32-9f5a-9d6570bc7695-kube-api-access-lggpn\") pod \"auto-csr-approver-29564180-qzsm2\" (UID: \"a3e3650f-50cc-4e32-9f5a-9d6570bc7695\") " pod="openshift-infra/auto-csr-approver-29564180-qzsm2" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.171040 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-qzsm2"] Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.271018 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggpn\" (UniqueName: \"kubernetes.io/projected/a3e3650f-50cc-4e32-9f5a-9d6570bc7695-kube-api-access-lggpn\") pod \"auto-csr-approver-29564180-qzsm2\" (UID: \"a3e3650f-50cc-4e32-9f5a-9d6570bc7695\") " pod="openshift-infra/auto-csr-approver-29564180-qzsm2" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.289568 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggpn\" (UniqueName: \"kubernetes.io/projected/a3e3650f-50cc-4e32-9f5a-9d6570bc7695-kube-api-access-lggpn\") pod \"auto-csr-approver-29564180-qzsm2\" (UID: \"a3e3650f-50cc-4e32-9f5a-9d6570bc7695\") " pod="openshift-infra/auto-csr-approver-29564180-qzsm2" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.477498 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-qzsm2" Mar 18 16:20:00 crc kubenswrapper[4939]: I0318 16:20:00.902477 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-qzsm2"] Mar 18 16:20:01 crc kubenswrapper[4939]: I0318 16:20:01.114813 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-qzsm2" event={"ID":"a3e3650f-50cc-4e32-9f5a-9d6570bc7695","Type":"ContainerStarted","Data":"8e1afc434958beb9b765dc732269c22fecebcd8631cf2fef06ca31c4b7b0389e"} Mar 18 16:20:03 crc kubenswrapper[4939]: I0318 16:20:03.134665 4939 generic.go:334] "Generic (PLEG): container finished" podID="a3e3650f-50cc-4e32-9f5a-9d6570bc7695" containerID="465612719b9ac98f14e8b65973debbff0be1d3a56a49dd7c01d67398b18b3953" exitCode=0 Mar 18 16:20:03 crc kubenswrapper[4939]: I0318 16:20:03.134764 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-qzsm2" event={"ID":"a3e3650f-50cc-4e32-9f5a-9d6570bc7695","Type":"ContainerDied","Data":"465612719b9ac98f14e8b65973debbff0be1d3a56a49dd7c01d67398b18b3953"} Mar 18 16:20:04 crc kubenswrapper[4939]: I0318 16:20:04.502113 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-qzsm2" Mar 18 16:20:04 crc kubenswrapper[4939]: I0318 16:20:04.647341 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lggpn\" (UniqueName: \"kubernetes.io/projected/a3e3650f-50cc-4e32-9f5a-9d6570bc7695-kube-api-access-lggpn\") pod \"a3e3650f-50cc-4e32-9f5a-9d6570bc7695\" (UID: \"a3e3650f-50cc-4e32-9f5a-9d6570bc7695\") " Mar 18 16:20:04 crc kubenswrapper[4939]: I0318 16:20:04.666749 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e3650f-50cc-4e32-9f5a-9d6570bc7695-kube-api-access-lggpn" (OuterVolumeSpecName: "kube-api-access-lggpn") pod "a3e3650f-50cc-4e32-9f5a-9d6570bc7695" (UID: "a3e3650f-50cc-4e32-9f5a-9d6570bc7695"). InnerVolumeSpecName "kube-api-access-lggpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:20:04 crc kubenswrapper[4939]: I0318 16:20:04.749099 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lggpn\" (UniqueName: \"kubernetes.io/projected/a3e3650f-50cc-4e32-9f5a-9d6570bc7695-kube-api-access-lggpn\") on node \"crc\" DevicePath \"\"" Mar 18 16:20:05 crc kubenswrapper[4939]: I0318 16:20:05.148055 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-qzsm2" event={"ID":"a3e3650f-50cc-4e32-9f5a-9d6570bc7695","Type":"ContainerDied","Data":"8e1afc434958beb9b765dc732269c22fecebcd8631cf2fef06ca31c4b7b0389e"} Mar 18 16:20:05 crc kubenswrapper[4939]: I0318 16:20:05.148379 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e1afc434958beb9b765dc732269c22fecebcd8631cf2fef06ca31c4b7b0389e" Mar 18 16:20:05 crc kubenswrapper[4939]: I0318 16:20:05.148113 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-qzsm2" Mar 18 16:20:05 crc kubenswrapper[4939]: I0318 16:20:05.588098 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-9sc8m"] Mar 18 16:20:05 crc kubenswrapper[4939]: I0318 16:20:05.594439 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-9sc8m"] Mar 18 16:20:06 crc kubenswrapper[4939]: I0318 16:20:06.146139 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8513bca3-396b-4ae6-8934-7ecc842564c6" path="/var/lib/kubelet/pods/8513bca3-396b-4ae6-8934-7ecc842564c6/volumes" Mar 18 16:20:07 crc kubenswrapper[4939]: I0318 16:20:07.133466 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:20:07 crc kubenswrapper[4939]: E0318 16:20:07.133837 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:20:09 crc kubenswrapper[4939]: I0318 16:20:09.179585 4939 generic.go:334] "Generic (PLEG): container finished" podID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerID="6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e" exitCode=0 Mar 18 16:20:09 crc kubenswrapper[4939]: I0318 16:20:09.179806 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ktrn" event={"ID":"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b","Type":"ContainerDied","Data":"6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e"} Mar 18 16:20:10 crc kubenswrapper[4939]: I0318 16:20:10.189542 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ktrn" event={"ID":"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b","Type":"ContainerStarted","Data":"09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a"} Mar 18 16:20:10 crc kubenswrapper[4939]: I0318 16:20:10.212015 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7ktrn" podStartSLOduration=2.744443726 podStartE2EDuration="12.211991857s" podCreationTimestamp="2026-03-18 16:19:58 +0000 UTC" firstStartedPulling="2026-03-18 16:20:00.108442441 +0000 UTC m=+2564.707630062" lastFinishedPulling="2026-03-18 16:20:09.575990572 +0000 UTC m=+2574.175178193" observedRunningTime="2026-03-18 16:20:10.207308593 +0000 UTC m=+2574.806496214" watchObservedRunningTime="2026-03-18 16:20:10.211991857 +0000 UTC m=+2574.811179478" Mar 18 16:20:18 crc kubenswrapper[4939]: I0318 16:20:18.132802 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:20:18 crc kubenswrapper[4939]: E0318 16:20:18.133471 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 
18 16:20:19 crc kubenswrapper[4939]: I0318 16:20:19.128960 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:20:19 crc kubenswrapper[4939]: I0318 16:20:19.129121 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:20:19 crc kubenswrapper[4939]: I0318 16:20:19.174574 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:20:19 crc kubenswrapper[4939]: I0318 16:20:19.286068 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7ktrn" Mar 18 16:20:19 crc kubenswrapper[4939]: I0318 16:20:19.354111 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7ktrn"] Mar 18 16:20:19 crc kubenswrapper[4939]: I0318 16:20:19.424477 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knvb7"] Mar 18 16:20:19 crc kubenswrapper[4939]: I0318 16:20:19.424756 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-knvb7" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="registry-server" containerID="cri-o://52b35ec40b059307b749088c82e80ab02b8eb198b2f2974eefb2f0a45dc989ed" gracePeriod=2 Mar 18 16:20:21 crc kubenswrapper[4939]: I0318 16:20:21.261584 4939 generic.go:334] "Generic (PLEG): container finished" podID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerID="52b35ec40b059307b749088c82e80ab02b8eb198b2f2974eefb2f0a45dc989ed" exitCode=0 Mar 18 16:20:21 crc kubenswrapper[4939]: I0318 16:20:21.261669 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knvb7" event={"ID":"54dd23f1-b1a8-4632-8f2f-55570fc67c11","Type":"ContainerDied","Data":"52b35ec40b059307b749088c82e80ab02b8eb198b2f2974eefb2f0a45dc989ed"} Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.150172 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.283614 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knvb7" event={"ID":"54dd23f1-b1a8-4632-8f2f-55570fc67c11","Type":"ContainerDied","Data":"803efb2f69248456f9170dcd0843f8ec48463d9590ba0a7590304f10fe2ded99"} Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.283664 4939 util.go:48] "No ready sandbox for pod can be found. 
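The "Observed pod startup duration" entry above decomposes as follows: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling), since pull time is excluded from the startup SLI. The figures logged for redhat-operators-7ktrn check out exactly:

```go
package main

import "fmt"

func main() {
	// Monotonic offsets (m=+...) copied from the latency-tracker entry above.
	firstStartedPulling := 2564.707630062
	lastFinishedPulling := 2574.175178193
	e2e := 12.211991857 // podStartE2EDuration, seconds

	pull := lastFinishedPulling - firstStartedPulling
	slo := e2e - pull
	fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, slo)
	// pull=9.467548131s slo=2.744443726s, matching podStartSLOduration=2.744443726
}
```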
Need to start a new one" pod="openshift-marketplace/redhat-operators-knvb7" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.283679 4939 scope.go:117] "RemoveContainer" containerID="52b35ec40b059307b749088c82e80ab02b8eb198b2f2974eefb2f0a45dc989ed" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.307369 4939 scope.go:117] "RemoveContainer" containerID="42d05ec913c930e9d878d313bf8537f212dab0bcec329a1d9e43d7b0aa7ddb88" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.333689 4939 scope.go:117] "RemoveContainer" containerID="ac8d968e4ce2fdb80208533ea3d0738a0bb51e9d6c244e3798b7f7afa27174bb" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.337954 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-catalog-content\") pod \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.338128 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-utilities\") pod \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.338165 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk29c\" (UniqueName: \"kubernetes.io/projected/54dd23f1-b1a8-4632-8f2f-55570fc67c11-kube-api-access-fk29c\") pod \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\" (UID: \"54dd23f1-b1a8-4632-8f2f-55570fc67c11\") " Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.345736 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-utilities" (OuterVolumeSpecName: "utilities") pod "54dd23f1-b1a8-4632-8f2f-55570fc67c11" (UID: "54dd23f1-b1a8-4632-8f2f-55570fc67c11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.351756 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54dd23f1-b1a8-4632-8f2f-55570fc67c11-kube-api-access-fk29c" (OuterVolumeSpecName: "kube-api-access-fk29c") pod "54dd23f1-b1a8-4632-8f2f-55570fc67c11" (UID: "54dd23f1-b1a8-4632-8f2f-55570fc67c11"). InnerVolumeSpecName "kube-api-access-fk29c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.440154 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.440189 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk29c\" (UniqueName: \"kubernetes.io/projected/54dd23f1-b1a8-4632-8f2f-55570fc67c11-kube-api-access-fk29c\") on node \"crc\" DevicePath \"\"" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.508894 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54dd23f1-b1a8-4632-8f2f-55570fc67c11" (UID: "54dd23f1-b1a8-4632-8f2f-55570fc67c11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.540937 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54dd23f1-b1a8-4632-8f2f-55570fc67c11-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.613705 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knvb7"] Mar 18 16:20:24 crc kubenswrapper[4939]: I0318 16:20:24.619325 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-knvb7"] Mar 18 16:20:26 crc kubenswrapper[4939]: I0318 16:20:26.144953 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" path="/var/lib/kubelet/pods/54dd23f1-b1a8-4632-8f2f-55570fc67c11/volumes" Mar 18 16:20:32 crc kubenswrapper[4939]: I0318 16:20:32.134541 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:20:32 crc kubenswrapper[4939]: E0318 16:20:32.135469 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:20:46 crc kubenswrapper[4939]: I0318 16:20:46.137153 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:20:46 crc kubenswrapper[4939]: E0318 16:20:46.138991 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:20:57 crc kubenswrapper[4939]: I0318 16:20:57.936179 4939 scope.go:117] "RemoveContainer" containerID="769ccf99fd5c35c461f9f242c0301bbf9b7099a1c2ba58815ee7bd233ad8077f" Mar 18 16:20:59 crc kubenswrapper[4939]: I0318 16:20:59.134280 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:20:59 crc kubenswrapper[4939]: E0318 16:20:59.134975 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:21:12 crc kubenswrapper[4939]: I0318 16:21:12.134151 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:21:12 crc kubenswrapper[4939]: E0318 16:21:12.134965 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:21:25 crc kubenswrapper[4939]: I0318 16:21:25.133829 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:21:25 crc kubenswrapper[4939]: E0318 16:21:25.134614 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:21:36 crc kubenswrapper[4939]: I0318 16:21:36.138580 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:21:36 crc kubenswrapper[4939]: E0318 16:21:36.139636 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:21:50 crc kubenswrapper[4939]: I0318 16:21:50.133249 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:21:50 crc kubenswrapper[4939]: E0318 16:21:50.134963 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.148974 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564182-f4tx8"] Mar 18 16:22:00 crc kubenswrapper[4939]: E0318 16:22:00.149703 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="registry-server" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.149718 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="registry-server" Mar 18 16:22:00 crc kubenswrapper[4939]: E0318 16:22:00.149738 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="extract-content" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.149746 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="extract-content" Mar 18 16:22:00 crc kubenswrapper[4939]: E0318 16:22:00.149775 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3e3650f-50cc-4e32-9f5a-9d6570bc7695" containerName="oc" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.149782 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a3e3650f-50cc-4e32-9f5a-9d6570bc7695" containerName="oc" Mar 18 16:22:00 crc kubenswrapper[4939]: E0318 16:22:00.149794 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="extract-utilities" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.149800 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="extract-utilities" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.149922 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3e3650f-50cc-4e32-9f5a-9d6570bc7695" containerName="oc" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.149934 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="54dd23f1-b1a8-4632-8f2f-55570fc67c11" containerName="registry-server" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.150370 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-f4tx8" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.153085 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.153087 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.157843 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.161971 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-f4tx8"] Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.304571 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnhgl\" (UniqueName: \"kubernetes.io/projected/7893dc1c-5ff5-4e5d-adba-8e21b3ecef82-kube-api-access-rnhgl\") pod \"auto-csr-approver-29564182-f4tx8\" (UID: \"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82\") " pod="openshift-infra/auto-csr-approver-29564182-f4tx8" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.406031 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnhgl\" (UniqueName: \"kubernetes.io/projected/7893dc1c-5ff5-4e5d-adba-8e21b3ecef82-kube-api-access-rnhgl\") pod \"auto-csr-approver-29564182-f4tx8\" (UID: \"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82\") " pod="openshift-infra/auto-csr-approver-29564182-f4tx8" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.428848 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnhgl\" (UniqueName: \"kubernetes.io/projected/7893dc1c-5ff5-4e5d-adba-8e21b3ecef82-kube-api-access-rnhgl\") pod \"auto-csr-approver-29564182-f4tx8\" (UID: \"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82\") " pod="openshift-infra/auto-csr-approver-29564182-f4tx8" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.477320 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-f4tx8" Mar 18 16:22:00 crc kubenswrapper[4939]: I0318 16:22:00.934426 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-f4tx8"] Mar 18 16:22:01 crc kubenswrapper[4939]: I0318 16:22:01.003415 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-f4tx8" event={"ID":"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82","Type":"ContainerStarted","Data":"e854d5b5e44ae23eeceefd8152935e5c295c1bf7881928c19a4e993c4b87aad2"} Mar 18 16:22:03 crc kubenswrapper[4939]: I0318 16:22:03.020852 4939 generic.go:334] "Generic (PLEG): container finished" podID="7893dc1c-5ff5-4e5d-adba-8e21b3ecef82" containerID="f75284a5d805a107d0422c1c74946dabe135a2bf2200325e6430f3cf976768fc" exitCode=0 Mar 18 16:22:03 crc kubenswrapper[4939]: I0318 16:22:03.020970 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-f4tx8" event={"ID":"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82","Type":"ContainerDied","Data":"f75284a5d805a107d0422c1c74946dabe135a2bf2200325e6430f3cf976768fc"} Mar 18 16:22:03 crc kubenswrapper[4939]: I0318 16:22:03.133065 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc" Mar 18 16:22:04 crc kubenswrapper[4939]: I0318 16:22:04.028933 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2"} Mar 18 16:22:04 crc kubenswrapper[4939]: I0318 16:22:04.297240 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-f4tx8" Mar 18 16:22:04 crc kubenswrapper[4939]: I0318 16:22:04.469130 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnhgl\" (UniqueName: \"kubernetes.io/projected/7893dc1c-5ff5-4e5d-adba-8e21b3ecef82-kube-api-access-rnhgl\") pod \"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82\" (UID: \"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82\") " Mar 18 16:22:04 crc kubenswrapper[4939]: I0318 16:22:04.476715 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7893dc1c-5ff5-4e5d-adba-8e21b3ecef82-kube-api-access-rnhgl" (OuterVolumeSpecName: "kube-api-access-rnhgl") pod "7893dc1c-5ff5-4e5d-adba-8e21b3ecef82" (UID: "7893dc1c-5ff5-4e5d-adba-8e21b3ecef82"). InnerVolumeSpecName "kube-api-access-rnhgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:22:04 crc kubenswrapper[4939]: I0318 16:22:04.570757 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnhgl\" (UniqueName: \"kubernetes.io/projected/7893dc1c-5ff5-4e5d-adba-8e21b3ecef82-kube-api-access-rnhgl\") on node \"crc\" DevicePath \"\"" Mar 18 16:22:05 crc kubenswrapper[4939]: I0318 16:22:05.037306 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-f4tx8" event={"ID":"7893dc1c-5ff5-4e5d-adba-8e21b3ecef82","Type":"ContainerDied","Data":"e854d5b5e44ae23eeceefd8152935e5c295c1bf7881928c19a4e993c4b87aad2"} Mar 18 16:22:05 crc kubenswrapper[4939]: I0318 16:22:05.037566 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e854d5b5e44ae23eeceefd8152935e5c295c1bf7881928c19a4e993c4b87aad2" Mar 18 16:22:05 crc kubenswrapper[4939]: I0318 16:22:05.037373 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-f4tx8" Mar 18 16:22:05 crc kubenswrapper[4939]: I0318 16:22:05.360429 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-ww67d"] Mar 18 16:22:05 crc kubenswrapper[4939]: I0318 16:22:05.365087 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-ww67d"] Mar 18 16:22:06 crc kubenswrapper[4939]: I0318 16:22:06.145680 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4226eb5d-14ba-49d3-abd4-fa787717fa97" path="/var/lib/kubelet/pods/4226eb5d-14ba-49d3-abd4-fa787717fa97/volumes" Mar 18 16:22:58 crc kubenswrapper[4939]: I0318 16:22:58.038289 4939 scope.go:117] "RemoveContainer" containerID="c39aecf374608dc9e27df109e7390662f562c867014e45c96585b3e610417d0d" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.140376 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564184-m9zvz"] Mar 18 16:24:00 crc kubenswrapper[4939]: E0318 16:24:00.141040 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7893dc1c-5ff5-4e5d-adba-8e21b3ecef82" containerName="oc" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.141050 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7893dc1c-5ff5-4e5d-adba-8e21b3ecef82" containerName="oc" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.141191 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7893dc1c-5ff5-4e5d-adba-8e21b3ecef82" containerName="oc" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.141593 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-m9zvz" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.143307 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.143328 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.143556 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.149620 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-m9zvz"] Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.310813 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q778c\" (UniqueName: \"kubernetes.io/projected/a2d526b9-f2ae-411d-8997-2ad1c8080bc6-kube-api-access-q778c\") pod \"auto-csr-approver-29564184-m9zvz\" (UID: \"a2d526b9-f2ae-411d-8997-2ad1c8080bc6\") " pod="openshift-infra/auto-csr-approver-29564184-m9zvz" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.412735 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q778c\" (UniqueName: \"kubernetes.io/projected/a2d526b9-f2ae-411d-8997-2ad1c8080bc6-kube-api-access-q778c\") pod \"auto-csr-approver-29564184-m9zvz\" (UID: \"a2d526b9-f2ae-411d-8997-2ad1c8080bc6\") " pod="openshift-infra/auto-csr-approver-29564184-m9zvz" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.432731 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q778c\" (UniqueName: \"kubernetes.io/projected/a2d526b9-f2ae-411d-8997-2ad1c8080bc6-kube-api-access-q778c\") pod \"auto-csr-approver-29564184-m9zvz\" (UID: \"a2d526b9-f2ae-411d-8997-2ad1c8080bc6\") " pod="openshift-infra/auto-csr-approver-29564184-m9zvz" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.456530 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-m9zvz" Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.945355 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-m9zvz"] Mar 18 16:24:00 crc kubenswrapper[4939]: I0318 16:24:00.962480 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-m9zvz" event={"ID":"a2d526b9-f2ae-411d-8997-2ad1c8080bc6","Type":"ContainerStarted","Data":"520031ada0eda492bd2324b2490b682234f2b7922f7162b33530ac87d9eac1c7"} Mar 18 16:24:02 crc kubenswrapper[4939]: I0318 16:24:02.979815 4939 generic.go:334] "Generic (PLEG): container finished" podID="a2d526b9-f2ae-411d-8997-2ad1c8080bc6" containerID="48e5a58761095e5a2f8871c8fd2fa0bdf379c1e833d624ec60aff75d444ddb1d" exitCode=0 Mar 18 16:24:02 crc kubenswrapper[4939]: I0318 16:24:02.979889 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-m9zvz" event={"ID":"a2d526b9-f2ae-411d-8997-2ad1c8080bc6","Type":"ContainerDied","Data":"48e5a58761095e5a2f8871c8fd2fa0bdf379c1e833d624ec60aff75d444ddb1d"} Mar 18 16:24:04 crc kubenswrapper[4939]: I0318 16:24:04.232937 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-m9zvz" Mar 18 16:24:04 crc kubenswrapper[4939]: I0318 16:24:04.365993 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q778c\" (UniqueName: \"kubernetes.io/projected/a2d526b9-f2ae-411d-8997-2ad1c8080bc6-kube-api-access-q778c\") pod \"a2d526b9-f2ae-411d-8997-2ad1c8080bc6\" (UID: \"a2d526b9-f2ae-411d-8997-2ad1c8080bc6\") " Mar 18 16:24:04 crc kubenswrapper[4939]: I0318 16:24:04.373816 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d526b9-f2ae-411d-8997-2ad1c8080bc6-kube-api-access-q778c" (OuterVolumeSpecName: "kube-api-access-q778c") pod "a2d526b9-f2ae-411d-8997-2ad1c8080bc6" (UID: "a2d526b9-f2ae-411d-8997-2ad1c8080bc6"). InnerVolumeSpecName "kube-api-access-q778c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:24:04 crc kubenswrapper[4939]: I0318 16:24:04.467678 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q778c\" (UniqueName: \"kubernetes.io/projected/a2d526b9-f2ae-411d-8997-2ad1c8080bc6-kube-api-access-q778c\") on node \"crc\" DevicePath \"\"" Mar 18 16:24:04 crc kubenswrapper[4939]: I0318 16:24:04.995352 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-m9zvz" event={"ID":"a2d526b9-f2ae-411d-8997-2ad1c8080bc6","Type":"ContainerDied","Data":"520031ada0eda492bd2324b2490b682234f2b7922f7162b33530ac87d9eac1c7"} Mar 18 16:24:04 crc kubenswrapper[4939]: I0318 16:24:04.995395 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="520031ada0eda492bd2324b2490b682234f2b7922f7162b33530ac87d9eac1c7" Mar 18 16:24:04 crc kubenswrapper[4939]: I0318 16:24:04.995397 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-m9zvz" Mar 18 16:24:05 crc kubenswrapper[4939]: I0318 16:24:05.302668 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-hdwxb"] Mar 18 16:24:05 crc kubenswrapper[4939]: I0318 16:24:05.310265 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-hdwxb"] Mar 18 16:24:06 crc kubenswrapper[4939]: I0318 16:24:06.149840 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e204b240-c727-4ef2-89ca-2b866aedfd0d" path="/var/lib/kubelet/pods/e204b240-c727-4ef2-89ca-2b866aedfd0d/volumes" Mar 18 16:24:08 crc kubenswrapper[4939]: I0318 16:24:08.874878 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gh87z"] Mar 18 16:24:08 crc kubenswrapper[4939]: E0318 16:24:08.875395 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d526b9-f2ae-411d-8997-2ad1c8080bc6" containerName="oc" Mar 18 16:24:08 crc kubenswrapper[4939]: I0318 16:24:08.875407 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d526b9-f2ae-411d-8997-2ad1c8080bc6" containerName="oc" Mar 18 16:24:08 crc kubenswrapper[4939]: I0318 16:24:08.875549 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d526b9-f2ae-411d-8997-2ad1c8080bc6" containerName="oc" Mar 18 16:24:08 crc kubenswrapper[4939]: I0318 16:24:08.876535 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:08 crc kubenswrapper[4939]: I0318 16:24:08.898756 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh87z"] Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.033536 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-utilities\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.033600 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-catalog-content\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.033629 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2k4\" (UniqueName: \"kubernetes.io/projected/e24b722b-d691-49e3-8318-8bf539fab940-kube-api-access-zb2k4\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.134340 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-utilities\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.134400 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-catalog-content\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.134436 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2k4\" (UniqueName: \"kubernetes.io/projected/e24b722b-d691-49e3-8318-8bf539fab940-kube-api-access-zb2k4\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.135095 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-catalog-content\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.135198 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-utilities\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.159218 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zb2k4\" (UniqueName: \"kubernetes.io/projected/e24b722b-d691-49e3-8318-8bf539fab940-kube-api-access-zb2k4\") pod \"redhat-marketplace-gh87z\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.191066 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:09 crc kubenswrapper[4939]: I0318 16:24:09.604544 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh87z"] Mar 18 16:24:10 crc kubenswrapper[4939]: I0318 16:24:10.034244 4939 generic.go:334] "Generic (PLEG): container finished" podID="e24b722b-d691-49e3-8318-8bf539fab940" containerID="6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142" exitCode=0 Mar 18 16:24:10 crc kubenswrapper[4939]: I0318 16:24:10.034363 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh87z" event={"ID":"e24b722b-d691-49e3-8318-8bf539fab940","Type":"ContainerDied","Data":"6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142"} Mar 18 16:24:10 crc kubenswrapper[4939]: I0318 16:24:10.034624 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh87z" event={"ID":"e24b722b-d691-49e3-8318-8bf539fab940","Type":"ContainerStarted","Data":"8bbd50127e191581608860247c891f190b13c36af42e73b7d3515bd77485916b"} Mar 18 16:24:15 crc kubenswrapper[4939]: I0318 16:24:15.079153 4939 generic.go:334] "Generic (PLEG): container finished" podID="e24b722b-d691-49e3-8318-8bf539fab940" containerID="bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1" exitCode=0 Mar 18 16:24:15 crc kubenswrapper[4939]: I0318 16:24:15.079280 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh87z" event={"ID":"e24b722b-d691-49e3-8318-8bf539fab940","Type":"ContainerDied","Data":"bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1"} Mar 18 16:24:16 crc kubenswrapper[4939]: I0318 16:24:16.092974 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh87z" event={"ID":"e24b722b-d691-49e3-8318-8bf539fab940","Type":"ContainerStarted","Data":"8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d"} Mar 18 16:24:19 crc kubenswrapper[4939]: I0318 16:24:19.191887 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:19 crc kubenswrapper[4939]: I0318 16:24:19.192760 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:19 crc kubenswrapper[4939]: I0318 16:24:19.238168 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:19 crc kubenswrapper[4939]: I0318 16:24:19.258046 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gh87z" podStartSLOduration=5.747194751 podStartE2EDuration="11.258028081s" podCreationTimestamp="2026-03-18 16:24:08 +0000 UTC" firstStartedPulling="2026-03-18 16:24:10.035492698 +0000 UTC m=+2814.634680319" lastFinishedPulling="2026-03-18 16:24:15.546326028 +0000 UTC m=+2820.145513649" observedRunningTime="2026-03-18 16:24:16.122672965 +0000 UTC 
m=+2820.721860626" watchObservedRunningTime="2026-03-18 16:24:19.258028081 +0000 UTC m=+2823.857215712" Mar 18 16:24:20 crc kubenswrapper[4939]: I0318 16:24:20.172887 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:20 crc kubenswrapper[4939]: I0318 16:24:20.213954 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh87z"] Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.140790 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gh87z" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="registry-server" containerID="cri-o://8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d" gracePeriod=2 Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.553218 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.625753 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-catalog-content\") pod \"e24b722b-d691-49e3-8318-8bf539fab940\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.625815 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb2k4\" (UniqueName: \"kubernetes.io/projected/e24b722b-d691-49e3-8318-8bf539fab940-kube-api-access-zb2k4\") pod \"e24b722b-d691-49e3-8318-8bf539fab940\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.625841 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-utilities\") pod \"e24b722b-d691-49e3-8318-8bf539fab940\" (UID: \"e24b722b-d691-49e3-8318-8bf539fab940\") " Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.626906 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-utilities" (OuterVolumeSpecName: "utilities") pod "e24b722b-d691-49e3-8318-8bf539fab940" (UID: "e24b722b-d691-49e3-8318-8bf539fab940"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.647530 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24b722b-d691-49e3-8318-8bf539fab940-kube-api-access-zb2k4" (OuterVolumeSpecName: "kube-api-access-zb2k4") pod "e24b722b-d691-49e3-8318-8bf539fab940" (UID: "e24b722b-d691-49e3-8318-8bf539fab940"). InnerVolumeSpecName "kube-api-access-zb2k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.660182 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e24b722b-d691-49e3-8318-8bf539fab940" (UID: "e24b722b-d691-49e3-8318-8bf539fab940"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.727842 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.728087 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb2k4\" (UniqueName: \"kubernetes.io/projected/e24b722b-d691-49e3-8318-8bf539fab940-kube-api-access-zb2k4\") on node \"crc\" DevicePath \"\"" Mar 18 16:24:22 crc kubenswrapper[4939]: I0318 16:24:22.728098 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24b722b-d691-49e3-8318-8bf539fab940-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.152303 4939 generic.go:334] "Generic (PLEG): container finished" podID="e24b722b-d691-49e3-8318-8bf539fab940" containerID="8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d" exitCode=0 Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.152347 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh87z" event={"ID":"e24b722b-d691-49e3-8318-8bf539fab940","Type":"ContainerDied","Data":"8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d"} Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.152374 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh87z" event={"ID":"e24b722b-d691-49e3-8318-8bf539fab940","Type":"ContainerDied","Data":"8bbd50127e191581608860247c891f190b13c36af42e73b7d3515bd77485916b"} Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.152398 4939 scope.go:117] "RemoveContainer" containerID="8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.153378 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh87z" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.168899 4939 scope.go:117] "RemoveContainer" containerID="bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.188676 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh87z"] Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.194067 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh87z"] Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.195435 4939 scope.go:117] "RemoveContainer" containerID="6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.210070 4939 scope.go:117] "RemoveContainer" containerID="8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d" Mar 18 16:24:23 crc kubenswrapper[4939]: E0318 16:24:23.210526 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d\": container with ID starting with 8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d not found: ID does not exist" containerID="8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.210631 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d"} err="failed to get container status \"8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d\": rpc error: code = NotFound desc = could not find container \"8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d\": container with ID starting with 8507fd9f48f6dd51768621f2e53011624d545c8feafdde55a8e4b3054d51526d not found: ID does not exist" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.210709 4939 scope.go:117] "RemoveContainer" containerID="bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1" Mar 18 16:24:23 crc kubenswrapper[4939]: E0318 16:24:23.211061 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1\": container with ID starting with bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1 not found: ID does not exist" containerID="bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.211082 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1"} err="failed to get container status \"bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1\": rpc error: code = NotFound desc = could not find container \"bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1\": container with ID starting with bd8e50324d8380bf2398a6b1504bb0089351969780f1f175441c10fd3fa59fb1 not found: ID does not exist" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.211097 4939 scope.go:117] "RemoveContainer" containerID="6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142" Mar 18 16:24:23 crc kubenswrapper[4939]: E0318 16:24:23.211337 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142\": container with ID starting with 6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142 not found: ID does not exist" containerID="6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.211380 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142"} err="failed to get container status \"6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142\": rpc error: code = NotFound desc = could not find container \"6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142\": container with ID starting with 6a1b9c64696c0f8baa8e4b309fe33ed137ef2415ef595ad10fd35712ce3aa142 not found: ID does not exist" Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.687418 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:24:23 crc kubenswrapper[4939]: I0318 16:24:23.687491 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:24:24 crc kubenswrapper[4939]: I0318 16:24:24.145223 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24b722b-d691-49e3-8318-8bf539fab940" path="/var/lib/kubelet/pods/e24b722b-d691-49e3-8318-8bf539fab940/volumes" Mar 18 16:24:53 crc kubenswrapper[4939]: I0318 16:24:53.687058 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:24:53 crc kubenswrapper[4939]: I0318 16:24:53.687704 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:24:58 crc kubenswrapper[4939]: I0318 16:24:58.135654 4939 scope.go:117] "RemoveContainer" containerID="069d796b29f7e20a3a97e2701e5f63be13ea78ef11e598d21c2d90f7be91d4fd" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.874532 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gmflr"] Mar 18 16:25:17 crc kubenswrapper[4939]: E0318 16:25:17.875393 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="extract-content" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.875407 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="extract-content" Mar 18 16:25:17 crc kubenswrapper[4939]: E0318 16:25:17.875447 4939 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="registry-server" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.875455 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="registry-server" Mar 18 16:25:17 crc kubenswrapper[4939]: E0318 16:25:17.875473 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="extract-utilities" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.875481 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="extract-utilities" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.875655 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24b722b-d691-49e3-8318-8bf539fab940" containerName="registry-server" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.876835 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.885524 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmflr"] Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.970863 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-utilities\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.970967 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b86t2\" (UniqueName: \"kubernetes.io/projected/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-kube-api-access-b86t2\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:17 crc kubenswrapper[4939]: I0318 16:25:17.971033 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-catalog-content\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.072732 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-catalog-content\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.072816 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-utilities\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.072863 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b86t2\" (UniqueName: \"kubernetes.io/projected/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-kube-api-access-b86t2\") 
pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.073709 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-catalog-content\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.073960 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-utilities\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.092896 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b86t2\" (UniqueName: \"kubernetes.io/projected/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-kube-api-access-b86t2\") pod \"community-operators-gmflr\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") " pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.206704 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmflr" Mar 18 16:25:18 crc kubenswrapper[4939]: I0318 16:25:18.782539 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmflr"] Mar 18 16:25:19 crc kubenswrapper[4939]: I0318 16:25:19.573840 4939 generic.go:334] "Generic (PLEG): container finished" podID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerID="68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d" exitCode=0 Mar 18 16:25:19 crc kubenswrapper[4939]: I0318 16:25:19.573961 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmflr" event={"ID":"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b","Type":"ContainerDied","Data":"68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d"} Mar 18 16:25:19 crc kubenswrapper[4939]: I0318 16:25:19.574237 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmflr" event={"ID":"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b","Type":"ContainerStarted","Data":"d024bf217ebfae5354fb937bb88450265b29ca9425ab1cd057bd3f03643fa743"} Mar 18 16:25:19 crc kubenswrapper[4939]: I0318 16:25:19.576265 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:25:22 crc kubenswrapper[4939]: I0318 16:25:22.596162 4939 generic.go:334] "Generic (PLEG): container finished" podID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerID="c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4" exitCode=0 Mar 18 16:25:22 crc kubenswrapper[4939]: I0318 16:25:22.596230 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmflr" event={"ID":"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b","Type":"ContainerDied","Data":"c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4"} Mar 18 16:25:23 crc kubenswrapper[4939]: I0318 16:25:23.605524 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmflr" 
event={"ID":"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b","Type":"ContainerStarted","Data":"3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab"} Mar 18 16:25:23 crc kubenswrapper[4939]: I0318 16:25:23.624827 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gmflr" podStartSLOduration=3.049799649 podStartE2EDuration="6.624810537s" podCreationTimestamp="2026-03-18 16:25:17 +0000 UTC" firstStartedPulling="2026-03-18 16:25:19.575763422 +0000 UTC m=+2884.174951083" lastFinishedPulling="2026-03-18 16:25:23.15077435 +0000 UTC m=+2887.749961971" observedRunningTime="2026-03-18 16:25:23.621993176 +0000 UTC m=+2888.221180797" watchObservedRunningTime="2026-03-18 16:25:23.624810537 +0000 UTC m=+2888.223998148" Mar 18 16:25:23 crc kubenswrapper[4939]: I0318 16:25:23.687638 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:25:23 crc kubenswrapper[4939]: I0318 16:25:23.687910 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:25:23 crc kubenswrapper[4939]: I0318 16:25:23.688024 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:25:23 crc kubenswrapper[4939]: I0318 16:25:23.688766 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:25:23 crc kubenswrapper[4939]: I0318 16:25:23.688942 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2" gracePeriod=600 Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 16:25:24.615158 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2" exitCode=0 Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 16:25:24.616711 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2"} Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 16:25:24.616807 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"} Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 
Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 16:25:24.615158 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2" exitCode=0
Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 16:25:24.616711 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2"}
Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 16:25:24.616807 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"}
Mar 18 16:25:24 crc kubenswrapper[4939]: I0318 16:25:24.616837 4939 scope.go:117] "RemoveContainer" containerID="8649a58f5544e0080187e13d2b0ab27469d2e798ac4df50b3f0807f5bbff16cc"
Mar 18 16:25:28 crc kubenswrapper[4939]: I0318 16:25:28.207056 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gmflr"
Mar 18 16:25:28 crc kubenswrapper[4939]: I0318 16:25:28.207666 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gmflr"
Mar 18 16:25:28 crc kubenswrapper[4939]: I0318 16:25:28.259830 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gmflr"
Mar 18 16:25:28 crc kubenswrapper[4939]: I0318 16:25:28.703273 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gmflr"
Mar 18 16:25:28 crc kubenswrapper[4939]: I0318 16:25:28.751257 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmflr"]
Mar 18 16:25:30 crc kubenswrapper[4939]: I0318 16:25:30.673651 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gmflr" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="registry-server" containerID="cri-o://3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab" gracePeriod=2
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.058676 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmflr"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.167788 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b86t2\" (UniqueName: \"kubernetes.io/projected/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-kube-api-access-b86t2\") pod \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") "
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.168261 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-utilities\") pod \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") "
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.168349 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-catalog-content\") pod \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\" (UID: \"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b\") "
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.169399 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-utilities" (OuterVolumeSpecName: "utilities") pod "48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" (UID: "48afa0fd-3819-4f70-a6c0-f3bf10c0d22b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.172246 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.173672 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-kube-api-access-b86t2" (OuterVolumeSpecName: "kube-api-access-b86t2") pod "48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" (UID: "48afa0fd-3819-4f70-a6c0-f3bf10c0d22b"). InnerVolumeSpecName "kube-api-access-b86t2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.238344 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" (UID: "48afa0fd-3819-4f70-a6c0-f3bf10c0d22b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.273134 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.273171 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b86t2\" (UniqueName: \"kubernetes.io/projected/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b-kube-api-access-b86t2\") on node \"crc\" DevicePath \"\""
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.684927 4939 generic.go:334] "Generic (PLEG): container finished" podID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerID="3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab" exitCode=0
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.684977 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmflr" event={"ID":"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b","Type":"ContainerDied","Data":"3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab"}
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.685007 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmflr" event={"ID":"48afa0fd-3819-4f70-a6c0-f3bf10c0d22b","Type":"ContainerDied","Data":"d024bf217ebfae5354fb937bb88450265b29ca9425ab1cd057bd3f03643fa743"}
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.685025 4939 scope.go:117] "RemoveContainer" containerID="3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.685150 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmflr"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.731885 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmflr"]
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.738416 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gmflr"]
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.743333 4939 scope.go:117] "RemoveContainer" containerID="c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.801100 4939 scope.go:117] "RemoveContainer" containerID="68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.846202 4939 scope.go:117] "RemoveContainer" containerID="3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab"
Mar 18 16:25:31 crc kubenswrapper[4939]: E0318 16:25:31.846703 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab\": container with ID starting with 3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab not found: ID does not exist" containerID="3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.846734 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab"} err="failed to get container status \"3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab\": rpc error: code = NotFound desc = could not find container \"3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab\": container with ID starting with 3248ffbc622f171b4109c2cf4925c317866a4f867ed3ad84a4fb47c43947b7ab not found: ID does not exist"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.846756 4939 scope.go:117] "RemoveContainer" containerID="c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4"
Mar 18 16:25:31 crc kubenswrapper[4939]: E0318 16:25:31.847061 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4\": container with ID starting with c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4 not found: ID does not exist" containerID="c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.847080 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4"} err="failed to get container status \"c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4\": rpc error: code = NotFound desc = could not find container \"c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4\": container with ID starting with c9319895a55b45d4b11bf5b557ea1ec6cd1187e035973472e6404540b446dfa4 not found: ID does not exist"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.847095 4939 scope.go:117] "RemoveContainer" containerID="68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d"
Mar 18 16:25:31 crc kubenswrapper[4939]: E0318 16:25:31.847329 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d\": container with ID starting with 68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d not found: ID does not exist" containerID="68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d"
Mar 18 16:25:31 crc kubenswrapper[4939]: I0318 16:25:31.847348 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d"} err="failed to get container status \"68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d\": rpc error: code = NotFound desc = could not find container \"68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d\": container with ID starting with 68374f3de66f4074aec26d44943c24344e153340a4f80f108c5c5a4b16448a5d not found: ID does not exist"
Mar 18 16:25:32 crc kubenswrapper[4939]: I0318 16:25:32.146299 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" path="/var/lib/kubelet/pods/48afa0fd-3819-4f70-a6c0-f3bf10c0d22b/volumes"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.151041 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564186-rb2qt"]
Mar 18 16:26:00 crc kubenswrapper[4939]: E0318 16:26:00.151847 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="extract-utilities"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.151864 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="extract-utilities"
Mar 18 16:26:00 crc kubenswrapper[4939]: E0318 16:26:00.151922 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="extract-content"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.151934 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="extract-content"
Mar 18 16:26:00 crc kubenswrapper[4939]: E0318 16:26:00.151963 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="registry-server"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.151971 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="registry-server"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.152211 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="48afa0fd-3819-4f70-a6c0-f3bf10c0d22b" containerName="registry-server"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.153163 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-rb2qt"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.155379 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.155682 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.155837 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.165743 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-rb2qt"]
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.317932 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mkgt\" (UniqueName: \"kubernetes.io/projected/2065de30-02ef-4491-8bf7-6f47245e108d-kube-api-access-9mkgt\") pod \"auto-csr-approver-29564186-rb2qt\" (UID: \"2065de30-02ef-4491-8bf7-6f47245e108d\") " pod="openshift-infra/auto-csr-approver-29564186-rb2qt"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.419732 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mkgt\" (UniqueName: \"kubernetes.io/projected/2065de30-02ef-4491-8bf7-6f47245e108d-kube-api-access-9mkgt\") pod \"auto-csr-approver-29564186-rb2qt\" (UID: \"2065de30-02ef-4491-8bf7-6f47245e108d\") " pod="openshift-infra/auto-csr-approver-29564186-rb2qt"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.454963 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mkgt\" (UniqueName: \"kubernetes.io/projected/2065de30-02ef-4491-8bf7-6f47245e108d-kube-api-access-9mkgt\") pod \"auto-csr-approver-29564186-rb2qt\" (UID: \"2065de30-02ef-4491-8bf7-6f47245e108d\") " pod="openshift-infra/auto-csr-approver-29564186-rb2qt"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.507062 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-rb2qt"
Mar 18 16:26:00 crc kubenswrapper[4939]: I0318 16:26:00.987630 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-rb2qt"]
Mar 18 16:26:01 crc kubenswrapper[4939]: I0318 16:26:01.924659 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-rb2qt" event={"ID":"2065de30-02ef-4491-8bf7-6f47245e108d","Type":"ContainerStarted","Data":"8fbf26b36da7febafdb840e405e4177722596febaa9a18b40370adf859bc69dc"}
Mar 18 16:26:02 crc kubenswrapper[4939]: I0318 16:26:02.931567 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-rb2qt" event={"ID":"2065de30-02ef-4491-8bf7-6f47245e108d","Type":"ContainerStarted","Data":"b231b6c490bc894ed2afb15e35926299e2b250c7c4fa7fdf371ce58d65c6c825"}
Mar 18 16:26:02 crc kubenswrapper[4939]: I0318 16:26:02.951446 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564186-rb2qt" podStartSLOduration=1.706321086 podStartE2EDuration="2.951418881s" podCreationTimestamp="2026-03-18 16:26:00 +0000 UTC" firstStartedPulling="2026-03-18 16:26:00.991483055 +0000 UTC m=+2925.590670706" lastFinishedPulling="2026-03-18 16:26:02.23658084 +0000 UTC m=+2926.835768501" observedRunningTime="2026-03-18 16:26:02.942724533 +0000 UTC m=+2927.541912184" watchObservedRunningTime="2026-03-18 16:26:02.951418881 +0000 UTC m=+2927.550606532"
Mar 18 16:26:03 crc kubenswrapper[4939]: I0318 16:26:03.945259 4939 generic.go:334] "Generic (PLEG): container finished" podID="2065de30-02ef-4491-8bf7-6f47245e108d" containerID="b231b6c490bc894ed2afb15e35926299e2b250c7c4fa7fdf371ce58d65c6c825" exitCode=0
Mar 18 16:26:03 crc kubenswrapper[4939]: I0318 16:26:03.945320 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-rb2qt" event={"ID":"2065de30-02ef-4491-8bf7-6f47245e108d","Type":"ContainerDied","Data":"b231b6c490bc894ed2afb15e35926299e2b250c7c4fa7fdf371ce58d65c6c825"}
Mar 18 16:26:05 crc kubenswrapper[4939]: I0318 16:26:05.208825 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-rb2qt"
Mar 18 16:26:05 crc kubenswrapper[4939]: I0318 16:26:05.297819 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mkgt\" (UniqueName: \"kubernetes.io/projected/2065de30-02ef-4491-8bf7-6f47245e108d-kube-api-access-9mkgt\") pod \"2065de30-02ef-4491-8bf7-6f47245e108d\" (UID: \"2065de30-02ef-4491-8bf7-6f47245e108d\") "
Mar 18 16:26:05 crc kubenswrapper[4939]: I0318 16:26:05.307386 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2065de30-02ef-4491-8bf7-6f47245e108d-kube-api-access-9mkgt" (OuterVolumeSpecName: "kube-api-access-9mkgt") pod "2065de30-02ef-4491-8bf7-6f47245e108d" (UID: "2065de30-02ef-4491-8bf7-6f47245e108d"). InnerVolumeSpecName "kube-api-access-9mkgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:26:05 crc kubenswrapper[4939]: I0318 16:26:05.399829 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mkgt\" (UniqueName: \"kubernetes.io/projected/2065de30-02ef-4491-8bf7-6f47245e108d-kube-api-access-9mkgt\") on node \"crc\" DevicePath \"\""
Mar 18 16:26:05 crc kubenswrapper[4939]: I0318 16:26:05.970704 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-rb2qt" event={"ID":"2065de30-02ef-4491-8bf7-6f47245e108d","Type":"ContainerDied","Data":"8fbf26b36da7febafdb840e405e4177722596febaa9a18b40370adf859bc69dc"}
Mar 18 16:26:05 crc kubenswrapper[4939]: I0318 16:26:05.970755 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fbf26b36da7febafdb840e405e4177722596febaa9a18b40370adf859bc69dc"
Mar 18 16:26:05 crc kubenswrapper[4939]: I0318 16:26:05.971556 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-rb2qt"
Mar 18 16:26:06 crc kubenswrapper[4939]: I0318 16:26:06.016996 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-qzsm2"]
Mar 18 16:26:06 crc kubenswrapper[4939]: I0318 16:26:06.022060 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-qzsm2"]
Mar 18 16:26:06 crc kubenswrapper[4939]: I0318 16:26:06.148161 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e3650f-50cc-4e32-9f5a-9d6570bc7695" path="/var/lib/kubelet/pods/a3e3650f-50cc-4e32-9f5a-9d6570bc7695/volumes"
Mar 18 16:26:58 crc kubenswrapper[4939]: I0318 16:26:58.251327 4939 scope.go:117] "RemoveContainer" containerID="465612719b9ac98f14e8b65973debbff0be1d3a56a49dd7c01d67398b18b3953"
Mar 18 16:27:53 crc kubenswrapper[4939]: I0318 16:27:53.687787 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:27:53 crc kubenswrapper[4939]: I0318 16:27:53.688289 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.152334 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564188-97xxb"]
Mar 18 16:28:00 crc kubenswrapper[4939]: E0318 16:28:00.153294 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2065de30-02ef-4491-8bf7-6f47245e108d" containerName="oc"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.153314 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2065de30-02ef-4491-8bf7-6f47245e108d" containerName="oc"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.153581 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2065de30-02ef-4491-8bf7-6f47245e108d" containerName="oc"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.154230 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-97xxb"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.156713 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.158667 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.159186 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.172014 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-97xxb"]
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.268317 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qqh\" (UniqueName: \"kubernetes.io/projected/6bbcaade-e113-4477-8e86-8e38c7a665ed-kube-api-access-79qqh\") pod \"auto-csr-approver-29564188-97xxb\" (UID: \"6bbcaade-e113-4477-8e86-8e38c7a665ed\") " pod="openshift-infra/auto-csr-approver-29564188-97xxb"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.370120 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qqh\" (UniqueName: \"kubernetes.io/projected/6bbcaade-e113-4477-8e86-8e38c7a665ed-kube-api-access-79qqh\") pod \"auto-csr-approver-29564188-97xxb\" (UID: \"6bbcaade-e113-4477-8e86-8e38c7a665ed\") " pod="openshift-infra/auto-csr-approver-29564188-97xxb"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.390323 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qqh\" (UniqueName: \"kubernetes.io/projected/6bbcaade-e113-4477-8e86-8e38c7a665ed-kube-api-access-79qqh\") pod \"auto-csr-approver-29564188-97xxb\" (UID: \"6bbcaade-e113-4477-8e86-8e38c7a665ed\") " pod="openshift-infra/auto-csr-approver-29564188-97xxb"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.523678 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-97xxb"
Mar 18 16:28:00 crc kubenswrapper[4939]: I0318 16:28:00.943806 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-97xxb"]
Mar 18 16:28:01 crc kubenswrapper[4939]: I0318 16:28:01.300616 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-97xxb" event={"ID":"6bbcaade-e113-4477-8e86-8e38c7a665ed","Type":"ContainerStarted","Data":"cc5eb65619c582fef928db38d2d316a239e476f549ced5a20c412d9d20bf076e"}
Mar 18 16:28:02 crc kubenswrapper[4939]: I0318 16:28:02.308745 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-97xxb" event={"ID":"6bbcaade-e113-4477-8e86-8e38c7a665ed","Type":"ContainerStarted","Data":"4baf30a34baf75f04ad855951b86916ea04daa07d6b27d108cb89a7c22ec4a3e"}
Mar 18 16:28:02 crc kubenswrapper[4939]: I0318 16:28:02.325629 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564188-97xxb" podStartSLOduration=1.388752309 podStartE2EDuration="2.325612143s" podCreationTimestamp="2026-03-18 16:28:00 +0000 UTC" firstStartedPulling="2026-03-18 16:28:00.952304709 +0000 UTC m=+3045.551492350" lastFinishedPulling="2026-03-18 16:28:01.889164563 +0000 UTC m=+3046.488352184" observedRunningTime="2026-03-18 16:28:02.320114976 +0000 UTC m=+3046.919302597" watchObservedRunningTime="2026-03-18 16:28:02.325612143 +0000 UTC m=+3046.924799764"
Mar 18 16:28:03 crc kubenswrapper[4939]: I0318 16:28:03.320853 4939 generic.go:334] "Generic (PLEG): container finished" podID="6bbcaade-e113-4477-8e86-8e38c7a665ed" containerID="4baf30a34baf75f04ad855951b86916ea04daa07d6b27d108cb89a7c22ec4a3e" exitCode=0
Mar 18 16:28:03 crc kubenswrapper[4939]: I0318 16:28:03.320919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-97xxb" event={"ID":"6bbcaade-e113-4477-8e86-8e38c7a665ed","Type":"ContainerDied","Data":"4baf30a34baf75f04ad855951b86916ea04daa07d6b27d108cb89a7c22ec4a3e"}
Mar 18 16:28:04 crc kubenswrapper[4939]: I0318 16:28:04.600113 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-97xxb"
Mar 18 16:28:04 crc kubenswrapper[4939]: I0318 16:28:04.726684 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79qqh\" (UniqueName: \"kubernetes.io/projected/6bbcaade-e113-4477-8e86-8e38c7a665ed-kube-api-access-79qqh\") pod \"6bbcaade-e113-4477-8e86-8e38c7a665ed\" (UID: \"6bbcaade-e113-4477-8e86-8e38c7a665ed\") "
Mar 18 16:28:04 crc kubenswrapper[4939]: I0318 16:28:04.732562 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bbcaade-e113-4477-8e86-8e38c7a665ed-kube-api-access-79qqh" (OuterVolumeSpecName: "kube-api-access-79qqh") pod "6bbcaade-e113-4477-8e86-8e38c7a665ed" (UID: "6bbcaade-e113-4477-8e86-8e38c7a665ed"). InnerVolumeSpecName "kube-api-access-79qqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:28:04 crc kubenswrapper[4939]: I0318 16:28:04.828345 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79qqh\" (UniqueName: \"kubernetes.io/projected/6bbcaade-e113-4477-8e86-8e38c7a665ed-kube-api-access-79qqh\") on node \"crc\" DevicePath \"\""
Mar 18 16:28:05 crc kubenswrapper[4939]: I0318 16:28:05.339215 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-97xxb" event={"ID":"6bbcaade-e113-4477-8e86-8e38c7a665ed","Type":"ContainerDied","Data":"cc5eb65619c582fef928db38d2d316a239e476f549ced5a20c412d9d20bf076e"}
Mar 18 16:28:05 crc kubenswrapper[4939]: I0318 16:28:05.339281 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc5eb65619c582fef928db38d2d316a239e476f549ced5a20c412d9d20bf076e"
Mar 18 16:28:05 crc kubenswrapper[4939]: I0318 16:28:05.339357 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-97xxb"
Mar 18 16:28:05 crc kubenswrapper[4939]: I0318 16:28:05.410755 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-f4tx8"]
Mar 18 16:28:05 crc kubenswrapper[4939]: I0318 16:28:05.423411 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-f4tx8"]
Mar 18 16:28:06 crc kubenswrapper[4939]: I0318 16:28:06.149747 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7893dc1c-5ff5-4e5d-adba-8e21b3ecef82" path="/var/lib/kubelet/pods/7893dc1c-5ff5-4e5d-adba-8e21b3ecef82/volumes"
Mar 18 16:28:23 crc kubenswrapper[4939]: I0318 16:28:23.687055 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:28:23 crc kubenswrapper[4939]: I0318 16:28:23.687822 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:28:53 crc kubenswrapper[4939]: I0318 16:28:53.687596 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:28:53 crc kubenswrapper[4939]: I0318 16:28:53.688247 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:28:53 crc kubenswrapper[4939]: I0318 16:28:53.688310 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf"
Mar 18 16:28:53 crc kubenswrapper[4939]: I0318 16:28:53.689079 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 16:28:53 crc kubenswrapper[4939]: I0318 16:28:53.689173 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" gracePeriod=600
Mar 18 16:28:53 crc kubenswrapper[4939]: E0318 16:28:53.817769 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:28:54 crc kubenswrapper[4939]: I0318 16:28:54.743111 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" exitCode=0
Mar 18 16:28:54 crc kubenswrapper[4939]: I0318 16:28:54.743168 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"}
Mar 18 16:28:54 crc kubenswrapper[4939]: I0318 16:28:54.743288 4939 scope.go:117] "RemoveContainer" containerID="d955f5bc22411141d1879695723993d57d4696e22accaa70af9a0cda6da309e2"
Mar 18 16:28:54 crc kubenswrapper[4939]: I0318 16:28:54.744117 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:28:54 crc kubenswrapper[4939]: E0318 16:28:54.744666 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:28:58 crc kubenswrapper[4939]: I0318 16:28:58.327329 4939 scope.go:117] "RemoveContainer" containerID="f75284a5d805a107d0422c1c74946dabe135a2bf2200325e6430f3cf976768fc"
Mar 18 16:29:10 crc kubenswrapper[4939]: I0318 16:29:10.133217 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:29:10 crc kubenswrapper[4939]: E0318 16:29:10.134112 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:29:25 crc kubenswrapper[4939]: I0318 16:29:25.132847 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:29:25 crc kubenswrapper[4939]: E0318 16:29:25.133518 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:29:38 crc kubenswrapper[4939]: I0318 16:29:38.133227 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:29:38 crc kubenswrapper[4939]: E0318 16:29:38.134218 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:29:53 crc kubenswrapper[4939]: I0318 16:29:53.134094 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:29:53 crc kubenswrapper[4939]: E0318 16:29:53.135023 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.152103 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564190-9ssdk"]
Mar 18 16:30:00 crc kubenswrapper[4939]: E0318 16:30:00.153284 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bbcaade-e113-4477-8e86-8e38c7a665ed" containerName="oc"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.153305 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bbcaade-e113-4477-8e86-8e38c7a665ed" containerName="oc"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.153659 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bbcaade-e113-4477-8e86-8e38c7a665ed" containerName="oc"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.154613 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-9ssdk"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.158779 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.158800 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.160655 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.161947 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-9ssdk"]
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.247138 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"]
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.248003 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.253296 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.255956 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"]
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.257708 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.297677 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkbbx\" (UniqueName: \"kubernetes.io/projected/91c7d227-304b-4e86-92ab-ee9764b97d49-kube-api-access-rkbbx\") pod \"auto-csr-approver-29564190-9ssdk\" (UID: \"91c7d227-304b-4e86-92ab-ee9764b97d49\") " pod="openshift-infra/auto-csr-approver-29564190-9ssdk"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.398868 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-config-volume\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.398934 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-secret-volume\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.398978 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzz6\" (UniqueName: \"kubernetes.io/projected/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-kube-api-access-8jzz6\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.399134 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkbbx\" (UniqueName: \"kubernetes.io/projected/91c7d227-304b-4e86-92ab-ee9764b97d49-kube-api-access-rkbbx\") pod \"auto-csr-approver-29564190-9ssdk\" (UID: \"91c7d227-304b-4e86-92ab-ee9764b97d49\") " pod="openshift-infra/auto-csr-approver-29564190-9ssdk"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.419300 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkbbx\" (UniqueName: \"kubernetes.io/projected/91c7d227-304b-4e86-92ab-ee9764b97d49-kube-api-access-rkbbx\") pod \"auto-csr-approver-29564190-9ssdk\" (UID: \"91c7d227-304b-4e86-92ab-ee9764b97d49\") " pod="openshift-infra/auto-csr-approver-29564190-9ssdk"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.477326 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-9ssdk"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.500008 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-config-volume\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.500063 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-secret-volume\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.500092 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzz6\" (UniqueName: \"kubernetes.io/projected/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-kube-api-access-8jzz6\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.502076 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-config-volume\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.509245 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-secret-volume\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.522942 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzz6\" (UniqueName: \"kubernetes.io/projected/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-kube-api-access-8jzz6\") pod \"collect-profiles-29564190-whgxx\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:00
crc kubenswrapper[4939]: I0318 16:30:00.562786 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx" Mar 18 16:30:00 crc kubenswrapper[4939]: I0318 16:30:00.988181 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-9ssdk"] Mar 18 16:30:01 crc kubenswrapper[4939]: I0318 16:30:01.042269 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"] Mar 18 16:30:01 crc kubenswrapper[4939]: W0318 16:30:01.049893 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f30d02_82a4_4eb5_8a25_ead90d3f76a8.slice/crio-2a5553e0cba081e13c08508de271a4c67f8dfe313cd8341b203e262c76bc0e7f WatchSource:0}: Error finding container 2a5553e0cba081e13c08508de271a4c67f8dfe313cd8341b203e262c76bc0e7f: Status 404 returned error can't find the container with id 2a5553e0cba081e13c08508de271a4c67f8dfe313cd8341b203e262c76bc0e7f Mar 18 16:30:01 crc kubenswrapper[4939]: I0318 16:30:01.285174 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-9ssdk" event={"ID":"91c7d227-304b-4e86-92ab-ee9764b97d49","Type":"ContainerStarted","Data":"74f0b2b5e796fbe7b2d3b44bcc47f16cbb93f4d245e4876bd91c7d2a4b5d0f97"} Mar 18 16:30:01 crc kubenswrapper[4939]: I0318 16:30:01.287490 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx" event={"ID":"88f30d02-82a4-4eb5-8a25-ead90d3f76a8","Type":"ContainerStarted","Data":"6a9189fc2d9c8821f21f0cabc434f14ea20488426c3e224067621372529dbabf"} Mar 18 16:30:01 crc kubenswrapper[4939]: I0318 16:30:01.287536 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx" event={"ID":"88f30d02-82a4-4eb5-8a25-ead90d3f76a8","Type":"ContainerStarted","Data":"2a5553e0cba081e13c08508de271a4c67f8dfe313cd8341b203e262c76bc0e7f"} Mar 18 16:30:01 crc kubenswrapper[4939]: I0318 16:30:01.306168 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx" podStartSLOduration=1.306145444 podStartE2EDuration="1.306145444s" podCreationTimestamp="2026-03-18 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:30:01.29935408 +0000 UTC m=+3165.898541711" watchObservedRunningTime="2026-03-18 16:30:01.306145444 +0000 UTC m=+3165.905333085" Mar 18 16:30:02 crc kubenswrapper[4939]: I0318 16:30:02.296141 4939 generic.go:334] "Generic (PLEG): container finished" podID="88f30d02-82a4-4eb5-8a25-ead90d3f76a8" containerID="6a9189fc2d9c8821f21f0cabc434f14ea20488426c3e224067621372529dbabf" exitCode=0 Mar 18 16:30:02 crc kubenswrapper[4939]: I0318 16:30:02.296342 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx" event={"ID":"88f30d02-82a4-4eb5-8a25-ead90d3f76a8","Type":"ContainerDied","Data":"6a9189fc2d9c8821f21f0cabc434f14ea20488426c3e224067621372529dbabf"} Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.303286 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-9ssdk" 
event={"ID":"91c7d227-304b-4e86-92ab-ee9764b97d49","Type":"ContainerStarted","Data":"68cbb6e9c16a361ce518454c1c20375b078d397fd470332d620606d6c1f8ef28"} Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.416519 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4s8bp"] Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.418285 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.434191 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4s8bp"] Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.614926 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-utilities\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.614997 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-catalog-content\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.615235 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrt6\" (UniqueName: \"kubernetes.io/projected/b2c0277e-ec35-44d4-9ee8-073d6a82e377-kube-api-access-hcrt6\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.635637 4939 util.go:48] "No ready sandbox for pod can be found. 
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.635637 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.716761 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrt6\" (UniqueName: \"kubernetes.io/projected/b2c0277e-ec35-44d4-9ee8-073d6a82e377-kube-api-access-hcrt6\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.716825 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-utilities\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.716876 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-catalog-content\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.717432 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-catalog-content\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.717915 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-utilities\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.745294 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrt6\" (UniqueName: \"kubernetes.io/projected/b2c0277e-ec35-44d4-9ee8-073d6a82e377-kube-api-access-hcrt6\") pod \"certified-operators-4s8bp\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") " pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.745604 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.818418 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-secret-volume\") pod \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") "
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.818484 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jzz6\" (UniqueName: \"kubernetes.io/projected/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-kube-api-access-8jzz6\") pod \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") "
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.818592 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-config-volume\") pod \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\" (UID: \"88f30d02-82a4-4eb5-8a25-ead90d3f76a8\") "
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.819251 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "88f30d02-82a4-4eb5-8a25-ead90d3f76a8" (UID: "88f30d02-82a4-4eb5-8a25-ead90d3f76a8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.826142 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "88f30d02-82a4-4eb5-8a25-ead90d3f76a8" (UID: "88f30d02-82a4-4eb5-8a25-ead90d3f76a8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.826737 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-kube-api-access-8jzz6" (OuterVolumeSpecName: "kube-api-access-8jzz6") pod "88f30d02-82a4-4eb5-8a25-ead90d3f76a8" (UID: "88f30d02-82a4-4eb5-8a25-ead90d3f76a8"). InnerVolumeSpecName "kube-api-access-8jzz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.919933 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.920195 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jzz6\" (UniqueName: \"kubernetes.io/projected/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-kube-api-access-8jzz6\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:03 crc kubenswrapper[4939]: I0318 16:30:03.920208 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f30d02-82a4-4eb5-8a25-ead90d3f76a8-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.182068 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4s8bp"]
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.311850 4939 generic.go:334] "Generic (PLEG): container finished" podID="91c7d227-304b-4e86-92ab-ee9764b97d49" containerID="68cbb6e9c16a361ce518454c1c20375b078d397fd470332d620606d6c1f8ef28" exitCode=0
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.311896 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-9ssdk" event={"ID":"91c7d227-304b-4e86-92ab-ee9764b97d49","Type":"ContainerDied","Data":"68cbb6e9c16a361ce518454c1c20375b078d397fd470332d620606d6c1f8ef28"}
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.313426 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8bp" event={"ID":"b2c0277e-ec35-44d4-9ee8-073d6a82e377","Type":"ContainerStarted","Data":"abe59f1b219e63bfa01744cf8944044548a7f3e792c3cf8137b5a1d990eaad57"}
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.315066 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx" event={"ID":"88f30d02-82a4-4eb5-8a25-ead90d3f76a8","Type":"ContainerDied","Data":"2a5553e0cba081e13c08508de271a4c67f8dfe313cd8341b203e262c76bc0e7f"}
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.315097 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5553e0cba081e13c08508de271a4c67f8dfe313cd8341b203e262c76bc0e7f"
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.315115 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.395566 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2"]
Mar 18 16:30:04 crc kubenswrapper[4939]: I0318 16:30:04.400123 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-w5hx2"]
Mar 18 16:30:05 crc kubenswrapper[4939]: I0318 16:30:05.324104 4939 generic.go:334] "Generic (PLEG): container finished" podID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerID="bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e" exitCode=0
Mar 18 16:30:05 crc kubenswrapper[4939]: I0318 16:30:05.324199 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8bp" event={"ID":"b2c0277e-ec35-44d4-9ee8-073d6a82e377","Type":"ContainerDied","Data":"bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e"}
Mar 18 16:30:05 crc kubenswrapper[4939]: I0318 16:30:05.586685 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-9ssdk"
Mar 18 16:30:05 crc kubenswrapper[4939]: I0318 16:30:05.748308 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkbbx\" (UniqueName: \"kubernetes.io/projected/91c7d227-304b-4e86-92ab-ee9764b97d49-kube-api-access-rkbbx\") pod \"91c7d227-304b-4e86-92ab-ee9764b97d49\" (UID: \"91c7d227-304b-4e86-92ab-ee9764b97d49\") "
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:30:05 crc kubenswrapper[4939]: I0318 16:30:05.850068 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkbbx\" (UniqueName: \"kubernetes.io/projected/91c7d227-304b-4e86-92ab-ee9764b97d49-kube-api-access-rkbbx\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.028101 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wq6xx"] Mar 18 16:30:06 crc kubenswrapper[4939]: E0318 16:30:06.028706 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c7d227-304b-4e86-92ab-ee9764b97d49" containerName="oc" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.028790 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c7d227-304b-4e86-92ab-ee9764b97d49" containerName="oc" Mar 18 16:30:06 crc kubenswrapper[4939]: E0318 16:30:06.028866 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f30d02-82a4-4eb5-8a25-ead90d3f76a8" containerName="collect-profiles" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.028986 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f30d02-82a4-4eb5-8a25-ead90d3f76a8" containerName="collect-profiles" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.029185 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f30d02-82a4-4eb5-8a25-ead90d3f76a8" containerName="collect-profiles" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.029293 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c7d227-304b-4e86-92ab-ee9764b97d49" containerName="oc" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.030376 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.044346 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wq6xx"] Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.149306 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b727364-2e38-4dd3-9a89-fc571104ebe9" path="/var/lib/kubelet/pods/5b727364-2e38-4dd3-9a89-fc571104ebe9/volumes" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.153562 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-catalog-content\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.153861 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pthj5\" (UniqueName: \"kubernetes.io/projected/114d30cb-4856-4d24-919a-1726f288db27-kube-api-access-pthj5\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.153967 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-utilities\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.255219 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-utilities\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.255659 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-catalog-content\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.255840 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pthj5\" (UniqueName: \"kubernetes.io/projected/114d30cb-4856-4d24-919a-1726f288db27-kube-api-access-pthj5\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.255965 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-catalog-content\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.255733 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-utilities\") pod \"redhat-operators-wq6xx\" 
(UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.274344 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pthj5\" (UniqueName: \"kubernetes.io/projected/114d30cb-4856-4d24-919a-1726f288db27-kube-api-access-pthj5\") pod \"redhat-operators-wq6xx\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.331879 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-9ssdk" event={"ID":"91c7d227-304b-4e86-92ab-ee9764b97d49","Type":"ContainerDied","Data":"74f0b2b5e796fbe7b2d3b44bcc47f16cbb93f4d245e4876bd91c7d2a4b5d0f97"} Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.331930 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74f0b2b5e796fbe7b2d3b44bcc47f16cbb93f4d245e4876bd91c7d2a4b5d0f97" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.331933 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-9ssdk" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.349536 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.655514 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-m9zvz"] Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.665291 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-m9zvz"] Mar 18 16:30:06 crc kubenswrapper[4939]: I0318 16:30:06.692234 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wq6xx"] Mar 18 16:30:07 crc kubenswrapper[4939]: I0318 16:30:07.133412 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:30:07 crc kubenswrapper[4939]: E0318 16:30:07.133681 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:30:07 crc kubenswrapper[4939]: I0318 16:30:07.340378 4939 generic.go:334] "Generic (PLEG): container finished" podID="114d30cb-4856-4d24-919a-1726f288db27" containerID="55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3" exitCode=0 Mar 18 16:30:07 crc kubenswrapper[4939]: I0318 16:30:07.340548 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wq6xx" event={"ID":"114d30cb-4856-4d24-919a-1726f288db27","Type":"ContainerDied","Data":"55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3"} Mar 18 16:30:07 crc kubenswrapper[4939]: I0318 16:30:07.340702 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wq6xx" event={"ID":"114d30cb-4856-4d24-919a-1726f288db27","Type":"ContainerStarted","Data":"045942dcb83c57e1dd3f0fb133a46c8ed001ca48998ad0674830e956713c19c1"} Mar 18 16:30:07 crc kubenswrapper[4939]: 
I0318 16:30:07.342991 4939 generic.go:334] "Generic (PLEG): container finished" podID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerID="a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992" exitCode=0 Mar 18 16:30:07 crc kubenswrapper[4939]: I0318 16:30:07.343076 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8bp" event={"ID":"b2c0277e-ec35-44d4-9ee8-073d6a82e377","Type":"ContainerDied","Data":"a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992"} Mar 18 16:30:08 crc kubenswrapper[4939]: I0318 16:30:08.141973 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d526b9-f2ae-411d-8997-2ad1c8080bc6" path="/var/lib/kubelet/pods/a2d526b9-f2ae-411d-8997-2ad1c8080bc6/volumes" Mar 18 16:30:08 crc kubenswrapper[4939]: I0318 16:30:08.355013 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8bp" event={"ID":"b2c0277e-ec35-44d4-9ee8-073d6a82e377","Type":"ContainerStarted","Data":"b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397"} Mar 18 16:30:08 crc kubenswrapper[4939]: I0318 16:30:08.379753 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4s8bp" podStartSLOduration=2.756535638 podStartE2EDuration="5.379736595s" podCreationTimestamp="2026-03-18 16:30:03 +0000 UTC" firstStartedPulling="2026-03-18 16:30:05.325528134 +0000 UTC m=+3169.924715755" lastFinishedPulling="2026-03-18 16:30:07.948729091 +0000 UTC m=+3172.547916712" observedRunningTime="2026-03-18 16:30:08.375859344 +0000 UTC m=+3172.975046965" watchObservedRunningTime="2026-03-18 16:30:08.379736595 +0000 UTC m=+3172.978924216" Mar 18 16:30:13 crc kubenswrapper[4939]: I0318 16:30:13.746018 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:13 crc kubenswrapper[4939]: I0318 16:30:13.746548 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:13 crc kubenswrapper[4939]: I0318 16:30:13.797728 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:14 crc kubenswrapper[4939]: I0318 16:30:14.449080 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4s8bp" Mar 18 16:30:14 crc kubenswrapper[4939]: I0318 16:30:14.494498 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4s8bp"] Mar 18 16:30:16 crc kubenswrapper[4939]: I0318 16:30:16.431409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wq6xx" event={"ID":"114d30cb-4856-4d24-919a-1726f288db27","Type":"ContainerStarted","Data":"31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed"} Mar 18 16:30:16 crc kubenswrapper[4939]: I0318 16:30:16.431566 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4s8bp" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="registry-server" containerID="cri-o://b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397" gracePeriod=2 Mar 18 16:30:16 crc kubenswrapper[4939]: I0318 16:30:16.881530 4939 util.go:48] "No ready sandbox for pod can be found. 
Mar 18 16:30:16 crc kubenswrapper[4939]: I0318 16:30:16.881530 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.013148 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-catalog-content\") pod \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") "
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.013285 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrt6\" (UniqueName: \"kubernetes.io/projected/b2c0277e-ec35-44d4-9ee8-073d6a82e377-kube-api-access-hcrt6\") pod \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") "
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.013336 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-utilities\") pod \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\" (UID: \"b2c0277e-ec35-44d4-9ee8-073d6a82e377\") "
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.014533 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-utilities" (OuterVolumeSpecName: "utilities") pod "b2c0277e-ec35-44d4-9ee8-073d6a82e377" (UID: "b2c0277e-ec35-44d4-9ee8-073d6a82e377"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.021829 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c0277e-ec35-44d4-9ee8-073d6a82e377-kube-api-access-hcrt6" (OuterVolumeSpecName: "kube-api-access-hcrt6") pod "b2c0277e-ec35-44d4-9ee8-073d6a82e377" (UID: "b2c0277e-ec35-44d4-9ee8-073d6a82e377"). InnerVolumeSpecName "kube-api-access-hcrt6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.115938 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrt6\" (UniqueName: \"kubernetes.io/projected/b2c0277e-ec35-44d4-9ee8-073d6a82e377-kube-api-access-hcrt6\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.116021 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.441324 4939 generic.go:334] "Generic (PLEG): container finished" podID="114d30cb-4856-4d24-919a-1726f288db27" containerID="31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed" exitCode=0
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.441398 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wq6xx" event={"ID":"114d30cb-4856-4d24-919a-1726f288db27","Type":"ContainerDied","Data":"31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed"}
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.443402 4939 generic.go:334] "Generic (PLEG): container finished" podID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerID="b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397" exitCode=0
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.443425 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8bp" event={"ID":"b2c0277e-ec35-44d4-9ee8-073d6a82e377","Type":"ContainerDied","Data":"b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397"}
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.443448 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8bp" event={"ID":"b2c0277e-ec35-44d4-9ee8-073d6a82e377","Type":"ContainerDied","Data":"abe59f1b219e63bfa01744cf8944044548a7f3e792c3cf8137b5a1d990eaad57"}
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.443465 4939 scope.go:117] "RemoveContainer" containerID="b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.443471 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s8bp"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.468436 4939 scope.go:117] "RemoveContainer" containerID="a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.486697 4939 scope.go:117] "RemoveContainer" containerID="bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.518719 4939 scope.go:117] "RemoveContainer" containerID="b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397"
Mar 18 16:30:17 crc kubenswrapper[4939]: E0318 16:30:17.519414 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397\": container with ID starting with b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397 not found: ID does not exist" containerID="b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.519454 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397"} err="failed to get container status \"b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397\": rpc error: code = NotFound desc = could not find container \"b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397\": container with ID starting with b116f19d8fa84c3c9c6e79a84be0b0f0eb63a8e6e8b0168abaaf6f69e18f4397 not found: ID does not exist"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.519482 4939 scope.go:117] "RemoveContainer" containerID="a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992"
Mar 18 16:30:17 crc kubenswrapper[4939]: E0318 16:30:17.519819 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992\": container with ID starting with a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992 not found: ID does not exist" containerID="a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.519843 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992"} err="failed to get container status \"a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992\": rpc error: code = NotFound desc = could not find container \"a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992\": container with ID starting with a0aedce231982f6e58cfec081d1ea752e452bbac06c942d4ab6f63f98c03c992 not found: ID does not exist"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.519857 4939 scope.go:117] "RemoveContainer" containerID="bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e"
Mar 18 16:30:17 crc kubenswrapper[4939]: E0318 16:30:17.520058 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e\": container with ID starting with bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e not found: ID does not exist" containerID="bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e"
Mar 18 16:30:17 crc kubenswrapper[4939]: I0318 16:30:17.520074 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e"} err="failed to get container status \"bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e\": rpc error: code = NotFound desc = could not find container \"bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e\": container with ID starting with bff99391699cf8445bfb50647b4b424556b3ae3d7d61621fa506612473aa265e not found: ID does not exist"
Mar 18 16:30:18 crc kubenswrapper[4939]: I0318 16:30:18.061197 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2c0277e-ec35-44d4-9ee8-073d6a82e377" (UID: "b2c0277e-ec35-44d4-9ee8-073d6a82e377"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:30:18 crc kubenswrapper[4939]: I0318 16:30:18.131998 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c0277e-ec35-44d4-9ee8-073d6a82e377-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:18 crc kubenswrapper[4939]: I0318 16:30:18.371649 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4s8bp"]
Mar 18 16:30:18 crc kubenswrapper[4939]: I0318 16:30:18.385990 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4s8bp"]
Mar 18 16:30:18 crc kubenswrapper[4939]: I0318 16:30:18.451217 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wq6xx" event={"ID":"114d30cb-4856-4d24-919a-1726f288db27","Type":"ContainerStarted","Data":"792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e"}
Mar 18 16:30:18 crc kubenswrapper[4939]: I0318 16:30:18.482344 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wq6xx" podStartSLOduration=2.734769488 podStartE2EDuration="13.482320737s" podCreationTimestamp="2026-03-18 16:30:05 +0000 UTC" firstStartedPulling="2026-03-18 16:30:07.342625312 +0000 UTC m=+3171.941812933" lastFinishedPulling="2026-03-18 16:30:18.090176561 +0000 UTC m=+3182.689364182" observedRunningTime="2026-03-18 16:30:18.479428565 +0000 UTC m=+3183.078616186" watchObservedRunningTime="2026-03-18 16:30:18.482320737 +0000 UTC m=+3183.081508368"
Mar 18 16:30:19 crc kubenswrapper[4939]: I0318 16:30:19.134249 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:30:19 crc kubenswrapper[4939]: E0318 16:30:19.134489 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:30:20 crc kubenswrapper[4939]: I0318 16:30:20.140689 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" path="/var/lib/kubelet/pods/b2c0277e-ec35-44d4-9ee8-073d6a82e377/volumes"
Mar 18 16:30:26 crc kubenswrapper[4939]: I0318 16:30:26.349683 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wq6xx"
Mar 18 16:30:26 crc kubenswrapper[4939]: I0318 16:30:26.350017 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wq6xx"
Mar 18 16:30:26 crc kubenswrapper[4939]: I0318 16:30:26.390023 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wq6xx"
Mar 18 16:30:26 crc kubenswrapper[4939]: I0318 16:30:26.553870 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wq6xx"
Mar 18 16:30:26 crc kubenswrapper[4939]: I0318 16:30:26.613678 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wq6xx"]
Mar 18 16:30:26 crc kubenswrapper[4939]: I0318 16:30:26.661516 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7ktrn"]
Mar 18 16:30:26 crc kubenswrapper[4939]: I0318 16:30:26.661775 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7ktrn" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="registry-server" containerID="cri-o://09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a" gracePeriod=2
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.093429 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ktrn"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.161402 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-utilities\") pod \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") "
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.161453 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-catalog-content\") pod \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") "
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.161482 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbbgc\" (UniqueName: \"kubernetes.io/projected/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-kube-api-access-xbbgc\") pod \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\" (UID: \"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b\") "
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.162867 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-utilities" (OuterVolumeSpecName: "utilities") pod "26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" (UID: "26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.172731 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-kube-api-access-xbbgc" (OuterVolumeSpecName: "kube-api-access-xbbgc") pod "26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" (UID: "26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b"). InnerVolumeSpecName "kube-api-access-xbbgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.262829 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.262869 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbbgc\" (UniqueName: \"kubernetes.io/projected/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-kube-api-access-xbbgc\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.301739 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" (UID: "26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.363998 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.513691 4939 generic.go:334] "Generic (PLEG): container finished" podID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerID="09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a" exitCode=0
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.513747 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ktrn"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.513805 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ktrn" event={"ID":"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b","Type":"ContainerDied","Data":"09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a"}
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.513831 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ktrn" event={"ID":"26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b","Type":"ContainerDied","Data":"9e8b20101fa462f65408b3aedf8835a3a356ed350c5b73f67319c022b9831cc4"}
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.513847 4939 scope.go:117] "RemoveContainer" containerID="09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.536091 4939 scope.go:117] "RemoveContainer" containerID="6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.557592 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7ktrn"]
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.568784 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7ktrn"]
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.573666 4939 scope.go:117] "RemoveContainer" containerID="eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.599644 4939 scope.go:117] "RemoveContainer" containerID="09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a"
Mar 18 16:30:27 crc kubenswrapper[4939]: E0318 16:30:27.603638 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a\": container with ID starting with 09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a not found: ID does not exist" containerID="09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.603680 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a"} err="failed to get container status \"09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a\": rpc error: code = NotFound desc = could not find container \"09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a\": container with ID starting with 09ed0a082f757372d98ac660077cf55194bf3e19daa9f0939fb4492b75122a6a not found: ID does not exist"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.603704 4939 scope.go:117] "RemoveContainer" containerID="6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e"
Mar 18 16:30:27 crc kubenswrapper[4939]: E0318 16:30:27.607632 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e\": container with ID starting with 6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e not found: ID does not exist" containerID="6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.607670 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e"} err="failed to get container status \"6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e\": rpc error: code = NotFound desc = could not find container \"6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e\": container with ID starting with 6e02450c9af15a6afc765c1a704e2a63e3d5cd5d6fca7b6e1840dd424cfb098e not found: ID does not exist"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.607696 4939 scope.go:117] "RemoveContainer" containerID="eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237"
Mar 18 16:30:27 crc kubenswrapper[4939]: E0318 16:30:27.612740 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237\": container with ID starting with eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237 not found: ID does not exist" containerID="eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237"
Mar 18 16:30:27 crc kubenswrapper[4939]: I0318 16:30:27.612778 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237"} err="failed to get container status \"eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237\": rpc error: code = NotFound desc = could not find container \"eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237\": container with ID starting with eaf7a022f83822379a7448d84dcd8044d3b36ffd1dc6b026cb536cb9c0ae4237 not found: ID does not exist"
Mar 18 16:30:28 crc kubenswrapper[4939]: I0318 16:30:28.142012 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" path="/var/lib/kubelet/pods/26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b/volumes"
Mar 18 16:30:34 crc kubenswrapper[4939]: I0318 16:30:34.132972 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:30:34 crc kubenswrapper[4939]: E0318 16:30:34.133940 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:30:45 crc kubenswrapper[4939]: I0318 16:30:45.134055 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:30:45 crc kubenswrapper[4939]: E0318 16:30:45.134890 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:30:56 crc kubenswrapper[4939]: I0318 16:30:56.141140 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:30:56 crc kubenswrapper[4939]: E0318 16:30:56.143816 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:30:58 crc kubenswrapper[4939]: I0318 16:30:58.432576 4939 scope.go:117] "RemoveContainer" containerID="f588d6951f7649ae8a119e35d427a2cf8840360e83fcde7cefdbd2455cb7316c"
Mar 18 16:30:58 crc kubenswrapper[4939]: I0318 16:30:58.458304 4939 scope.go:117] "RemoveContainer" containerID="48e5a58761095e5a2f8871c8fd2fa0bdf379c1e833d624ec60aff75d444ddb1d"
Mar 18 16:31:09 crc kubenswrapper[4939]: I0318 16:31:09.133643 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:31:09 crc kubenswrapper[4939]: E0318 16:31:09.134693 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:31:21 crc kubenswrapper[4939]: I0318 16:31:21.133027 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:31:21 crc kubenswrapper[4939]: E0318 16:31:21.134345 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:31:32 crc kubenswrapper[4939]: I0318 16:31:32.133224 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:31:32 crc kubenswrapper[4939]: E0318 16:31:32.133749 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:31:46 crc kubenswrapper[4939]: I0318 16:31:46.149539 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:31:46 crc kubenswrapper[4939]: E0318 16:31:46.150413 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.134538 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
Mar 18 16:32:00 crc kubenswrapper[4939]: E0318 16:32:00.135574 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.200866 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564192-jcjjt"]
Mar 18 16:32:00 crc kubenswrapper[4939]: E0318 16:32:00.201276 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="extract-utilities"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201294 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="extract-utilities"
Mar 18 16:32:00 crc kubenswrapper[4939]: E0318 16:32:00.201308 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="extract-utilities"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201315 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="extract-utilities"
Mar 18 16:32:00 crc kubenswrapper[4939]: E0318 16:32:00.201328 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="extract-content"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201335 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="extract-content"
Mar 18 16:32:00 crc kubenswrapper[4939]: E0318 16:32:00.201346 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="extract-content"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201351 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="extract-content"
Mar 18 16:32:00 crc kubenswrapper[4939]: E0318 16:32:00.201367 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="registry-server"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201373 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="registry-server"
Mar 18 16:32:00 crc kubenswrapper[4939]: E0318 16:32:00.201387 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="registry-server"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201392 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="registry-server"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201565 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b347a9-d93c-4eaa-9aad-3ccb05c3ab9b" containerName="registry-server"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.201581 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c0277e-ec35-44d4-9ee8-073d6a82e377" containerName="registry-server"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.202131 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-jcjjt"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.203965 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.204098 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.212884 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-jcjjt"]
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.213332 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.323092 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/d6e38eb7-50d0-408d-a6a5-62342027ab1c-kube-api-access-fltmt\") pod \"auto-csr-approver-29564192-jcjjt\" (UID: \"d6e38eb7-50d0-408d-a6a5-62342027ab1c\") " pod="openshift-infra/auto-csr-approver-29564192-jcjjt"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.424042 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/d6e38eb7-50d0-408d-a6a5-62342027ab1c-kube-api-access-fltmt\") pod \"auto-csr-approver-29564192-jcjjt\" (UID: \"d6e38eb7-50d0-408d-a6a5-62342027ab1c\") " pod="openshift-infra/auto-csr-approver-29564192-jcjjt"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.443251 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/d6e38eb7-50d0-408d-a6a5-62342027ab1c-kube-api-access-fltmt\") pod \"auto-csr-approver-29564192-jcjjt\" (UID: \"d6e38eb7-50d0-408d-a6a5-62342027ab1c\") " pod="openshift-infra/auto-csr-approver-29564192-jcjjt"
Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.528062 4939 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.937737 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-jcjjt"] Mar 18 16:32:00 crc kubenswrapper[4939]: I0318 16:32:00.946835 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:32:01 crc kubenswrapper[4939]: I0318 16:32:01.325967 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" event={"ID":"d6e38eb7-50d0-408d-a6a5-62342027ab1c","Type":"ContainerStarted","Data":"7c47ab392f43d81f3d6fdeddb2cb49c6e9381eec0aedf50a6aece002e63b249a"} Mar 18 16:32:02 crc kubenswrapper[4939]: I0318 16:32:02.333725 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" event={"ID":"d6e38eb7-50d0-408d-a6a5-62342027ab1c","Type":"ContainerStarted","Data":"d1ae871fba1529f59ed865ace3e7738387efc0558e4810fc8c5f87940097207c"} Mar 18 16:32:02 crc kubenswrapper[4939]: I0318 16:32:02.358343 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" podStartSLOduration=1.271168316 podStartE2EDuration="2.358325708s" podCreationTimestamp="2026-03-18 16:32:00 +0000 UTC" firstStartedPulling="2026-03-18 16:32:00.946636799 +0000 UTC m=+3285.545824420" lastFinishedPulling="2026-03-18 16:32:02.033794161 +0000 UTC m=+3286.632981812" observedRunningTime="2026-03-18 16:32:02.352741339 +0000 UTC m=+3286.951928960" watchObservedRunningTime="2026-03-18 16:32:02.358325708 +0000 UTC m=+3286.957513329" Mar 18 16:32:03 crc kubenswrapper[4939]: I0318 16:32:03.347277 4939 generic.go:334] "Generic (PLEG): container finished" podID="d6e38eb7-50d0-408d-a6a5-62342027ab1c" containerID="d1ae871fba1529f59ed865ace3e7738387efc0558e4810fc8c5f87940097207c" exitCode=0 Mar 18 16:32:03 crc kubenswrapper[4939]: I0318 16:32:03.347984 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" event={"ID":"d6e38eb7-50d0-408d-a6a5-62342027ab1c","Type":"ContainerDied","Data":"d1ae871fba1529f59ed865ace3e7738387efc0558e4810fc8c5f87940097207c"} Mar 18 16:32:04 crc kubenswrapper[4939]: I0318 16:32:04.619865 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" Mar 18 16:32:04 crc kubenswrapper[4939]: I0318 16:32:04.788182 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/d6e38eb7-50d0-408d-a6a5-62342027ab1c-kube-api-access-fltmt\") pod \"d6e38eb7-50d0-408d-a6a5-62342027ab1c\" (UID: \"d6e38eb7-50d0-408d-a6a5-62342027ab1c\") " Mar 18 16:32:04 crc kubenswrapper[4939]: I0318 16:32:04.795803 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e38eb7-50d0-408d-a6a5-62342027ab1c-kube-api-access-fltmt" (OuterVolumeSpecName: "kube-api-access-fltmt") pod "d6e38eb7-50d0-408d-a6a5-62342027ab1c" (UID: "d6e38eb7-50d0-408d-a6a5-62342027ab1c"). InnerVolumeSpecName "kube-api-access-fltmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:32:04 crc kubenswrapper[4939]: I0318 16:32:04.889594 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/d6e38eb7-50d0-408d-a6a5-62342027ab1c-kube-api-access-fltmt\") on node \"crc\" DevicePath \"\"" Mar 18 16:32:05 crc kubenswrapper[4939]: I0318 16:32:05.366959 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" event={"ID":"d6e38eb7-50d0-408d-a6a5-62342027ab1c","Type":"ContainerDied","Data":"7c47ab392f43d81f3d6fdeddb2cb49c6e9381eec0aedf50a6aece002e63b249a"} Mar 18 16:32:05 crc kubenswrapper[4939]: I0318 16:32:05.366993 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c47ab392f43d81f3d6fdeddb2cb49c6e9381eec0aedf50a6aece002e63b249a" Mar 18 16:32:05 crc kubenswrapper[4939]: I0318 16:32:05.367226 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-jcjjt" Mar 18 16:32:05 crc kubenswrapper[4939]: I0318 16:32:05.430810 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-rb2qt"] Mar 18 16:32:05 crc kubenswrapper[4939]: I0318 16:32:05.436394 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-rb2qt"] Mar 18 16:32:06 crc kubenswrapper[4939]: I0318 16:32:06.148828 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2065de30-02ef-4491-8bf7-6f47245e108d" path="/var/lib/kubelet/pods/2065de30-02ef-4491-8bf7-6f47245e108d/volumes" Mar 18 16:32:12 crc kubenswrapper[4939]: I0318 16:32:12.133480 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:32:12 crc kubenswrapper[4939]: E0318 16:32:12.134360 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:32:23 crc kubenswrapper[4939]: I0318 16:32:23.133918 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:32:23 crc kubenswrapper[4939]: E0318 16:32:23.136124 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:32:36 crc kubenswrapper[4939]: I0318 16:32:36.136593 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:32:36 crc kubenswrapper[4939]: E0318 16:32:36.137240 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:32:51 crc kubenswrapper[4939]: I0318 16:32:51.132783 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:32:51 crc kubenswrapper[4939]: E0318 16:32:51.133634 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:32:58 crc kubenswrapper[4939]: I0318 16:32:58.597206 4939 scope.go:117] "RemoveContainer" containerID="b231b6c490bc894ed2afb15e35926299e2b250c7c4fa7fdf371ce58d65c6c825" Mar 18 16:33:03 crc kubenswrapper[4939]: I0318 16:33:03.133252 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:33:03 crc kubenswrapper[4939]: E0318 16:33:03.133934 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:33:14 crc kubenswrapper[4939]: I0318 16:33:14.134087 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:33:14 crc kubenswrapper[4939]: E0318 16:33:14.135196 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:33:28 crc kubenswrapper[4939]: I0318 16:33:28.133839 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:33:28 crc kubenswrapper[4939]: E0318 16:33:28.135873 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:33:40 crc kubenswrapper[4939]: I0318 16:33:40.133466 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:33:40 crc kubenswrapper[4939]: E0318 16:33:40.134196 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:33:55 crc kubenswrapper[4939]: I0318 16:33:55.133702 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7" Mar 18 16:33:56 crc kubenswrapper[4939]: I0318 16:33:56.187395 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"8c36e7a41cd93fb9a6ef202a231fd9246686f8ff40738a83d129f3e5d9da718e"} Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.143298 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fgjcr"] Mar 18 16:34:00 crc kubenswrapper[4939]: E0318 16:34:00.144222 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e38eb7-50d0-408d-a6a5-62342027ab1c" containerName="oc" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.144245 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e38eb7-50d0-408d-a6a5-62342027ab1c" containerName="oc" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.144571 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e38eb7-50d0-408d-a6a5-62342027ab1c" containerName="oc" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.145291 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fgjcr" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.149994 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.150084 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.150487 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.165868 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fgjcr"] Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.291276 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cx7\" (UniqueName: \"kubernetes.io/projected/01267d3e-0916-4ed8-8741-b5bb5213a282-kube-api-access-56cx7\") pod \"auto-csr-approver-29564194-fgjcr\" (UID: \"01267d3e-0916-4ed8-8741-b5bb5213a282\") " pod="openshift-infra/auto-csr-approver-29564194-fgjcr" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.392424 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cx7\" (UniqueName: \"kubernetes.io/projected/01267d3e-0916-4ed8-8741-b5bb5213a282-kube-api-access-56cx7\") pod \"auto-csr-approver-29564194-fgjcr\" (UID: \"01267d3e-0916-4ed8-8741-b5bb5213a282\") " pod="openshift-infra/auto-csr-approver-29564194-fgjcr" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.417092 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cx7\" (UniqueName: \"kubernetes.io/projected/01267d3e-0916-4ed8-8741-b5bb5213a282-kube-api-access-56cx7\") pod \"auto-csr-approver-29564194-fgjcr\" (UID: 
\"01267d3e-0916-4ed8-8741-b5bb5213a282\") " pod="openshift-infra/auto-csr-approver-29564194-fgjcr" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.469921 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fgjcr" Mar 18 16:34:00 crc kubenswrapper[4939]: I0318 16:34:00.895143 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fgjcr"] Mar 18 16:34:01 crc kubenswrapper[4939]: I0318 16:34:01.233216 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-fgjcr" event={"ID":"01267d3e-0916-4ed8-8741-b5bb5213a282","Type":"ContainerStarted","Data":"83ce3502fb593c9b7eda241567b69b4d88896f086cb4c5bdd126f80f3e72c6fa"} Mar 18 16:34:03 crc kubenswrapper[4939]: I0318 16:34:03.259362 4939 generic.go:334] "Generic (PLEG): container finished" podID="01267d3e-0916-4ed8-8741-b5bb5213a282" containerID="2bbe88202b86905d14190da2ebbc097d458f94b020dfa498d5e23e7362f82ddb" exitCode=0 Mar 18 16:34:03 crc kubenswrapper[4939]: I0318 16:34:03.259471 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-fgjcr" event={"ID":"01267d3e-0916-4ed8-8741-b5bb5213a282","Type":"ContainerDied","Data":"2bbe88202b86905d14190da2ebbc097d458f94b020dfa498d5e23e7362f82ddb"} Mar 18 16:34:04 crc kubenswrapper[4939]: I0318 16:34:04.595434 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fgjcr" Mar 18 16:34:04 crc kubenswrapper[4939]: I0318 16:34:04.753592 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56cx7\" (UniqueName: \"kubernetes.io/projected/01267d3e-0916-4ed8-8741-b5bb5213a282-kube-api-access-56cx7\") pod \"01267d3e-0916-4ed8-8741-b5bb5213a282\" (UID: \"01267d3e-0916-4ed8-8741-b5bb5213a282\") " Mar 18 16:34:04 crc kubenswrapper[4939]: I0318 16:34:04.760129 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01267d3e-0916-4ed8-8741-b5bb5213a282-kube-api-access-56cx7" (OuterVolumeSpecName: "kube-api-access-56cx7") pod "01267d3e-0916-4ed8-8741-b5bb5213a282" (UID: "01267d3e-0916-4ed8-8741-b5bb5213a282"). InnerVolumeSpecName "kube-api-access-56cx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:34:04 crc kubenswrapper[4939]: I0318 16:34:04.854769 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56cx7\" (UniqueName: \"kubernetes.io/projected/01267d3e-0916-4ed8-8741-b5bb5213a282-kube-api-access-56cx7\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:05 crc kubenswrapper[4939]: I0318 16:34:05.274886 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-fgjcr" event={"ID":"01267d3e-0916-4ed8-8741-b5bb5213a282","Type":"ContainerDied","Data":"83ce3502fb593c9b7eda241567b69b4d88896f086cb4c5bdd126f80f3e72c6fa"} Mar 18 16:34:05 crc kubenswrapper[4939]: I0318 16:34:05.275246 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83ce3502fb593c9b7eda241567b69b4d88896f086cb4c5bdd126f80f3e72c6fa" Mar 18 16:34:05 crc kubenswrapper[4939]: I0318 16:34:05.275382 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fgjcr" Mar 18 16:34:05 crc kubenswrapper[4939]: I0318 16:34:05.663546 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-97xxb"] Mar 18 16:34:05 crc kubenswrapper[4939]: I0318 16:34:05.668622 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-97xxb"] Mar 18 16:34:06 crc kubenswrapper[4939]: I0318 16:34:06.140718 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bbcaade-e113-4477-8e86-8e38c7a665ed" path="/var/lib/kubelet/pods/6bbcaade-e113-4477-8e86-8e38c7a665ed/volumes" Mar 18 16:34:58 crc kubenswrapper[4939]: I0318 16:34:58.675695 4939 scope.go:117] "RemoveContainer" containerID="4baf30a34baf75f04ad855951b86916ea04daa07d6b27d108cb89a7c22ec4a3e" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.340099 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-48wzm"] Mar 18 16:35:37 crc kubenswrapper[4939]: E0318 16:35:37.341200 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01267d3e-0916-4ed8-8741-b5bb5213a282" containerName="oc" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.341223 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="01267d3e-0916-4ed8-8741-b5bb5213a282" containerName="oc" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.341480 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="01267d3e-0916-4ed8-8741-b5bb5213a282" containerName="oc" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.343281 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.369654 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48wzm"] Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.497802 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp8n5\" (UniqueName: \"kubernetes.io/projected/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-kube-api-access-gp8n5\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.498219 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-catalog-content\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.498325 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-utilities\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.600102 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-utilities\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " 
pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.600237 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp8n5\" (UniqueName: \"kubernetes.io/projected/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-kube-api-access-gp8n5\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.600276 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-catalog-content\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.600736 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-utilities\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.600901 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-catalog-content\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.623415 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp8n5\" (UniqueName: \"kubernetes.io/projected/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-kube-api-access-gp8n5\") pod \"community-operators-48wzm\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:37 crc kubenswrapper[4939]: I0318 16:35:37.698897 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.270793 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48wzm"] Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.727404 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fwjjv"] Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.728829 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.739398 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwjjv"] Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.855903 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vjcn\" (UniqueName: \"kubernetes.io/projected/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-kube-api-access-5vjcn\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.855962 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-utilities\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.856140 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-catalog-content\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.957139 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vjcn\" (UniqueName: \"kubernetes.io/projected/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-kube-api-access-5vjcn\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.957240 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-utilities\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.957339 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-catalog-content\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.957986 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-catalog-content\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.957998 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-utilities\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.984046 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5vjcn\" (UniqueName: \"kubernetes.io/projected/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-kube-api-access-5vjcn\") pod \"redhat-marketplace-fwjjv\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.992613 4939 generic.go:334] "Generic (PLEG): container finished" podID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerID="64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054" exitCode=0 Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.992669 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48wzm" event={"ID":"5ffd9ca6-36a2-4265-a1f9-51a3d593981f","Type":"ContainerDied","Data":"64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054"} Mar 18 16:35:38 crc kubenswrapper[4939]: I0318 16:35:38.992701 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48wzm" event={"ID":"5ffd9ca6-36a2-4265-a1f9-51a3d593981f","Type":"ContainerStarted","Data":"fd1e7c930092c144088949f937530a1fea91808a950365015656c1927dc02ae3"} Mar 18 16:35:39 crc kubenswrapper[4939]: I0318 16:35:39.047037 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:39 crc kubenswrapper[4939]: I0318 16:35:39.373773 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwjjv"] Mar 18 16:35:40 crc kubenswrapper[4939]: I0318 16:35:40.002793 4939 generic.go:334] "Generic (PLEG): container finished" podID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerID="387ca7abbf7a62b89819748d448ac0b3a440a93aace3b12125c75d65b6d7814b" exitCode=0 Mar 18 16:35:40 crc kubenswrapper[4939]: I0318 16:35:40.002976 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwjjv" event={"ID":"dcaac3dc-3ea8-484f-b758-9ee23263c7bd","Type":"ContainerDied","Data":"387ca7abbf7a62b89819748d448ac0b3a440a93aace3b12125c75d65b6d7814b"} Mar 18 16:35:40 crc kubenswrapper[4939]: I0318 16:35:40.003115 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwjjv" event={"ID":"dcaac3dc-3ea8-484f-b758-9ee23263c7bd","Type":"ContainerStarted","Data":"ac50f5a3965874cc749915e580f3b09e45bfef7ad48de7756e17b5cc65c95c34"} Mar 18 16:35:41 crc kubenswrapper[4939]: I0318 16:35:41.012575 4939 generic.go:334] "Generic (PLEG): container finished" podID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerID="73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae" exitCode=0 Mar 18 16:35:41 crc kubenswrapper[4939]: I0318 16:35:41.012658 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48wzm" event={"ID":"5ffd9ca6-36a2-4265-a1f9-51a3d593981f","Type":"ContainerDied","Data":"73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae"} Mar 18 16:35:42 crc kubenswrapper[4939]: I0318 16:35:42.023714 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48wzm" event={"ID":"5ffd9ca6-36a2-4265-a1f9-51a3d593981f","Type":"ContainerStarted","Data":"23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325"} Mar 18 16:35:42 crc kubenswrapper[4939]: I0318 16:35:42.026038 4939 generic.go:334] "Generic (PLEG): container finished" podID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" 
containerID="0c56c8893c6c16a2ba2f8852c1af75be619fc9ae478cfdb765c8540cdc25a292" exitCode=0 Mar 18 16:35:42 crc kubenswrapper[4939]: I0318 16:35:42.026115 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwjjv" event={"ID":"dcaac3dc-3ea8-484f-b758-9ee23263c7bd","Type":"ContainerDied","Data":"0c56c8893c6c16a2ba2f8852c1af75be619fc9ae478cfdb765c8540cdc25a292"} Mar 18 16:35:42 crc kubenswrapper[4939]: I0318 16:35:42.053915 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-48wzm" podStartSLOduration=2.6055458 podStartE2EDuration="5.053893366s" podCreationTimestamp="2026-03-18 16:35:37 +0000 UTC" firstStartedPulling="2026-03-18 16:35:38.995275456 +0000 UTC m=+3503.594463067" lastFinishedPulling="2026-03-18 16:35:41.443623012 +0000 UTC m=+3506.042810633" observedRunningTime="2026-03-18 16:35:42.04944401 +0000 UTC m=+3506.648631631" watchObservedRunningTime="2026-03-18 16:35:42.053893366 +0000 UTC m=+3506.653080997" Mar 18 16:35:43 crc kubenswrapper[4939]: I0318 16:35:43.034202 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwjjv" event={"ID":"dcaac3dc-3ea8-484f-b758-9ee23263c7bd","Type":"ContainerStarted","Data":"52182c59fd9189bea81fdbe27ceffc4e7711cb2c717c0bded41afb18d09fc481"} Mar 18 16:35:43 crc kubenswrapper[4939]: I0318 16:35:43.055590 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fwjjv" podStartSLOduration=2.381945085 podStartE2EDuration="5.055566759s" podCreationTimestamp="2026-03-18 16:35:38 +0000 UTC" firstStartedPulling="2026-03-18 16:35:40.004074571 +0000 UTC m=+3504.603262192" lastFinishedPulling="2026-03-18 16:35:42.677696245 +0000 UTC m=+3507.276883866" observedRunningTime="2026-03-18 16:35:43.053226472 +0000 UTC m=+3507.652414093" watchObservedRunningTime="2026-03-18 16:35:43.055566759 +0000 UTC m=+3507.654754400" Mar 18 16:35:47 crc kubenswrapper[4939]: I0318 16:35:47.700158 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:47 crc kubenswrapper[4939]: I0318 16:35:47.700645 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:47 crc kubenswrapper[4939]: I0318 16:35:47.739918 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:48 crc kubenswrapper[4939]: I0318 16:35:48.103673 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:49 crc kubenswrapper[4939]: I0318 16:35:49.047366 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:49 crc kubenswrapper[4939]: I0318 16:35:49.047707 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:49 crc kubenswrapper[4939]: I0318 16:35:49.117109 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:49 crc kubenswrapper[4939]: I0318 16:35:49.118167 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-48wzm"] Mar 18 16:35:49 crc kubenswrapper[4939]: I0318 
16:35:49.165191 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:50 crc kubenswrapper[4939]: I0318 16:35:50.078380 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-48wzm" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="registry-server" containerID="cri-o://23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325" gracePeriod=2 Mar 18 16:35:50 crc kubenswrapper[4939]: I0318 16:35:50.958484 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.086947 4939 generic.go:334] "Generic (PLEG): container finished" podID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerID="23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325" exitCode=0 Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.087003 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48wzm" event={"ID":"5ffd9ca6-36a2-4265-a1f9-51a3d593981f","Type":"ContainerDied","Data":"23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325"} Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.087050 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48wzm" event={"ID":"5ffd9ca6-36a2-4265-a1f9-51a3d593981f","Type":"ContainerDied","Data":"fd1e7c930092c144088949f937530a1fea91808a950365015656c1927dc02ae3"} Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.087074 4939 scope.go:117] "RemoveContainer" containerID="23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.087015 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-48wzm" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.105462 4939 scope.go:117] "RemoveContainer" containerID="73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.125341 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-catalog-content\") pod \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.125688 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-utilities\") pod \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.125960 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp8n5\" (UniqueName: \"kubernetes.io/projected/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-kube-api-access-gp8n5\") pod \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\" (UID: \"5ffd9ca6-36a2-4265-a1f9-51a3d593981f\") " Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.126377 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-utilities" (OuterVolumeSpecName: "utilities") pod "5ffd9ca6-36a2-4265-a1f9-51a3d593981f" (UID: "5ffd9ca6-36a2-4265-a1f9-51a3d593981f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.126648 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.127138 4939 scope.go:117] "RemoveContainer" containerID="64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.131543 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-kube-api-access-gp8n5" (OuterVolumeSpecName: "kube-api-access-gp8n5") pod "5ffd9ca6-36a2-4265-a1f9-51a3d593981f" (UID: "5ffd9ca6-36a2-4265-a1f9-51a3d593981f"). InnerVolumeSpecName "kube-api-access-gp8n5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.179074 4939 scope.go:117] "RemoveContainer" containerID="23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325" Mar 18 16:35:51 crc kubenswrapper[4939]: E0318 16:35:51.180572 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325\": container with ID starting with 23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325 not found: ID does not exist" containerID="23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.180697 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325"} err="failed to get container status \"23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325\": rpc error: code = NotFound desc = could not find container \"23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325\": container with ID starting with 23d4fcdb83cedefb9c9e41949848d16fc99855cc7e67d1f7a04e80422226f325 not found: ID does not exist" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.180775 4939 scope.go:117] "RemoveContainer" containerID="73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae" Mar 18 16:35:51 crc kubenswrapper[4939]: E0318 16:35:51.181370 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae\": container with ID starting with 73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae not found: ID does not exist" containerID="73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.181418 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae"} err="failed to get container status \"73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae\": rpc error: code = NotFound desc = could not find container \"73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae\": container with ID starting with 73c03ed9301bfe0784e2c1966f55f508fa7e10287477dcfe530fb096904b04ae not found: ID does not exist" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.181446 4939 scope.go:117] "RemoveContainer" containerID="64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054" Mar 18 16:35:51 crc kubenswrapper[4939]: E0318 16:35:51.181889 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054\": container with ID starting with 64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054 not found: ID does not exist" containerID="64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.181924 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054"} err="failed to get container status \"64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054\": rpc error: code = NotFound desc = could not 
find container \"64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054\": container with ID starting with 64e8b8e04abbb40d95d913a57a6ff06d8e68e62b486693168537d4e08d6f3054 not found: ID does not exist" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.229137 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp8n5\" (UniqueName: \"kubernetes.io/projected/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-kube-api-access-gp8n5\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.520494 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwjjv"] Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.521417 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fwjjv" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="registry-server" containerID="cri-o://52182c59fd9189bea81fdbe27ceffc4e7711cb2c717c0bded41afb18d09fc481" gracePeriod=2 Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.528361 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ffd9ca6-36a2-4265-a1f9-51a3d593981f" (UID: "5ffd9ca6-36a2-4265-a1f9-51a3d593981f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.534454 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffd9ca6-36a2-4265-a1f9-51a3d593981f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.721250 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-48wzm"] Mar 18 16:35:51 crc kubenswrapper[4939]: I0318 16:35:51.728419 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-48wzm"] Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.097276 4939 generic.go:334] "Generic (PLEG): container finished" podID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerID="52182c59fd9189bea81fdbe27ceffc4e7711cb2c717c0bded41afb18d09fc481" exitCode=0 Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.097339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwjjv" event={"ID":"dcaac3dc-3ea8-484f-b758-9ee23263c7bd","Type":"ContainerDied","Data":"52182c59fd9189bea81fdbe27ceffc4e7711cb2c717c0bded41afb18d09fc481"} Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.143062 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" path="/var/lib/kubelet/pods/5ffd9ca6-36a2-4265-a1f9-51a3d593981f/volumes" Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.391733 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.546648 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vjcn\" (UniqueName: \"kubernetes.io/projected/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-kube-api-access-5vjcn\") pod \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.546997 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-utilities\") pod \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.547114 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-catalog-content\") pod \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\" (UID: \"dcaac3dc-3ea8-484f-b758-9ee23263c7bd\") " Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.547926 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-utilities" (OuterVolumeSpecName: "utilities") pod "dcaac3dc-3ea8-484f-b758-9ee23263c7bd" (UID: "dcaac3dc-3ea8-484f-b758-9ee23263c7bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.551139 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-kube-api-access-5vjcn" (OuterVolumeSpecName: "kube-api-access-5vjcn") pod "dcaac3dc-3ea8-484f-b758-9ee23263c7bd" (UID: "dcaac3dc-3ea8-484f-b758-9ee23263c7bd"). InnerVolumeSpecName "kube-api-access-5vjcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.573583 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcaac3dc-3ea8-484f-b758-9ee23263c7bd" (UID: "dcaac3dc-3ea8-484f-b758-9ee23263c7bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.649194 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vjcn\" (UniqueName: \"kubernetes.io/projected/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-kube-api-access-5vjcn\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.649617 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:52 crc kubenswrapper[4939]: I0318 16:35:52.649741 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcaac3dc-3ea8-484f-b758-9ee23263c7bd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:53 crc kubenswrapper[4939]: I0318 16:35:53.106452 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwjjv" event={"ID":"dcaac3dc-3ea8-484f-b758-9ee23263c7bd","Type":"ContainerDied","Data":"ac50f5a3965874cc749915e580f3b09e45bfef7ad48de7756e17b5cc65c95c34"} Mar 18 16:35:53 crc kubenswrapper[4939]: I0318 16:35:53.106787 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwjjv" Mar 18 16:35:53 crc kubenswrapper[4939]: I0318 16:35:53.106809 4939 scope.go:117] "RemoveContainer" containerID="52182c59fd9189bea81fdbe27ceffc4e7711cb2c717c0bded41afb18d09fc481" Mar 18 16:35:53 crc kubenswrapper[4939]: I0318 16:35:53.140807 4939 scope.go:117] "RemoveContainer" containerID="0c56c8893c6c16a2ba2f8852c1af75be619fc9ae478cfdb765c8540cdc25a292" Mar 18 16:35:53 crc kubenswrapper[4939]: I0318 16:35:53.149751 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwjjv"] Mar 18 16:35:53 crc kubenswrapper[4939]: I0318 16:35:53.156332 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwjjv"] Mar 18 16:35:53 crc kubenswrapper[4939]: I0318 16:35:53.170430 4939 scope.go:117] "RemoveContainer" containerID="387ca7abbf7a62b89819748d448ac0b3a440a93aace3b12125c75d65b6d7814b" Mar 18 16:35:54 crc kubenswrapper[4939]: I0318 16:35:54.141988 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" path="/var/lib/kubelet/pods/dcaac3dc-3ea8-484f-b758-9ee23263c7bd/volumes" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.142629 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7nhz2"] Mar 18 16:36:00 crc kubenswrapper[4939]: E0318 16:36:00.143395 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="extract-utilities" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143410 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="extract-utilities" Mar 18 16:36:00 crc kubenswrapper[4939]: E0318 16:36:00.143422 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143467 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4939]: E0318 16:36:00.143481 4939 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="extract-utilities" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143488 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="extract-utilities" Mar 18 16:36:00 crc kubenswrapper[4939]: E0318 16:36:00.143519 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="extract-content" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143527 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="extract-content" Mar 18 16:36:00 crc kubenswrapper[4939]: E0318 16:36:00.143540 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="extract-content" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143547 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="extract-content" Mar 18 16:36:00 crc kubenswrapper[4939]: E0318 16:36:00.143569 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143576 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143735 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffd9ca6-36a2-4265-a1f9-51a3d593981f" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.143756 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcaac3dc-3ea8-484f-b758-9ee23263c7bd" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.144329 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7nhz2" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.146107 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.146271 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.146288 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.151930 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7nhz2"] Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.155054 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64vz\" (UniqueName: \"kubernetes.io/projected/b0efe6d9-89a9-4218-bf87-7b2bcc2599ca-kube-api-access-z64vz\") pod \"auto-csr-approver-29564196-7nhz2\" (UID: \"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca\") " pod="openshift-infra/auto-csr-approver-29564196-7nhz2" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.256445 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z64vz\" (UniqueName: \"kubernetes.io/projected/b0efe6d9-89a9-4218-bf87-7b2bcc2599ca-kube-api-access-z64vz\") pod \"auto-csr-approver-29564196-7nhz2\" (UID: \"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca\") " pod="openshift-infra/auto-csr-approver-29564196-7nhz2" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.277182 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64vz\" (UniqueName: \"kubernetes.io/projected/b0efe6d9-89a9-4218-bf87-7b2bcc2599ca-kube-api-access-z64vz\") pod \"auto-csr-approver-29564196-7nhz2\" (UID: \"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca\") " pod="openshift-infra/auto-csr-approver-29564196-7nhz2" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.464535 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7nhz2" Mar 18 16:36:00 crc kubenswrapper[4939]: I0318 16:36:00.888108 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7nhz2"] Mar 18 16:36:01 crc kubenswrapper[4939]: I0318 16:36:01.174908 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-7nhz2" event={"ID":"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca","Type":"ContainerStarted","Data":"4865df38b8086c3fd7135738e29b444bc634dfc0dba549b51f1f596509ee01ea"} Mar 18 16:36:03 crc kubenswrapper[4939]: I0318 16:36:03.189013 4939 generic.go:334] "Generic (PLEG): container finished" podID="b0efe6d9-89a9-4218-bf87-7b2bcc2599ca" containerID="9f855cc737344d45cc09f41da5e76174d1b82088cdde8a26e8c6f6f2a264e802" exitCode=0 Mar 18 16:36:03 crc kubenswrapper[4939]: I0318 16:36:03.189094 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-7nhz2" event={"ID":"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca","Type":"ContainerDied","Data":"9f855cc737344d45cc09f41da5e76174d1b82088cdde8a26e8c6f6f2a264e802"} Mar 18 16:36:04 crc kubenswrapper[4939]: I0318 16:36:04.446872 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7nhz2" Mar 18 16:36:04 crc kubenswrapper[4939]: I0318 16:36:04.617855 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64vz\" (UniqueName: \"kubernetes.io/projected/b0efe6d9-89a9-4218-bf87-7b2bcc2599ca-kube-api-access-z64vz\") pod \"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca\" (UID: \"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca\") " Mar 18 16:36:04 crc kubenswrapper[4939]: I0318 16:36:04.624030 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0efe6d9-89a9-4218-bf87-7b2bcc2599ca-kube-api-access-z64vz" (OuterVolumeSpecName: "kube-api-access-z64vz") pod "b0efe6d9-89a9-4218-bf87-7b2bcc2599ca" (UID: "b0efe6d9-89a9-4218-bf87-7b2bcc2599ca"). InnerVolumeSpecName "kube-api-access-z64vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:36:04 crc kubenswrapper[4939]: I0318 16:36:04.720040 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z64vz\" (UniqueName: \"kubernetes.io/projected/b0efe6d9-89a9-4218-bf87-7b2bcc2599ca-kube-api-access-z64vz\") on node \"crc\" DevicePath \"\"" Mar 18 16:36:05 crc kubenswrapper[4939]: I0318 16:36:05.204547 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-7nhz2" event={"ID":"b0efe6d9-89a9-4218-bf87-7b2bcc2599ca","Type":"ContainerDied","Data":"4865df38b8086c3fd7135738e29b444bc634dfc0dba549b51f1f596509ee01ea"} Mar 18 16:36:05 crc kubenswrapper[4939]: I0318 16:36:05.204915 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4865df38b8086c3fd7135738e29b444bc634dfc0dba549b51f1f596509ee01ea" Mar 18 16:36:05 crc kubenswrapper[4939]: I0318 16:36:05.204582 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-7nhz2" Mar 18 16:36:05 crc kubenswrapper[4939]: I0318 16:36:05.516800 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-9ssdk"] Mar 18 16:36:05 crc kubenswrapper[4939]: I0318 16:36:05.523127 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-9ssdk"] Mar 18 16:36:06 crc kubenswrapper[4939]: I0318 16:36:06.143675 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c7d227-304b-4e86-92ab-ee9764b97d49" path="/var/lib/kubelet/pods/91c7d227-304b-4e86-92ab-ee9764b97d49/volumes" Mar 18 16:36:23 crc kubenswrapper[4939]: I0318 16:36:23.687397 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:36:23 crc kubenswrapper[4939]: I0318 16:36:23.687959 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:36:53 crc kubenswrapper[4939]: I0318 16:36:53.687958 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:36:53 crc kubenswrapper[4939]: I0318 16:36:53.688463 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:36:58 crc kubenswrapper[4939]: I0318 16:36:58.789772 4939 scope.go:117] "RemoveContainer" containerID="68cbb6e9c16a361ce518454c1c20375b078d397fd470332d620606d6c1f8ef28" Mar 18 16:37:23 crc kubenswrapper[4939]: I0318 16:37:23.687857 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:37:23 crc kubenswrapper[4939]: I0318 16:37:23.688404 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:37:23 crc kubenswrapper[4939]: I0318 16:37:23.688454 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:37:23 crc kubenswrapper[4939]: I0318 16:37:23.689126 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8c36e7a41cd93fb9a6ef202a231fd9246686f8ff40738a83d129f3e5d9da718e"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:37:23 crc kubenswrapper[4939]: I0318 16:37:23.689185 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://8c36e7a41cd93fb9a6ef202a231fd9246686f8ff40738a83d129f3e5d9da718e" gracePeriod=600 Mar 18 16:37:24 crc kubenswrapper[4939]: I0318 16:37:24.776748 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="8c36e7a41cd93fb9a6ef202a231fd9246686f8ff40738a83d129f3e5d9da718e" exitCode=0 Mar 18 16:37:24 crc kubenswrapper[4939]: I0318 16:37:24.776828 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"8c36e7a41cd93fb9a6ef202a231fd9246686f8ff40738a83d129f3e5d9da718e"} Mar 18 16:37:24 crc kubenswrapper[4939]: I0318 16:37:24.777039 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe"} Mar 18 16:37:24 crc kubenswrapper[4939]: I0318 16:37:24.777064 4939 scope.go:117] "RemoveContainer" containerID="0bbf1f0acff1ebaf1f8d4348b0ff53a33f85da776af6032e4007a3f0614da9c7"
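The restart above is the liveness-probe path: three consecutive "connection refused" failures against http://127.0.0.1:8798/health (16:36:23, 16:36:53, 16:37:23) exhaust the probe's failure threshold, the kubelet kills the container (the gracePeriod=600 comes from the pod's termination grace period, not from the probe), and PLEG then reports ContainerDied followed by ContainerStarted for the replacement. A standalone Go sketch of that consecutive-failure accounting follows; it is not kubelet code, the periodSeconds=30 and failureThreshold=3 are inferred from the 30s spacing and three-strikes pattern here, and the 1s timeout is an assumption:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // oneProbe performs a single HTTP liveness check; any transport error
    // (such as the "connection refused" above) or non-2xx/3xx status counts
    // as a failure, mirroring how an HTTP liveness probe is evaluated.
    func oneProbe(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        const failureThreshold = 3 // assumed; matches the three strikes above
        period := 30 * time.Second // matches the 16:36:23/16:36:53/16:37:23 spacing
        failures := 0
        for range time.Tick(period) {
            if err := oneProbe("http://127.0.0.1:8798/health", time.Second); err != nil {
                failures++
                fmt.Printf("probe failed (%d/%d): %v\n", failures, failureThreshold, err)
                if failures >= failureThreshold {
                    fmt.Println("threshold reached: kill container and restart")
                    failures = 0
                }
                continue
            }
            failures = 0 // one success resets the consecutive-failure count
        }
    }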
Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.158120 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564198-q5wmm"] Mar 18 16:38:00 crc kubenswrapper[4939]: E0318 16:38:00.162579 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0efe6d9-89a9-4218-bf87-7b2bcc2599ca" containerName="oc" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.162824 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0efe6d9-89a9-4218-bf87-7b2bcc2599ca" containerName="oc" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.163609 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0efe6d9-89a9-4218-bf87-7b2bcc2599ca" containerName="oc" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.164865 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-q5wmm" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.169021 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.169270 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.169422 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.179088 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-q5wmm"] Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.283024 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnsxf\" (UniqueName: \"kubernetes.io/projected/32d65546-9488-4880-812e-c1352c6748a7-kube-api-access-jnsxf\") pod \"auto-csr-approver-29564198-q5wmm\" (UID: \"32d65546-9488-4880-812e-c1352c6748a7\") " pod="openshift-infra/auto-csr-approver-29564198-q5wmm" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.385324 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnsxf\" (UniqueName: \"kubernetes.io/projected/32d65546-9488-4880-812e-c1352c6748a7-kube-api-access-jnsxf\") pod \"auto-csr-approver-29564198-q5wmm\" (UID: \"32d65546-9488-4880-812e-c1352c6748a7\") " pod="openshift-infra/auto-csr-approver-29564198-q5wmm" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.411831 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnsxf\" (UniqueName: \"kubernetes.io/projected/32d65546-9488-4880-812e-c1352c6748a7-kube-api-access-jnsxf\") pod \"auto-csr-approver-29564198-q5wmm\" (UID: \"32d65546-9488-4880-812e-c1352c6748a7\") " pod="openshift-infra/auto-csr-approver-29564198-q5wmm" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.493441 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-q5wmm" Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.946270 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-q5wmm"] Mar 18 16:38:00 crc kubenswrapper[4939]: I0318 16:38:00.954147 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:38:01 crc kubenswrapper[4939]: I0318 16:38:01.073772 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-q5wmm" event={"ID":"32d65546-9488-4880-812e-c1352c6748a7","Type":"ContainerStarted","Data":"f0cc41621f466ec3fc8f041a8acb8182bdb6a00c9032583cc67f2dc8ee768ce6"} Mar 18 16:38:03 crc kubenswrapper[4939]: I0318 16:38:03.088361 4939 generic.go:334] "Generic (PLEG): container finished" podID="32d65546-9488-4880-812e-c1352c6748a7" containerID="1c4ba5b2df9c829d759d83f77c16f0877b7ffa853eb4bae269e1b0256e65a379" exitCode=0 Mar 18 16:38:03 crc kubenswrapper[4939]: I0318 16:38:03.088409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-q5wmm" event={"ID":"32d65546-9488-4880-812e-c1352c6748a7","Type":"ContainerDied","Data":"1c4ba5b2df9c829d759d83f77c16f0877b7ffa853eb4bae269e1b0256e65a379"} Mar 18 16:38:04 crc kubenswrapper[4939]: I0318 16:38:04.410550 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-q5wmm" Mar 18 16:38:04 crc kubenswrapper[4939]: I0318 16:38:04.539904 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnsxf\" (UniqueName: \"kubernetes.io/projected/32d65546-9488-4880-812e-c1352c6748a7-kube-api-access-jnsxf\") pod \"32d65546-9488-4880-812e-c1352c6748a7\" (UID: \"32d65546-9488-4880-812e-c1352c6748a7\") " Mar 18 16:38:04 crc kubenswrapper[4939]: I0318 16:38:04.545050 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d65546-9488-4880-812e-c1352c6748a7-kube-api-access-jnsxf" (OuterVolumeSpecName: "kube-api-access-jnsxf") pod "32d65546-9488-4880-812e-c1352c6748a7" (UID: "32d65546-9488-4880-812e-c1352c6748a7"). InnerVolumeSpecName "kube-api-access-jnsxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:38:04 crc kubenswrapper[4939]: I0318 16:38:04.641279 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnsxf\" (UniqueName: \"kubernetes.io/projected/32d65546-9488-4880-812e-c1352c6748a7-kube-api-access-jnsxf\") on node \"crc\" DevicePath \"\"" Mar 18 16:38:05 crc kubenswrapper[4939]: I0318 16:38:05.109006 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-q5wmm" event={"ID":"32d65546-9488-4880-812e-c1352c6748a7","Type":"ContainerDied","Data":"f0cc41621f466ec3fc8f041a8acb8182bdb6a00c9032583cc67f2dc8ee768ce6"} Mar 18 16:38:05 crc kubenswrapper[4939]: I0318 16:38:05.109048 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0cc41621f466ec3fc8f041a8acb8182bdb6a00c9032583cc67f2dc8ee768ce6" Mar 18 16:38:05 crc kubenswrapper[4939]: I0318 16:38:05.109119 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-q5wmm" Mar 18 16:38:05 crc kubenswrapper[4939]: I0318 16:38:05.484700 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-jcjjt"] Mar 18 16:38:05 crc kubenswrapper[4939]: I0318 16:38:05.491740 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-jcjjt"] Mar 18 16:38:06 crc kubenswrapper[4939]: I0318 16:38:06.154129 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e38eb7-50d0-408d-a6a5-62342027ab1c" path="/var/lib/kubelet/pods/d6e38eb7-50d0-408d-a6a5-62342027ab1c/volumes" Mar 18 16:38:58 crc kubenswrapper[4939]: I0318 16:38:58.875318 4939 scope.go:117] "RemoveContainer" containerID="d1ae871fba1529f59ed865ace3e7738387efc0558e4810fc8c5f87940097207c" Mar 18 16:39:53 crc kubenswrapper[4939]: I0318 16:39:53.688255 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:39:53 crc kubenswrapper[4939]: I0318 16:39:53.688956 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.172175 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564200-s5vqx"] Mar 18 16:40:00 crc kubenswrapper[4939]: E0318 16:40:00.173553 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d65546-9488-4880-812e-c1352c6748a7" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.173589 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d65546-9488-4880-812e-c1352c6748a7" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.174110 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d65546-9488-4880-812e-c1352c6748a7" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.175071 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.178821 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.179127 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.179706 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.188945 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-s5vqx"] Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.234667 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhkpb\" (UniqueName: \"kubernetes.io/projected/bd72e48f-9a7a-40a3-97e2-da72baa08687-kube-api-access-vhkpb\") pod \"auto-csr-approver-29564200-s5vqx\" (UID: \"bd72e48f-9a7a-40a3-97e2-da72baa08687\") " pod="openshift-infra/auto-csr-approver-29564200-s5vqx" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.336269 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkpb\" (UniqueName: \"kubernetes.io/projected/bd72e48f-9a7a-40a3-97e2-da72baa08687-kube-api-access-vhkpb\") pod \"auto-csr-approver-29564200-s5vqx\" (UID: \"bd72e48f-9a7a-40a3-97e2-da72baa08687\") " pod="openshift-infra/auto-csr-approver-29564200-s5vqx" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.364162 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhkpb\" (UniqueName: \"kubernetes.io/projected/bd72e48f-9a7a-40a3-97e2-da72baa08687-kube-api-access-vhkpb\") pod \"auto-csr-approver-29564200-s5vqx\" (UID: \"bd72e48f-9a7a-40a3-97e2-da72baa08687\") " pod="openshift-infra/auto-csr-approver-29564200-s5vqx" Mar 18 16:40:00 crc kubenswrapper[4939]: I0318 16:40:00.582864 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" Mar 18 16:40:01 crc kubenswrapper[4939]: I0318 16:40:01.056045 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-s5vqx"] Mar 18 16:40:02 crc kubenswrapper[4939]: I0318 16:40:02.016790 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" event={"ID":"bd72e48f-9a7a-40a3-97e2-da72baa08687","Type":"ContainerStarted","Data":"6378e8ce8c846c946f38ab1f57345b990977f572a252bb4d124daea62592034a"} Mar 18 16:40:04 crc kubenswrapper[4939]: I0318 16:40:04.034721 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" event={"ID":"bd72e48f-9a7a-40a3-97e2-da72baa08687","Type":"ContainerStarted","Data":"c05ea758df78162b6562fac2173de925718d1fb6abcb6246a39263a48577c735"} Mar 18 16:40:05 crc kubenswrapper[4939]: I0318 16:40:05.044036 4939 generic.go:334] "Generic (PLEG): container finished" podID="bd72e48f-9a7a-40a3-97e2-da72baa08687" containerID="c05ea758df78162b6562fac2173de925718d1fb6abcb6246a39263a48577c735" exitCode=0 Mar 18 16:40:05 crc kubenswrapper[4939]: I0318 16:40:05.044122 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" event={"ID":"bd72e48f-9a7a-40a3-97e2-da72baa08687","Type":"ContainerDied","Data":"c05ea758df78162b6562fac2173de925718d1fb6abcb6246a39263a48577c735"} Mar 18 16:40:05 crc kubenswrapper[4939]: I0318 16:40:05.335926 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" Mar 18 16:40:05 crc kubenswrapper[4939]: I0318 16:40:05.457231 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhkpb\" (UniqueName: \"kubernetes.io/projected/bd72e48f-9a7a-40a3-97e2-da72baa08687-kube-api-access-vhkpb\") pod \"bd72e48f-9a7a-40a3-97e2-da72baa08687\" (UID: \"bd72e48f-9a7a-40a3-97e2-da72baa08687\") " Mar 18 16:40:05 crc kubenswrapper[4939]: I0318 16:40:05.465768 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd72e48f-9a7a-40a3-97e2-da72baa08687-kube-api-access-vhkpb" (OuterVolumeSpecName: "kube-api-access-vhkpb") pod "bd72e48f-9a7a-40a3-97e2-da72baa08687" (UID: "bd72e48f-9a7a-40a3-97e2-da72baa08687"). InnerVolumeSpecName "kube-api-access-vhkpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:40:05 crc kubenswrapper[4939]: I0318 16:40:05.559068 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhkpb\" (UniqueName: \"kubernetes.io/projected/bd72e48f-9a7a-40a3-97e2-da72baa08687-kube-api-access-vhkpb\") on node \"crc\" DevicePath \"\"" Mar 18 16:40:06 crc kubenswrapper[4939]: I0318 16:40:06.052988 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" event={"ID":"bd72e48f-9a7a-40a3-97e2-da72baa08687","Type":"ContainerDied","Data":"6378e8ce8c846c946f38ab1f57345b990977f572a252bb4d124daea62592034a"} Mar 18 16:40:06 crc kubenswrapper[4939]: I0318 16:40:06.053047 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-s5vqx" Mar 18 16:40:06 crc kubenswrapper[4939]: I0318 16:40:06.053067 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6378e8ce8c846c946f38ab1f57345b990977f572a252bb4d124daea62592034a" Mar 18 16:40:06 crc kubenswrapper[4939]: I0318 16:40:06.409996 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fgjcr"] Mar 18 16:40:06 crc kubenswrapper[4939]: I0318 16:40:06.415942 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fgjcr"] Mar 18 16:40:08 crc kubenswrapper[4939]: I0318 16:40:08.147237 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01267d3e-0916-4ed8-8741-b5bb5213a282" path="/var/lib/kubelet/pods/01267d3e-0916-4ed8-8741-b5bb5213a282/volumes" Mar 18 16:40:09 crc kubenswrapper[4939]: I0318 16:40:09.997728 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wmwtf"] Mar 18 16:40:09 crc kubenswrapper[4939]: E0318 16:40:09.998375 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd72e48f-9a7a-40a3-97e2-da72baa08687" containerName="oc" Mar 18 16:40:09 crc kubenswrapper[4939]: I0318 16:40:09.998392 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd72e48f-9a7a-40a3-97e2-da72baa08687" containerName="oc" Mar 18 16:40:09 crc kubenswrapper[4939]: I0318 16:40:09.998567 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd72e48f-9a7a-40a3-97e2-da72baa08687" containerName="oc" Mar 18 16:40:09 crc kubenswrapper[4939]: I0318 16:40:09.999638 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.008055 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wmwtf"] Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.133423 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-catalog-content\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.133546 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnxd\" (UniqueName: \"kubernetes.io/projected/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-kube-api-access-rvnxd\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.133593 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-utilities\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.234546 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnxd\" (UniqueName: \"kubernetes.io/projected/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-kube-api-access-rvnxd\") pod \"redhat-operators-wmwtf\" (UID: 
\"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.234622 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-utilities\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.234663 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-catalog-content\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.235150 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-utilities\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.235181 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-catalog-content\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.253373 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnxd\" (UniqueName: \"kubernetes.io/projected/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-kube-api-access-rvnxd\") pod \"redhat-operators-wmwtf\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.320103 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:10 crc kubenswrapper[4939]: I0318 16:40:10.744532 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wmwtf"] Mar 18 16:40:11 crc kubenswrapper[4939]: I0318 16:40:11.090493 4939 generic.go:334] "Generic (PLEG): container finished" podID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerID="1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5" exitCode=0 Mar 18 16:40:11 crc kubenswrapper[4939]: I0318 16:40:11.090572 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmwtf" event={"ID":"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0","Type":"ContainerDied","Data":"1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5"} Mar 18 16:40:11 crc kubenswrapper[4939]: I0318 16:40:11.090810 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmwtf" event={"ID":"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0","Type":"ContainerStarted","Data":"d147904b2044024fc96931ed1497a578019a81f7dc4d3d6a70337075d3250a43"} Mar 18 16:40:13 crc kubenswrapper[4939]: I0318 16:40:13.114879 4939 generic.go:334] "Generic (PLEG): container finished" podID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerID="631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760" exitCode=0 Mar 18 16:40:13 crc kubenswrapper[4939]: I0318 16:40:13.115223 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmwtf" event={"ID":"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0","Type":"ContainerDied","Data":"631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760"} Mar 18 16:40:14 crc kubenswrapper[4939]: I0318 16:40:14.143252 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmwtf" event={"ID":"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0","Type":"ContainerStarted","Data":"bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5"} Mar 18 16:40:14 crc kubenswrapper[4939]: I0318 16:40:14.156380 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wmwtf" podStartSLOduration=2.708471329 podStartE2EDuration="5.156362059s" podCreationTimestamp="2026-03-18 16:40:09 +0000 UTC" firstStartedPulling="2026-03-18 16:40:11.092135789 +0000 UTC m=+3775.691323410" lastFinishedPulling="2026-03-18 16:40:13.540026519 +0000 UTC m=+3778.139214140" observedRunningTime="2026-03-18 16:40:14.151528452 +0000 UTC m=+3778.750716073" watchObservedRunningTime="2026-03-18 16:40:14.156362059 +0000 UTC m=+3778.755549680" Mar 18 16:40:20 crc kubenswrapper[4939]: I0318 16:40:20.320988 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:20 crc kubenswrapper[4939]: I0318 16:40:20.321717 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:21 crc kubenswrapper[4939]: I0318 16:40:21.363828 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wmwtf" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="registry-server" probeResult="failure" output=< Mar 18 16:40:21 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 16:40:21 crc kubenswrapper[4939]: >
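The startup-probe failure above, "timeout: failed to connect service \":50051\" within 1s", is what a gRPC health check against the catalog's registry-server port reports before the server is listening; the 16:40:30 entries below show the same probe flipping to started and the pod going ready once extraction finishes. A standalone Go sketch of the connect-with-deadline step (a real health probe would additionally issue a grpc.health.v1 Health/Check RPC, omitted here; the address and 1s budget are taken from the log):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // checkOnce dials the registry-server port with the probe's 1s budget.
    // A refused or timed-out dial is the failure path reported above.
    func checkOnce(addr string, budget time.Duration) error {
        conn, err := net.DialTimeout("tcp", addr, budget)
        if err != nil {
            return fmt.Errorf("timeout: failed to connect service %q within %v: %w", addr, budget, err)
        }
        conn.Close()
        // The TCP dial alone only proves the port is open; a full probe
        // would now require a SERVING response from the health service.
        return nil
    }

    func main() {
        if err := checkOnce(":50051", time.Second); err != nil {
            fmt.Println("startup probe would report:", err)
            return
        }
        fmt.Println("startup probe would report success")
    }

OLM catalog pods typically implement this as an exec probe running grpc_health_probe -addr=:50051, which matches the output format here; that attribution is an inference, not something this log states.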
Mar 18 16:40:23 crc kubenswrapper[4939]: I0318 16:40:23.688059 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:40:23 crc kubenswrapper[4939]: I0318 16:40:23.688439 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:40:30 crc kubenswrapper[4939]: I0318 16:40:30.383661 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:30 crc kubenswrapper[4939]: I0318 16:40:30.427137 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:30 crc kubenswrapper[4939]: I0318 16:40:30.619744 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wmwtf"] Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.254523 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wmwtf" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="registry-server" containerID="cri-o://bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5" gracePeriod=2 Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.641842 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.750075 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnxd\" (UniqueName: \"kubernetes.io/projected/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-kube-api-access-rvnxd\") pod \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.750177 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-catalog-content\") pod \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.750264 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-utilities\") pod \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\" (UID: \"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0\") " Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.751283 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-utilities" (OuterVolumeSpecName: "utilities") pod "424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" (UID: "424afc68-f0dd-4cd3-b9ef-1531d9ab60f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.755892 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-kube-api-access-rvnxd" (OuterVolumeSpecName: "kube-api-access-rvnxd") pod "424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" (UID: "424afc68-f0dd-4cd3-b9ef-1531d9ab60f0"). InnerVolumeSpecName "kube-api-access-rvnxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.851903 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.851944 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnxd\" (UniqueName: \"kubernetes.io/projected/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-kube-api-access-rvnxd\") on node \"crc\" DevicePath \"\"" Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.879992 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" (UID: "424afc68-f0dd-4cd3-b9ef-1531d9ab60f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:40:32 crc kubenswrapper[4939]: I0318 16:40:32.953468 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.262373 4939 generic.go:334] "Generic (PLEG): container finished" podID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerID="bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5" exitCode=0 Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.262650 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmwtf" event={"ID":"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0","Type":"ContainerDied","Data":"bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5"} Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.262675 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmwtf" event={"ID":"424afc68-f0dd-4cd3-b9ef-1531d9ab60f0","Type":"ContainerDied","Data":"d147904b2044024fc96931ed1497a578019a81f7dc4d3d6a70337075d3250a43"} Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.262691 4939 scope.go:117] "RemoveContainer" containerID="bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.262788 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wmwtf" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.292397 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wmwtf"] Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.293933 4939 scope.go:117] "RemoveContainer" containerID="631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.297029 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wmwtf"] Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.318810 4939 scope.go:117] "RemoveContainer" containerID="1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.337307 4939 scope.go:117] "RemoveContainer" containerID="bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5" Mar 18 16:40:33 crc kubenswrapper[4939]: E0318 16:40:33.338134 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5\": container with ID starting with bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5 not found: ID does not exist" containerID="bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.338178 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5"} err="failed to get container status \"bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5\": rpc error: code = NotFound desc = could not find container \"bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5\": container with ID starting with bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5 not found: ID does not exist" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.338203 4939 scope.go:117] "RemoveContainer" containerID="631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760" Mar 18 16:40:33 crc kubenswrapper[4939]: E0318 16:40:33.338497 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760\": container with ID starting with 631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760 not found: ID does not exist" containerID="631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.338538 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760"} err="failed to get container status \"631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760\": rpc error: code = NotFound desc = could not find container \"631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760\": container with ID starting with 631d7a962106c6d8656325763f0379d355629c2a8bc6fd29208cefdc9eef8760 not found: ID does not exist" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.338559 4939 scope.go:117] "RemoveContainer" containerID="1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5" Mar 18 16:40:33 crc kubenswrapper[4939]: E0318 16:40:33.338972 4939 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5\": container with ID starting with 1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5 not found: ID does not exist" containerID="1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5" Mar 18 16:40:33 crc kubenswrapper[4939]: I0318 16:40:33.339008 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5"} err="failed to get container status \"1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5\": rpc error: code = NotFound desc = could not find container \"1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5\": container with ID starting with 1a4f838536a7bb6ba63b3790d5ee961f8393c15bb7313600b76f0de5d613cdc5 not found: ID does not exist"
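The NotFound errors just logged are the benign tail of cleanup: RemoveContainer asks the runtime for each container's status after CRI-O has already deleted it, and the deletor records the miss and carries on rather than failing. A standalone Go sketch of that idempotent-delete pattern, modeled on (not taken from) the kubelet:

    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for the runtime's NotFound gRPC status.
    var errNotFound = errors.New("container not found: ID does not exist")

    func containerStatus(id string) error {
        // In this sketch every container has already been removed.
        return fmt.Errorf("could not find container %q: %w", id, errNotFound)
    }

    // removeContainer is idempotent: a missing container is treated as
    // already removed rather than as a cleanup failure.
    func removeContainer(id string) error {
        if err := containerStatus(id); err != nil {
            if errors.Is(err, errNotFound) {
                fmt.Printf("container %s already gone; nothing to delete\n", id)
                return nil
            }
            return err
        }
        fmt.Printf("deleting container %s\n", id)
        return nil
    }

    func main() {
        _ = removeContainer("bcf198739623b6f3af5076b3cba77b394aef679710efdddc6bf4fea75754cab5")
    }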
Mar 18 16:40:34 crc kubenswrapper[4939]: I0318 16:40:34.147030 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" path="/var/lib/kubelet/pods/424afc68-f0dd-4cd3-b9ef-1531d9ab60f0/volumes" Mar 18 16:40:53 crc kubenswrapper[4939]: I0318 16:40:53.687446 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:40:53 crc kubenswrapper[4939]: I0318 16:40:53.688002 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:40:53 crc kubenswrapper[4939]: I0318 16:40:53.688060 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:40:53 crc kubenswrapper[4939]: I0318 16:40:53.688814 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:40:53 crc kubenswrapper[4939]: I0318 16:40:53.688867 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" gracePeriod=600 Mar 18 16:40:53 crc kubenswrapper[4939]: E0318 16:40:53.816430 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:40:54 crc kubenswrapper[4939]: I0318 16:40:54.406294 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" exitCode=0 Mar 18 16:40:54 crc kubenswrapper[4939]: I0318 16:40:54.406371 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe"} Mar 18 16:40:54 crc kubenswrapper[4939]: I0318 16:40:54.406667 4939 scope.go:117] "RemoveContainer" containerID="8c36e7a41cd93fb9a6ef202a231fd9246686f8ff40738a83d129f3e5d9da718e" Mar 18 16:40:54 crc kubenswrapper[4939]: I0318 16:40:54.407232 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:40:54 crc kubenswrapper[4939]: E0318 16:40:54.407492 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:40:58 crc kubenswrapper[4939]: I0318 16:40:58.982083 4939 scope.go:117] "RemoveContainer" containerID="2bbe88202b86905d14190da2ebbc097d458f94b020dfa498d5e23e7362f82ddb" Mar 18 16:41:08 crc kubenswrapper[4939]: I0318 16:41:08.133028 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:41:08 crc kubenswrapper[4939]: E0318 16:41:08.133800 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:41:19 crc kubenswrapper[4939]: I0318 16:41:19.134006 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:41:19 crc kubenswrapper[4939]: E0318 16:41:19.135086 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:41:34 crc kubenswrapper[4939]: I0318 16:41:34.133288 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:41:34 crc kubenswrapper[4939]: E0318 16:41:34.133996 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
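From 16:40:53 onward the machine-config-daemon container sits in CrashLoopBackOff: every sync that wants to restart it is refused until the backoff window expires, so the "back-off 5m0s restarting failed container" errors recur above and below without an actual restart attempt. The kubelet's crash backoff starts at 10s, doubles after each failed restart, and caps at 5m (the 5m0s in the message is that cap reached); roughly ten minutes of successful running resets it. A standalone Go sketch of the schedule:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Kubelet's crash-loop backoff: start at 10s, double per failed
        // restart, cap at 5m, which is the "back-off 5m0s" seen here.
        const (
            initial = 10 * time.Second
            max     = 5 * time.Minute
        )
        delay := initial
        for attempt := 1; ; attempt++ {
            fmt.Printf("restart attempt %d: wait %v\n", attempt, delay)
            if delay == max {
                break // prints 10s, 20s, 40s, 1m20s, 2m40s, 5m0s
            }
            delay *= 2
            if delay > max {
                delay = max
            }
        }
    }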
Mar 18 16:41:46 crc kubenswrapper[4939]: I0318 16:41:46.142439 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:41:46 crc kubenswrapper[4939]: E0318 16:41:46.143334 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:41:59 crc kubenswrapper[4939]: I0318 16:41:59.133948 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:41:59 crc kubenswrapper[4939]: E0318 16:41:59.134913 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.151247 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564202-k4xpv"] Mar 18 16:42:00 crc kubenswrapper[4939]: E0318 16:42:00.151627 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="registry-server" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.151649 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="registry-server" Mar 18 16:42:00 crc kubenswrapper[4939]: E0318 16:42:00.151669 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="extract-content" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.151680 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="extract-content" Mar 18 16:42:00 crc kubenswrapper[4939]: E0318 16:42:00.151717 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="extract-utilities" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.151729 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="extract-utilities" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.151954 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="424afc68-f0dd-4cd3-b9ef-1531d9ab60f0" containerName="registry-server" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.152745 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-k4xpv" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.154545 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.154798 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.155875 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.165547 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-k4xpv"] Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.220986 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98hkm\" (UniqueName: \"kubernetes.io/projected/1f982d93-4356-408c-8d35-644ae609832c-kube-api-access-98hkm\") pod \"auto-csr-approver-29564202-k4xpv\" (UID: \"1f982d93-4356-408c-8d35-644ae609832c\") " pod="openshift-infra/auto-csr-approver-29564202-k4xpv" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.322328 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98hkm\" (UniqueName: \"kubernetes.io/projected/1f982d93-4356-408c-8d35-644ae609832c-kube-api-access-98hkm\") pod \"auto-csr-approver-29564202-k4xpv\" (UID: \"1f982d93-4356-408c-8d35-644ae609832c\") " pod="openshift-infra/auto-csr-approver-29564202-k4xpv" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.343948 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98hkm\" (UniqueName: \"kubernetes.io/projected/1f982d93-4356-408c-8d35-644ae609832c-kube-api-access-98hkm\") pod \"auto-csr-approver-29564202-k4xpv\" (UID: \"1f982d93-4356-408c-8d35-644ae609832c\") " pod="openshift-infra/auto-csr-approver-29564202-k4xpv" Mar 18 16:42:00 crc kubenswrapper[4939]: I0318 16:42:00.505479 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-k4xpv" Mar 18 16:42:01 crc kubenswrapper[4939]: I0318 16:42:00.999807 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-k4xpv"] Mar 18 16:42:01 crc kubenswrapper[4939]: I0318 16:42:01.919877 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-k4xpv" event={"ID":"1f982d93-4356-408c-8d35-644ae609832c","Type":"ContainerStarted","Data":"a025bb48dfd9b89fdd7bb8c2c6d380a024cb884280962b68025e8ce5282b0887"} Mar 18 16:42:02 crc kubenswrapper[4939]: I0318 16:42:02.928681 4939 generic.go:334] "Generic (PLEG): container finished" podID="1f982d93-4356-408c-8d35-644ae609832c" containerID="70b6dd0dc0dba7007ce4033782cc647f1f33d7d05d846e2d00c3274166370f38" exitCode=0 Mar 18 16:42:02 crc kubenswrapper[4939]: I0318 16:42:02.928776 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-k4xpv" event={"ID":"1f982d93-4356-408c-8d35-644ae609832c","Type":"ContainerDied","Data":"70b6dd0dc0dba7007ce4033782cc647f1f33d7d05d846e2d00c3274166370f38"} Mar 18 16:42:04 crc kubenswrapper[4939]: I0318 16:42:04.239109 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-k4xpv" Mar 18 16:42:04 crc kubenswrapper[4939]: I0318 16:42:04.281872 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98hkm\" (UniqueName: \"kubernetes.io/projected/1f982d93-4356-408c-8d35-644ae609832c-kube-api-access-98hkm\") pod \"1f982d93-4356-408c-8d35-644ae609832c\" (UID: \"1f982d93-4356-408c-8d35-644ae609832c\") " Mar 18 16:42:04 crc kubenswrapper[4939]: I0318 16:42:04.293829 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f982d93-4356-408c-8d35-644ae609832c-kube-api-access-98hkm" (OuterVolumeSpecName: "kube-api-access-98hkm") pod "1f982d93-4356-408c-8d35-644ae609832c" (UID: "1f982d93-4356-408c-8d35-644ae609832c"). InnerVolumeSpecName "kube-api-access-98hkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:42:04 crc kubenswrapper[4939]: I0318 16:42:04.383849 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98hkm\" (UniqueName: \"kubernetes.io/projected/1f982d93-4356-408c-8d35-644ae609832c-kube-api-access-98hkm\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:04 crc kubenswrapper[4939]: I0318 16:42:04.948181 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-k4xpv" event={"ID":"1f982d93-4356-408c-8d35-644ae609832c","Type":"ContainerDied","Data":"a025bb48dfd9b89fdd7bb8c2c6d380a024cb884280962b68025e8ce5282b0887"} Mar 18 16:42:04 crc kubenswrapper[4939]: I0318 16:42:04.948231 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a025bb48dfd9b89fdd7bb8c2c6d380a024cb884280962b68025e8ce5282b0887" Mar 18 16:42:04 crc kubenswrapper[4939]: I0318 16:42:04.948265 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-k4xpv" Mar 18 16:42:05 crc kubenswrapper[4939]: I0318 16:42:05.315667 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7nhz2"] Mar 18 16:42:05 crc kubenswrapper[4939]: I0318 16:42:05.323373 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-7nhz2"] Mar 18 16:42:06 crc kubenswrapper[4939]: I0318 16:42:06.145367 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0efe6d9-89a9-4218-bf87-7b2bcc2599ca" path="/var/lib/kubelet/pods/b0efe6d9-89a9-4218-bf87-7b2bcc2599ca/volumes" Mar 18 16:42:13 crc kubenswrapper[4939]: I0318 16:42:13.132562 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:42:13 crc kubenswrapper[4939]: E0318 16:42:13.133221 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:42:28 crc kubenswrapper[4939]: I0318 16:42:28.133596 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:42:28 crc kubenswrapper[4939]: E0318 16:42:28.136348 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:42:40 crc kubenswrapper[4939]: I0318 16:42:40.133586 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:42:40 crc kubenswrapper[4939]: E0318 16:42:40.134918 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:42:51 crc kubenswrapper[4939]: I0318 16:42:51.134064 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:42:51 crc kubenswrapper[4939]: E0318 16:42:51.135885 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:42:59 crc kubenswrapper[4939]: I0318 16:42:59.097551 4939 scope.go:117] "RemoveContainer" containerID="9f855cc737344d45cc09f41da5e76174d1b82088cdde8a26e8c6f6f2a264e802" Mar 18 
16:43:06 crc kubenswrapper[4939]: I0318 16:43:06.136681 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:43:06 crc kubenswrapper[4939]: E0318 16:43:06.138213 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:43:20 crc kubenswrapper[4939]: I0318 16:43:20.133010 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:43:20 crc kubenswrapper[4939]: E0318 16:43:20.133608 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:43:33 crc kubenswrapper[4939]: I0318 16:43:33.133657 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:43:33 crc kubenswrapper[4939]: E0318 16:43:33.134569 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:43:44 crc kubenswrapper[4939]: I0318 16:43:44.133265 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:43:44 crc kubenswrapper[4939]: E0318 16:43:44.133876 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:43:59 crc kubenswrapper[4939]: I0318 16:43:59.133394 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:43:59 crc kubenswrapper[4939]: E0318 16:43:59.134150 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.171584 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564204-jnrpf"] Mar 18 16:44:00 crc 
kubenswrapper[4939]: E0318 16:44:00.172800 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f982d93-4356-408c-8d35-644ae609832c" containerName="oc" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.172831 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f982d93-4356-408c-8d35-644ae609832c" containerName="oc" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.173152 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f982d93-4356-408c-8d35-644ae609832c" containerName="oc" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.173741 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.179583 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-jnrpf"] Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.210361 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.210388 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.210424 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.263665 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfv4\" (UniqueName: \"kubernetes.io/projected/ba4fe94a-9397-4a66-86ae-402dbb5bc5bf-kube-api-access-9dfv4\") pod \"auto-csr-approver-29564204-jnrpf\" (UID: \"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf\") " pod="openshift-infra/auto-csr-approver-29564204-jnrpf" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.364831 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfv4\" (UniqueName: \"kubernetes.io/projected/ba4fe94a-9397-4a66-86ae-402dbb5bc5bf-kube-api-access-9dfv4\") pod \"auto-csr-approver-29564204-jnrpf\" (UID: \"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf\") " pod="openshift-infra/auto-csr-approver-29564204-jnrpf" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.384956 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfv4\" (UniqueName: \"kubernetes.io/projected/ba4fe94a-9397-4a66-86ae-402dbb5bc5bf-kube-api-access-9dfv4\") pod \"auto-csr-approver-29564204-jnrpf\" (UID: \"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf\") " pod="openshift-infra/auto-csr-approver-29564204-jnrpf" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.524432 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.955827 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-jnrpf"] Mar 18 16:44:00 crc kubenswrapper[4939]: I0318 16:44:00.966932 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:44:01 crc kubenswrapper[4939]: I0318 16:44:01.140073 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" event={"ID":"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf","Type":"ContainerStarted","Data":"34432d7db7456c90a0bbc9099e7defb74341757038f11995a89f399226b45e0e"} Mar 18 16:44:02 crc kubenswrapper[4939]: I0318 16:44:02.154362 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" event={"ID":"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf","Type":"ContainerStarted","Data":"b61adc5cdaf45d57a6677a675b78b1e3b5555ae80ef398f56c02a22cdb568fc0"} Mar 18 16:44:02 crc kubenswrapper[4939]: I0318 16:44:02.176709 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" podStartSLOduration=1.360827234 podStartE2EDuration="2.176689259s" podCreationTimestamp="2026-03-18 16:44:00 +0000 UTC" firstStartedPulling="2026-03-18 16:44:00.966667617 +0000 UTC m=+4005.565855238" lastFinishedPulling="2026-03-18 16:44:01.782529642 +0000 UTC m=+4006.381717263" observedRunningTime="2026-03-18 16:44:02.171302846 +0000 UTC m=+4006.770490477" watchObservedRunningTime="2026-03-18 16:44:02.176689259 +0000 UTC m=+4006.775876890" Mar 18 16:44:03 crc kubenswrapper[4939]: I0318 16:44:03.163043 4939 generic.go:334] "Generic (PLEG): container finished" podID="ba4fe94a-9397-4a66-86ae-402dbb5bc5bf" containerID="b61adc5cdaf45d57a6677a675b78b1e3b5555ae80ef398f56c02a22cdb568fc0" exitCode=0 Mar 18 16:44:03 crc kubenswrapper[4939]: I0318 16:44:03.163396 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" event={"ID":"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf","Type":"ContainerDied","Data":"b61adc5cdaf45d57a6677a675b78b1e3b5555ae80ef398f56c02a22cdb568fc0"} Mar 18 16:44:04 crc kubenswrapper[4939]: I0318 16:44:04.523328 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" Mar 18 16:44:04 crc kubenswrapper[4939]: I0318 16:44:04.626917 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfv4\" (UniqueName: \"kubernetes.io/projected/ba4fe94a-9397-4a66-86ae-402dbb5bc5bf-kube-api-access-9dfv4\") pod \"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf\" (UID: \"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf\") " Mar 18 16:44:04 crc kubenswrapper[4939]: I0318 16:44:04.635698 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4fe94a-9397-4a66-86ae-402dbb5bc5bf-kube-api-access-9dfv4" (OuterVolumeSpecName: "kube-api-access-9dfv4") pod "ba4fe94a-9397-4a66-86ae-402dbb5bc5bf" (UID: "ba4fe94a-9397-4a66-86ae-402dbb5bc5bf"). InnerVolumeSpecName "kube-api-access-9dfv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:44:04 crc kubenswrapper[4939]: I0318 16:44:04.728906 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfv4\" (UniqueName: \"kubernetes.io/projected/ba4fe94a-9397-4a66-86ae-402dbb5bc5bf-kube-api-access-9dfv4\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:05 crc kubenswrapper[4939]: I0318 16:44:05.181193 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" event={"ID":"ba4fe94a-9397-4a66-86ae-402dbb5bc5bf","Type":"ContainerDied","Data":"34432d7db7456c90a0bbc9099e7defb74341757038f11995a89f399226b45e0e"} Mar 18 16:44:05 crc kubenswrapper[4939]: I0318 16:44:05.181232 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34432d7db7456c90a0bbc9099e7defb74341757038f11995a89f399226b45e0e" Mar 18 16:44:05 crc kubenswrapper[4939]: I0318 16:44:05.181234 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-jnrpf" Mar 18 16:44:05 crc kubenswrapper[4939]: I0318 16:44:05.250369 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-q5wmm"] Mar 18 16:44:05 crc kubenswrapper[4939]: I0318 16:44:05.256240 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-q5wmm"] Mar 18 16:44:06 crc kubenswrapper[4939]: I0318 16:44:06.148762 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d65546-9488-4880-812e-c1352c6748a7" path="/var/lib/kubelet/pods/32d65546-9488-4880-812e-c1352c6748a7/volumes" Mar 18 16:44:12 crc kubenswrapper[4939]: I0318 16:44:12.134112 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:44:12 crc kubenswrapper[4939]: E0318 16:44:12.135194 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:44:24 crc kubenswrapper[4939]: I0318 16:44:24.134080 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:44:24 crc kubenswrapper[4939]: E0318 16:44:24.135167 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:44:38 crc kubenswrapper[4939]: I0318 16:44:38.133220 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:44:38 crc kubenswrapper[4939]: E0318 16:44:38.134272 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:44:53 crc kubenswrapper[4939]: I0318 16:44:53.134122 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:44:53 crc kubenswrapper[4939]: E0318 16:44:53.135469 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:44:59 crc kubenswrapper[4939]: I0318 16:44:59.203911 4939 scope.go:117] "RemoveContainer" containerID="1c4ba5b2df9c829d759d83f77c16f0877b7ffa853eb4bae269e1b0256e65a379" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.171711 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6"] Mar 18 16:45:00 crc kubenswrapper[4939]: E0318 16:45:00.172351 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4fe94a-9397-4a66-86ae-402dbb5bc5bf" containerName="oc" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.172372 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4fe94a-9397-4a66-86ae-402dbb5bc5bf" containerName="oc" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.172621 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4fe94a-9397-4a66-86ae-402dbb5bc5bf" containerName="oc" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.174323 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.176973 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.178297 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.185022 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6"] Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.212365 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f14f20-580e-4ceb-8973-48517454057a-config-volume\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.212473 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f14f20-580e-4ceb-8973-48517454057a-secret-volume\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.212518 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhlxn\" (UniqueName: \"kubernetes.io/projected/e9f14f20-580e-4ceb-8973-48517454057a-kube-api-access-fhlxn\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.313737 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f14f20-580e-4ceb-8973-48517454057a-config-volume\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.314051 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f14f20-580e-4ceb-8973-48517454057a-secret-volume\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.314149 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhlxn\" (UniqueName: \"kubernetes.io/projected/e9f14f20-580e-4ceb-8973-48517454057a-kube-api-access-fhlxn\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.315309 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f14f20-580e-4ceb-8973-48517454057a-config-volume\") pod 
\"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.320804 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f14f20-580e-4ceb-8973-48517454057a-secret-volume\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.337463 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhlxn\" (UniqueName: \"kubernetes.io/projected/e9f14f20-580e-4ceb-8973-48517454057a-kube-api-access-fhlxn\") pod \"collect-profiles-29564205-dcsh6\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:00 crc kubenswrapper[4939]: I0318 16:45:00.490076 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:01 crc kubenswrapper[4939]: I0318 16:45:01.016864 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6"] Mar 18 16:45:01 crc kubenswrapper[4939]: I0318 16:45:01.630228 4939 generic.go:334] "Generic (PLEG): container finished" podID="e9f14f20-580e-4ceb-8973-48517454057a" containerID="d2b7ad8427f9414749dbbb15e7e745438a56bc58b4827539a0afdaab07db36ba" exitCode=0 Mar 18 16:45:01 crc kubenswrapper[4939]: I0318 16:45:01.630442 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" event={"ID":"e9f14f20-580e-4ceb-8973-48517454057a","Type":"ContainerDied","Data":"d2b7ad8427f9414749dbbb15e7e745438a56bc58b4827539a0afdaab07db36ba"} Mar 18 16:45:01 crc kubenswrapper[4939]: I0318 16:45:01.630468 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" event={"ID":"e9f14f20-580e-4ceb-8973-48517454057a","Type":"ContainerStarted","Data":"4abb37a08019c88f9bb05e8bfaf66a1c516e6e5c1c38f5bc213daa79d9f2d7a1"} Mar 18 16:45:02 crc kubenswrapper[4939]: I0318 16:45:02.944766 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.046914 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f14f20-580e-4ceb-8973-48517454057a-config-volume\") pod \"e9f14f20-580e-4ceb-8973-48517454057a\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.047047 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f14f20-580e-4ceb-8973-48517454057a-secret-volume\") pod \"e9f14f20-580e-4ceb-8973-48517454057a\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.047091 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhlxn\" (UniqueName: \"kubernetes.io/projected/e9f14f20-580e-4ceb-8973-48517454057a-kube-api-access-fhlxn\") pod \"e9f14f20-580e-4ceb-8973-48517454057a\" (UID: \"e9f14f20-580e-4ceb-8973-48517454057a\") " Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.047721 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9f14f20-580e-4ceb-8973-48517454057a-config-volume" (OuterVolumeSpecName: "config-volume") pod "e9f14f20-580e-4ceb-8973-48517454057a" (UID: "e9f14f20-580e-4ceb-8973-48517454057a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.053616 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f14f20-580e-4ceb-8973-48517454057a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e9f14f20-580e-4ceb-8973-48517454057a" (UID: "e9f14f20-580e-4ceb-8973-48517454057a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.053671 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f14f20-580e-4ceb-8973-48517454057a-kube-api-access-fhlxn" (OuterVolumeSpecName: "kube-api-access-fhlxn") pod "e9f14f20-580e-4ceb-8973-48517454057a" (UID: "e9f14f20-580e-4ceb-8973-48517454057a"). InnerVolumeSpecName "kube-api-access-fhlxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.148717 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f14f20-580e-4ceb-8973-48517454057a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.148765 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f14f20-580e-4ceb-8973-48517454057a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.148778 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhlxn\" (UniqueName: \"kubernetes.io/projected/e9f14f20-580e-4ceb-8973-48517454057a-kube-api-access-fhlxn\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.644432 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" event={"ID":"e9f14f20-580e-4ceb-8973-48517454057a","Type":"ContainerDied","Data":"4abb37a08019c88f9bb05e8bfaf66a1c516e6e5c1c38f5bc213daa79d9f2d7a1"} Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.644468 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4abb37a08019c88f9bb05e8bfaf66a1c516e6e5c1c38f5bc213daa79d9f2d7a1" Mar 18 16:45:03 crc kubenswrapper[4939]: I0318 16:45:03.644551 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6" Mar 18 16:45:04 crc kubenswrapper[4939]: I0318 16:45:04.022579 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7"] Mar 18 16:45:04 crc kubenswrapper[4939]: I0318 16:45:04.028306 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-mgvq7"] Mar 18 16:45:04 crc kubenswrapper[4939]: I0318 16:45:04.142398 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa194de0-0ca0-4455-8b05-bc0c4f4bb012" path="/var/lib/kubelet/pods/fa194de0-0ca0-4455-8b05-bc0c4f4bb012/volumes" Mar 18 16:45:05 crc kubenswrapper[4939]: I0318 16:45:05.132760 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:45:05 crc kubenswrapper[4939]: E0318 16:45:05.133203 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:45:17 crc kubenswrapper[4939]: I0318 16:45:17.133394 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:45:17 crc kubenswrapper[4939]: E0318 16:45:17.134364 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.359688 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mjrnw"] Mar 18 16:45:18 crc kubenswrapper[4939]: E0318 16:45:18.360126 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f14f20-580e-4ceb-8973-48517454057a" containerName="collect-profiles" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.360148 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f14f20-580e-4ceb-8973-48517454057a" containerName="collect-profiles" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.360390 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f14f20-580e-4ceb-8973-48517454057a" containerName="collect-profiles" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.361831 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.366152 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mjrnw"] Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.461150 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9trrq\" (UniqueName: \"kubernetes.io/projected/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-kube-api-access-9trrq\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.461224 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-utilities\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.461295 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-catalog-content\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.562094 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9trrq\" (UniqueName: \"kubernetes.io/projected/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-kube-api-access-9trrq\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.562159 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-utilities\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.562219 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-catalog-content\") 
pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.562726 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-catalog-content\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.563020 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-utilities\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.590303 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9trrq\" (UniqueName: \"kubernetes.io/projected/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-kube-api-access-9trrq\") pod \"certified-operators-mjrnw\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.678043 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:18 crc kubenswrapper[4939]: I0318 16:45:18.901053 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mjrnw"] Mar 18 16:45:19 crc kubenswrapper[4939]: I0318 16:45:19.755397 4939 generic.go:334] "Generic (PLEG): container finished" podID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerID="946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37" exitCode=0 Mar 18 16:45:19 crc kubenswrapper[4939]: I0318 16:45:19.755539 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrnw" event={"ID":"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc","Type":"ContainerDied","Data":"946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37"} Mar 18 16:45:19 crc kubenswrapper[4939]: I0318 16:45:19.755802 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrnw" event={"ID":"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc","Type":"ContainerStarted","Data":"b7cb8dd77f6e1f34868f36412372a5abff4d59c0cdd3d938798612e422b50e50"} Mar 18 16:45:20 crc kubenswrapper[4939]: I0318 16:45:20.763934 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrnw" event={"ID":"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc","Type":"ContainerStarted","Data":"2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19"} Mar 18 16:45:21 crc kubenswrapper[4939]: I0318 16:45:21.786894 4939 generic.go:334] "Generic (PLEG): container finished" podID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerID="2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19" exitCode=0 Mar 18 16:45:21 crc kubenswrapper[4939]: I0318 16:45:21.786957 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrnw" event={"ID":"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc","Type":"ContainerDied","Data":"2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19"} Mar 18 16:45:22 crc kubenswrapper[4939]: I0318 16:45:22.794655 4939 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrnw" event={"ID":"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc","Type":"ContainerStarted","Data":"480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a"} Mar 18 16:45:22 crc kubenswrapper[4939]: I0318 16:45:22.813625 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mjrnw" podStartSLOduration=2.244529485 podStartE2EDuration="4.813607197s" podCreationTimestamp="2026-03-18 16:45:18 +0000 UTC" firstStartedPulling="2026-03-18 16:45:19.75703573 +0000 UTC m=+4084.356223371" lastFinishedPulling="2026-03-18 16:45:22.326113462 +0000 UTC m=+4086.925301083" observedRunningTime="2026-03-18 16:45:22.810592132 +0000 UTC m=+4087.409779773" watchObservedRunningTime="2026-03-18 16:45:22.813607197 +0000 UTC m=+4087.412794818" Mar 18 16:45:28 crc kubenswrapper[4939]: I0318 16:45:28.133341 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:45:28 crc kubenswrapper[4939]: E0318 16:45:28.134430 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:45:28 crc kubenswrapper[4939]: I0318 16:45:28.679207 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:28 crc kubenswrapper[4939]: I0318 16:45:28.679289 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:28 crc kubenswrapper[4939]: I0318 16:45:28.720279 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:28 crc kubenswrapper[4939]: I0318 16:45:28.889405 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:28 crc kubenswrapper[4939]: I0318 16:45:28.951572 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mjrnw"] Mar 18 16:45:30 crc kubenswrapper[4939]: I0318 16:45:30.856775 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mjrnw" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="registry-server" containerID="cri-o://480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a" gracePeriod=2 Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.364189 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.466481 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-catalog-content\") pod \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.466648 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-utilities\") pod \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.466746 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9trrq\" (UniqueName: \"kubernetes.io/projected/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-kube-api-access-9trrq\") pod \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\" (UID: \"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc\") " Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.468028 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-utilities" (OuterVolumeSpecName: "utilities") pod "15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" (UID: "15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.483726 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-kube-api-access-9trrq" (OuterVolumeSpecName: "kube-api-access-9trrq") pod "15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" (UID: "15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc"). InnerVolumeSpecName "kube-api-access-9trrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.519769 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" (UID: "15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.568952 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.568994 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9trrq\" (UniqueName: \"kubernetes.io/projected/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-kube-api-access-9trrq\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.569009 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.866056 4939 generic.go:334] "Generic (PLEG): container finished" podID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerID="480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a" exitCode=0 Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.866106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrnw" event={"ID":"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc","Type":"ContainerDied","Data":"480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a"} Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.866152 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrnw" event={"ID":"15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc","Type":"ContainerDied","Data":"b7cb8dd77f6e1f34868f36412372a5abff4d59c0cdd3d938798612e422b50e50"} Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.866174 4939 scope.go:117] "RemoveContainer" containerID="480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.866219 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mjrnw" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.900790 4939 scope.go:117] "RemoveContainer" containerID="2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.911707 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mjrnw"] Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.924248 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mjrnw"] Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.946049 4939 scope.go:117] "RemoveContainer" containerID="946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.966672 4939 scope.go:117] "RemoveContainer" containerID="480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a" Mar 18 16:45:31 crc kubenswrapper[4939]: E0318 16:45:31.967051 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a\": container with ID starting with 480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a not found: ID does not exist" containerID="480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.967078 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a"} err="failed to get container status \"480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a\": rpc error: code = NotFound desc = could not find container \"480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a\": container with ID starting with 480e4f9d2ea8636426abff240da5cd2393b855edab0b9f4697ff6d622fcec33a not found: ID does not exist" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.967100 4939 scope.go:117] "RemoveContainer" containerID="2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19" Mar 18 16:45:31 crc kubenswrapper[4939]: E0318 16:45:31.967375 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19\": container with ID starting with 2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19 not found: ID does not exist" containerID="2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.967420 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19"} err="failed to get container status \"2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19\": rpc error: code = NotFound desc = could not find container \"2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19\": container with ID starting with 2aae62d914faafb3803c1983791826e4910d5b9a6d0dc5b147e9e4df3a55cc19 not found: ID does not exist" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.967453 4939 scope.go:117] "RemoveContainer" containerID="946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37" Mar 18 16:45:31 crc kubenswrapper[4939]: E0318 16:45:31.967923 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37\": container with ID starting with 946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37 not found: ID does not exist" containerID="946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37" Mar 18 16:45:31 crc kubenswrapper[4939]: I0318 16:45:31.967967 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37"} err="failed to get container status \"946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37\": rpc error: code = NotFound desc = could not find container \"946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37\": container with ID starting with 946f66a7153682da39c71c02fdc712290eb5127ddb6c39dd1233f4e2c6640c37 not found: ID does not exist" Mar 18 16:45:32 crc kubenswrapper[4939]: I0318 16:45:32.145323 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" path="/var/lib/kubelet/pods/15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc/volumes" Mar 18 16:45:39 crc kubenswrapper[4939]: I0318 16:45:39.132905 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:45:39 crc kubenswrapper[4939]: E0318 16:45:39.133681 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:45:54 crc kubenswrapper[4939]: I0318 16:45:54.133812 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:45:55 crc kubenswrapper[4939]: I0318 16:45:55.057125 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"c1529535b04797f3170ae75a75a13f6e75bee36d6cb1f01ed3ee7ded473eae36"} Mar 18 16:45:59 crc kubenswrapper[4939]: I0318 16:45:59.264464 4939 scope.go:117] "RemoveContainer" containerID="5518cf49e77b3fd1887a505d35c9a4dc8f3f7d00238b2b5bbfc9686b720de62b" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.144972 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564206-xhrtj"] Mar 18 16:46:00 crc kubenswrapper[4939]: E0318 16:46:00.145334 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="extract-utilities" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.145361 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="extract-utilities" Mar 18 16:46:00 crc kubenswrapper[4939]: E0318 16:46:00.145379 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="registry-server" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.145390 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="registry-server" Mar 18 16:46:00 crc 
kubenswrapper[4939]: E0318 16:46:00.145403 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="extract-content" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.145415 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="extract-content" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.145659 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="15245a9e-8fdd-4dea-bf0b-bc32ec1a94dc" containerName="registry-server" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.146192 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-xhrtj" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.149056 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.149807 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.150274 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.155085 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-xhrtj"] Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.292864 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6p6\" (UniqueName: \"kubernetes.io/projected/988b8871-92e7-4aeb-b82d-e698df53b4c1-kube-api-access-5k6p6\") pod \"auto-csr-approver-29564206-xhrtj\" (UID: \"988b8871-92e7-4aeb-b82d-e698df53b4c1\") " pod="openshift-infra/auto-csr-approver-29564206-xhrtj" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.394658 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k6p6\" (UniqueName: \"kubernetes.io/projected/988b8871-92e7-4aeb-b82d-e698df53b4c1-kube-api-access-5k6p6\") pod \"auto-csr-approver-29564206-xhrtj\" (UID: \"988b8871-92e7-4aeb-b82d-e698df53b4c1\") " pod="openshift-infra/auto-csr-approver-29564206-xhrtj" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.421449 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k6p6\" (UniqueName: \"kubernetes.io/projected/988b8871-92e7-4aeb-b82d-e698df53b4c1-kube-api-access-5k6p6\") pod \"auto-csr-approver-29564206-xhrtj\" (UID: \"988b8871-92e7-4aeb-b82d-e698df53b4c1\") " pod="openshift-infra/auto-csr-approver-29564206-xhrtj" Mar 18 16:46:00 crc kubenswrapper[4939]: I0318 16:46:00.472176 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-xhrtj" Mar 18 16:46:01 crc kubenswrapper[4939]: I0318 16:46:01.036793 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-xhrtj"] Mar 18 16:46:01 crc kubenswrapper[4939]: I0318 16:46:01.102233 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-xhrtj" event={"ID":"988b8871-92e7-4aeb-b82d-e698df53b4c1","Type":"ContainerStarted","Data":"8d123afffc3dc43e8ff4ed869e9529d3d117a1b2ee05fa451adbaf8c60942685"} Mar 18 16:46:03 crc kubenswrapper[4939]: I0318 16:46:03.119087 4939 generic.go:334] "Generic (PLEG): container finished" podID="988b8871-92e7-4aeb-b82d-e698df53b4c1" containerID="9c912e14a8661e595b7f849b34f4612b48f1eb5750ec72afd245a2c91df96c97" exitCode=0 Mar 18 16:46:03 crc kubenswrapper[4939]: I0318 16:46:03.119173 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-xhrtj" event={"ID":"988b8871-92e7-4aeb-b82d-e698df53b4c1","Type":"ContainerDied","Data":"9c912e14a8661e595b7f849b34f4612b48f1eb5750ec72afd245a2c91df96c97"} Mar 18 16:46:04 crc kubenswrapper[4939]: I0318 16:46:04.466854 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-xhrtj" Mar 18 16:46:04 crc kubenswrapper[4939]: I0318 16:46:04.554227 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k6p6\" (UniqueName: \"kubernetes.io/projected/988b8871-92e7-4aeb-b82d-e698df53b4c1-kube-api-access-5k6p6\") pod \"988b8871-92e7-4aeb-b82d-e698df53b4c1\" (UID: \"988b8871-92e7-4aeb-b82d-e698df53b4c1\") " Mar 18 16:46:04 crc kubenswrapper[4939]: I0318 16:46:04.561283 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988b8871-92e7-4aeb-b82d-e698df53b4c1-kube-api-access-5k6p6" (OuterVolumeSpecName: "kube-api-access-5k6p6") pod "988b8871-92e7-4aeb-b82d-e698df53b4c1" (UID: "988b8871-92e7-4aeb-b82d-e698df53b4c1"). InnerVolumeSpecName "kube-api-access-5k6p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:46:04 crc kubenswrapper[4939]: I0318 16:46:04.656438 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k6p6\" (UniqueName: \"kubernetes.io/projected/988b8871-92e7-4aeb-b82d-e698df53b4c1-kube-api-access-5k6p6\") on node \"crc\" DevicePath \"\"" Mar 18 16:46:05 crc kubenswrapper[4939]: I0318 16:46:05.176351 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-xhrtj" event={"ID":"988b8871-92e7-4aeb-b82d-e698df53b4c1","Type":"ContainerDied","Data":"8d123afffc3dc43e8ff4ed869e9529d3d117a1b2ee05fa451adbaf8c60942685"} Mar 18 16:46:05 crc kubenswrapper[4939]: I0318 16:46:05.176424 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d123afffc3dc43e8ff4ed869e9529d3d117a1b2ee05fa451adbaf8c60942685" Mar 18 16:46:05 crc kubenswrapper[4939]: I0318 16:46:05.176554 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-xhrtj" Mar 18 16:46:05 crc kubenswrapper[4939]: I0318 16:46:05.533595 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-s5vqx"] Mar 18 16:46:05 crc kubenswrapper[4939]: I0318 16:46:05.543168 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-s5vqx"] Mar 18 16:46:06 crc kubenswrapper[4939]: I0318 16:46:06.148633 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd72e48f-9a7a-40a3-97e2-da72baa08687" path="/var/lib/kubelet/pods/bd72e48f-9a7a-40a3-97e2-da72baa08687/volumes" Mar 18 16:46:11 crc kubenswrapper[4939]: I0318 16:46:11.968307 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sdn4s"] Mar 18 16:46:11 crc kubenswrapper[4939]: E0318 16:46:11.969359 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988b8871-92e7-4aeb-b82d-e698df53b4c1" containerName="oc" Mar 18 16:46:11 crc kubenswrapper[4939]: I0318 16:46:11.969383 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="988b8871-92e7-4aeb-b82d-e698df53b4c1" containerName="oc" Mar 18 16:46:11 crc kubenswrapper[4939]: I0318 16:46:11.969642 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="988b8871-92e7-4aeb-b82d-e698df53b4c1" containerName="oc" Mar 18 16:46:11 crc kubenswrapper[4939]: I0318 16:46:11.971233 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:11 crc kubenswrapper[4939]: I0318 16:46:11.982789 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdn4s"] Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.082868 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-catalog-content\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.082935 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd9qt\" (UniqueName: \"kubernetes.io/projected/6a2c13d8-5dcc-4194-ac07-9859937b735c-kube-api-access-nd9qt\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.082982 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-utilities\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.183913 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-catalog-content\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.184184 4939 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-nd9qt\" (UniqueName: \"kubernetes.io/projected/6a2c13d8-5dcc-4194-ac07-9859937b735c-kube-api-access-nd9qt\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.184218 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-utilities\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.184642 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-catalog-content\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.184652 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-utilities\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.209439 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd9qt\" (UniqueName: \"kubernetes.io/projected/6a2c13d8-5dcc-4194-ac07-9859937b735c-kube-api-access-nd9qt\") pod \"community-operators-sdn4s\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.315439 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:12 crc kubenswrapper[4939]: I0318 16:46:12.785009 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdn4s"] Mar 18 16:46:13 crc kubenswrapper[4939]: I0318 16:46:13.244576 4939 generic.go:334] "Generic (PLEG): container finished" podID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerID="3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554" exitCode=0 Mar 18 16:46:13 crc kubenswrapper[4939]: I0318 16:46:13.244666 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdn4s" event={"ID":"6a2c13d8-5dcc-4194-ac07-9859937b735c","Type":"ContainerDied","Data":"3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554"} Mar 18 16:46:13 crc kubenswrapper[4939]: I0318 16:46:13.244697 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdn4s" event={"ID":"6a2c13d8-5dcc-4194-ac07-9859937b735c","Type":"ContainerStarted","Data":"7261f3133c8b2b9a970bd5923a2af00b5e6dd2d8a1045de71ddac2d4802ca230"} Mar 18 16:46:14 crc kubenswrapper[4939]: I0318 16:46:14.255130 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdn4s" event={"ID":"6a2c13d8-5dcc-4194-ac07-9859937b735c","Type":"ContainerStarted","Data":"c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e"} Mar 18 16:46:15 crc kubenswrapper[4939]: I0318 16:46:15.264239 4939 generic.go:334] "Generic (PLEG): container finished" podID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerID="c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e" exitCode=0 Mar 18 16:46:15 crc kubenswrapper[4939]: I0318 16:46:15.264287 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdn4s" event={"ID":"6a2c13d8-5dcc-4194-ac07-9859937b735c","Type":"ContainerDied","Data":"c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e"} Mar 18 16:46:16 crc kubenswrapper[4939]: I0318 16:46:16.276625 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdn4s" event={"ID":"6a2c13d8-5dcc-4194-ac07-9859937b735c","Type":"ContainerStarted","Data":"151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23"} Mar 18 16:46:16 crc kubenswrapper[4939]: I0318 16:46:16.295620 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sdn4s" podStartSLOduration=2.865455046 podStartE2EDuration="5.295601125s" podCreationTimestamp="2026-03-18 16:46:11 +0000 UTC" firstStartedPulling="2026-03-18 16:46:13.247109407 +0000 UTC m=+4137.846297028" lastFinishedPulling="2026-03-18 16:46:15.677255486 +0000 UTC m=+4140.276443107" observedRunningTime="2026-03-18 16:46:16.293630379 +0000 UTC m=+4140.892818020" watchObservedRunningTime="2026-03-18 16:46:16.295601125 +0000 UTC m=+4140.894788746" Mar 18 16:46:22 crc kubenswrapper[4939]: I0318 16:46:22.316106 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:22 crc kubenswrapper[4939]: I0318 16:46:22.316843 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:22 crc kubenswrapper[4939]: I0318 16:46:22.385349 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:23 crc kubenswrapper[4939]: I0318 16:46:23.379892 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:23 crc kubenswrapper[4939]: I0318 16:46:23.438353 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdn4s"] Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.346402 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sdn4s" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="registry-server" containerID="cri-o://151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23" gracePeriod=2 Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.761226 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.908742 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd9qt\" (UniqueName: \"kubernetes.io/projected/6a2c13d8-5dcc-4194-ac07-9859937b735c-kube-api-access-nd9qt\") pod \"6a2c13d8-5dcc-4194-ac07-9859937b735c\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.908790 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-catalog-content\") pod \"6a2c13d8-5dcc-4194-ac07-9859937b735c\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.908921 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-utilities\") pod \"6a2c13d8-5dcc-4194-ac07-9859937b735c\" (UID: \"6a2c13d8-5dcc-4194-ac07-9859937b735c\") " Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.910847 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-utilities" (OuterVolumeSpecName: "utilities") pod "6a2c13d8-5dcc-4194-ac07-9859937b735c" (UID: "6a2c13d8-5dcc-4194-ac07-9859937b735c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.916283 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2c13d8-5dcc-4194-ac07-9859937b735c-kube-api-access-nd9qt" (OuterVolumeSpecName: "kube-api-access-nd9qt") pod "6a2c13d8-5dcc-4194-ac07-9859937b735c" (UID: "6a2c13d8-5dcc-4194-ac07-9859937b735c"). InnerVolumeSpecName "kube-api-access-nd9qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:46:25 crc kubenswrapper[4939]: I0318 16:46:25.973544 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a2c13d8-5dcc-4194-ac07-9859937b735c" (UID: "6a2c13d8-5dcc-4194-ac07-9859937b735c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.010410 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.010452 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd9qt\" (UniqueName: \"kubernetes.io/projected/6a2c13d8-5dcc-4194-ac07-9859937b735c-kube-api-access-nd9qt\") on node \"crc\" DevicePath \"\"" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.010551 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2c13d8-5dcc-4194-ac07-9859937b735c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.358232 4939 generic.go:334] "Generic (PLEG): container finished" podID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerID="151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23" exitCode=0 Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.358354 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdn4s" event={"ID":"6a2c13d8-5dcc-4194-ac07-9859937b735c","Type":"ContainerDied","Data":"151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23"} Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.358422 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdn4s" event={"ID":"6a2c13d8-5dcc-4194-ac07-9859937b735c","Type":"ContainerDied","Data":"7261f3133c8b2b9a970bd5923a2af00b5e6dd2d8a1045de71ddac2d4802ca230"} Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.358469 4939 scope.go:117] "RemoveContainer" containerID="151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.358694 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdn4s" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.398496 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdn4s"] Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.403176 4939 scope.go:117] "RemoveContainer" containerID="c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.405895 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sdn4s"] Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.429266 4939 scope.go:117] "RemoveContainer" containerID="3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.450011 4939 scope.go:117] "RemoveContainer" containerID="151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23" Mar 18 16:46:26 crc kubenswrapper[4939]: E0318 16:46:26.450427 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23\": container with ID starting with 151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23 not found: ID does not exist" containerID="151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.450461 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23"} err="failed to get container status \"151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23\": rpc error: code = NotFound desc = could not find container \"151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23\": container with ID starting with 151313bb6da2ef6d409168c6112f1593c9fd0fc03e90c490712fac11f6e79e23 not found: ID does not exist" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.450484 4939 scope.go:117] "RemoveContainer" containerID="c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e" Mar 18 16:46:26 crc kubenswrapper[4939]: E0318 16:46:26.450834 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e\": container with ID starting with c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e not found: ID does not exist" containerID="c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.450853 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e"} err="failed to get container status \"c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e\": rpc error: code = NotFound desc = could not find container \"c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e\": container with ID starting with c08bbdcc069976bfaa80c5dd87b7650fdd546faa9c96a4613bff52bbb4dcc98e not found: ID does not exist" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.450867 4939 scope.go:117] "RemoveContainer" containerID="3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554" Mar 18 16:46:26 crc kubenswrapper[4939]: E0318 16:46:26.451114 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554\": container with ID starting with 3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554 not found: ID does not exist" containerID="3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554" Mar 18 16:46:26 crc kubenswrapper[4939]: I0318 16:46:26.451139 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554"} err="failed to get container status \"3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554\": rpc error: code = NotFound desc = could not find container \"3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554\": container with ID starting with 3c5bd5321312e430cb8ee592f205409f90eddbc0791c822e7b6890dd1430a554 not found: ID does not exist" Mar 18 16:46:28 crc kubenswrapper[4939]: I0318 16:46:28.145921 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" path="/var/lib/kubelet/pods/6a2c13d8-5dcc-4194-ac07-9859937b735c/volumes" Mar 18 16:46:59 crc kubenswrapper[4939]: I0318 16:46:59.341642 4939 scope.go:117] "RemoveContainer" containerID="c05ea758df78162b6562fac2173de925718d1fb6abcb6246a39263a48577c735" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.145199 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564208-bmdlz"] Mar 18 16:48:00 crc kubenswrapper[4939]: E0318 16:48:00.146280 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="extract-content" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.146305 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="extract-content" Mar 18 16:48:00 crc kubenswrapper[4939]: E0318 16:48:00.146343 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="extract-utilities" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.146354 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="extract-utilities" Mar 18 16:48:00 crc kubenswrapper[4939]: E0318 16:48:00.146374 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="registry-server" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.146386 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="registry-server" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.146702 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2c13d8-5dcc-4194-ac07-9859937b735c" containerName="registry-server" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.147412 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-bmdlz" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.149456 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.150012 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.151008 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.152763 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-bmdlz"] Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.272257 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlnp6\" (UniqueName: \"kubernetes.io/projected/44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022-kube-api-access-rlnp6\") pod \"auto-csr-approver-29564208-bmdlz\" (UID: \"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022\") " pod="openshift-infra/auto-csr-approver-29564208-bmdlz" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.374223 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlnp6\" (UniqueName: \"kubernetes.io/projected/44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022-kube-api-access-rlnp6\") pod \"auto-csr-approver-29564208-bmdlz\" (UID: \"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022\") " pod="openshift-infra/auto-csr-approver-29564208-bmdlz" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.393932 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlnp6\" (UniqueName: \"kubernetes.io/projected/44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022-kube-api-access-rlnp6\") pod \"auto-csr-approver-29564208-bmdlz\" (UID: \"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022\") " pod="openshift-infra/auto-csr-approver-29564208-bmdlz" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.467850 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-bmdlz" Mar 18 16:48:00 crc kubenswrapper[4939]: I0318 16:48:00.873702 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-bmdlz"] Mar 18 16:48:01 crc kubenswrapper[4939]: I0318 16:48:01.056036 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-bmdlz" event={"ID":"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022","Type":"ContainerStarted","Data":"7b0acb93994b2303eb4175d760c05f0fae10fa34b8e1f4925c0a0f6bf223e760"} Mar 18 16:48:04 crc kubenswrapper[4939]: I0318 16:48:04.086689 4939 generic.go:334] "Generic (PLEG): container finished" podID="44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022" containerID="61ac5428e779bcc638d6d867bab1a375dde19fc40407e3db24b707bf62f08f9b" exitCode=0 Mar 18 16:48:04 crc kubenswrapper[4939]: I0318 16:48:04.086743 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-bmdlz" event={"ID":"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022","Type":"ContainerDied","Data":"61ac5428e779bcc638d6d867bab1a375dde19fc40407e3db24b707bf62f08f9b"} Mar 18 16:48:05 crc kubenswrapper[4939]: I0318 16:48:05.383582 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-bmdlz" Mar 18 16:48:05 crc kubenswrapper[4939]: I0318 16:48:05.548988 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlnp6\" (UniqueName: \"kubernetes.io/projected/44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022-kube-api-access-rlnp6\") pod \"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022\" (UID: \"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022\") " Mar 18 16:48:05 crc kubenswrapper[4939]: I0318 16:48:05.556703 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022-kube-api-access-rlnp6" (OuterVolumeSpecName: "kube-api-access-rlnp6") pod "44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022" (UID: "44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022"). InnerVolumeSpecName "kube-api-access-rlnp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:48:05 crc kubenswrapper[4939]: I0318 16:48:05.651024 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlnp6\" (UniqueName: \"kubernetes.io/projected/44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022-kube-api-access-rlnp6\") on node \"crc\" DevicePath \"\"" Mar 18 16:48:06 crc kubenswrapper[4939]: I0318 16:48:06.101291 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-bmdlz" event={"ID":"44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022","Type":"ContainerDied","Data":"7b0acb93994b2303eb4175d760c05f0fae10fa34b8e1f4925c0a0f6bf223e760"} Mar 18 16:48:06 crc kubenswrapper[4939]: I0318 16:48:06.101589 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b0acb93994b2303eb4175d760c05f0fae10fa34b8e1f4925c0a0f6bf223e760" Mar 18 16:48:06 crc kubenswrapper[4939]: I0318 16:48:06.101304 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-bmdlz" Mar 18 16:48:06 crc kubenswrapper[4939]: I0318 16:48:06.456370 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-k4xpv"] Mar 18 16:48:06 crc kubenswrapper[4939]: I0318 16:48:06.463761 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-k4xpv"] Mar 18 16:48:08 crc kubenswrapper[4939]: I0318 16:48:08.143780 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f982d93-4356-408c-8d35-644ae609832c" path="/var/lib/kubelet/pods/1f982d93-4356-408c-8d35-644ae609832c/volumes" Mar 18 16:48:23 crc kubenswrapper[4939]: I0318 16:48:23.687295 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:48:23 crc kubenswrapper[4939]: I0318 16:48:23.688707 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:48:53 crc kubenswrapper[4939]: I0318 16:48:53.687582 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:48:53 crc kubenswrapper[4939]: I0318 16:48:53.688164 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:48:59 crc kubenswrapper[4939]: I0318 16:48:59.442697 4939 scope.go:117] "RemoveContainer" containerID="70b6dd0dc0dba7007ce4033782cc647f1f33d7d05d846e2d00c3274166370f38" Mar 18 16:49:23 crc kubenswrapper[4939]: I0318 16:49:23.687254 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:49:23 crc kubenswrapper[4939]: I0318 16:49:23.687783 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:49:23 crc kubenswrapper[4939]: I0318 16:49:23.687843 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:49:23 crc kubenswrapper[4939]: I0318 16:49:23.689277 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c1529535b04797f3170ae75a75a13f6e75bee36d6cb1f01ed3ee7ded473eae36"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:49:23 crc kubenswrapper[4939]: I0318 16:49:23.689417 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://c1529535b04797f3170ae75a75a13f6e75bee36d6cb1f01ed3ee7ded473eae36" gracePeriod=600 Mar 18 16:49:24 crc kubenswrapper[4939]: I0318 16:49:24.728566 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="c1529535b04797f3170ae75a75a13f6e75bee36d6cb1f01ed3ee7ded473eae36" exitCode=0 Mar 18 16:49:24 crc kubenswrapper[4939]: I0318 16:49:24.728642 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"c1529535b04797f3170ae75a75a13f6e75bee36d6cb1f01ed3ee7ded473eae36"} Mar 18 16:49:24 crc kubenswrapper[4939]: I0318 16:49:24.729080 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be"} Mar 18 16:49:24 crc kubenswrapper[4939]: I0318 16:49:24.729118 4939 scope.go:117] "RemoveContainer" containerID="9db1932b4dd031c9374b2b4a0b4dbe23bc9402dec07ce7d4b69a17d985341ffe" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.149561 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564210-8dmdc"] Mar 18 16:50:00 crc kubenswrapper[4939]: E0318 16:50:00.151535 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022" containerName="oc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.151568 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022" containerName="oc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.151984 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022" containerName="oc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.153703 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-8dmdc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.157814 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.157954 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.158030 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.166666 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-8dmdc"] Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.307668 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84nwh\" (UniqueName: \"kubernetes.io/projected/5a12717b-5406-4aaa-989a-1de8b02fc9c7-kube-api-access-84nwh\") pod \"auto-csr-approver-29564210-8dmdc\" (UID: \"5a12717b-5406-4aaa-989a-1de8b02fc9c7\") " pod="openshift-infra/auto-csr-approver-29564210-8dmdc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.409708 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84nwh\" (UniqueName: \"kubernetes.io/projected/5a12717b-5406-4aaa-989a-1de8b02fc9c7-kube-api-access-84nwh\") pod \"auto-csr-approver-29564210-8dmdc\" (UID: \"5a12717b-5406-4aaa-989a-1de8b02fc9c7\") " pod="openshift-infra/auto-csr-approver-29564210-8dmdc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.437204 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84nwh\" (UniqueName: \"kubernetes.io/projected/5a12717b-5406-4aaa-989a-1de8b02fc9c7-kube-api-access-84nwh\") pod \"auto-csr-approver-29564210-8dmdc\" (UID: \"5a12717b-5406-4aaa-989a-1de8b02fc9c7\") " pod="openshift-infra/auto-csr-approver-29564210-8dmdc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.519255 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-8dmdc" Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.932246 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-8dmdc"] Mar 18 16:50:00 crc kubenswrapper[4939]: W0318 16:50:00.944228 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a12717b_5406_4aaa_989a_1de8b02fc9c7.slice/crio-952e5f270bdd4b4788a7b2320b72352f38f5c7a9b182c66cbb1130d7c5dcaba6 WatchSource:0}: Error finding container 952e5f270bdd4b4788a7b2320b72352f38f5c7a9b182c66cbb1130d7c5dcaba6: Status 404 returned error can't find the container with id 952e5f270bdd4b4788a7b2320b72352f38f5c7a9b182c66cbb1130d7c5dcaba6 Mar 18 16:50:00 crc kubenswrapper[4939]: I0318 16:50:00.950627 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:50:01 crc kubenswrapper[4939]: I0318 16:50:01.026235 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-8dmdc" event={"ID":"5a12717b-5406-4aaa-989a-1de8b02fc9c7","Type":"ContainerStarted","Data":"952e5f270bdd4b4788a7b2320b72352f38f5c7a9b182c66cbb1130d7c5dcaba6"} Mar 18 16:50:03 crc kubenswrapper[4939]: I0318 16:50:03.044025 4939 generic.go:334] "Generic (PLEG): container finished" podID="5a12717b-5406-4aaa-989a-1de8b02fc9c7" containerID="b2f74d03993074196d50541769f03729c3d00b1010e795e87abde64a8db9448e" exitCode=0 Mar 18 16:50:03 crc kubenswrapper[4939]: I0318 16:50:03.044106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-8dmdc" event={"ID":"5a12717b-5406-4aaa-989a-1de8b02fc9c7","Type":"ContainerDied","Data":"b2f74d03993074196d50541769f03729c3d00b1010e795e87abde64a8db9448e"} Mar 18 16:50:04 crc kubenswrapper[4939]: I0318 16:50:04.313766 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-8dmdc" Mar 18 16:50:04 crc kubenswrapper[4939]: I0318 16:50:04.464897 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84nwh\" (UniqueName: \"kubernetes.io/projected/5a12717b-5406-4aaa-989a-1de8b02fc9c7-kube-api-access-84nwh\") pod \"5a12717b-5406-4aaa-989a-1de8b02fc9c7\" (UID: \"5a12717b-5406-4aaa-989a-1de8b02fc9c7\") " Mar 18 16:50:04 crc kubenswrapper[4939]: I0318 16:50:04.470656 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a12717b-5406-4aaa-989a-1de8b02fc9c7-kube-api-access-84nwh" (OuterVolumeSpecName: "kube-api-access-84nwh") pod "5a12717b-5406-4aaa-989a-1de8b02fc9c7" (UID: "5a12717b-5406-4aaa-989a-1de8b02fc9c7"). InnerVolumeSpecName "kube-api-access-84nwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:50:04 crc kubenswrapper[4939]: I0318 16:50:04.566548 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84nwh\" (UniqueName: \"kubernetes.io/projected/5a12717b-5406-4aaa-989a-1de8b02fc9c7-kube-api-access-84nwh\") on node \"crc\" DevicePath \"\"" Mar 18 16:50:05 crc kubenswrapper[4939]: I0318 16:50:05.061486 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-8dmdc" event={"ID":"5a12717b-5406-4aaa-989a-1de8b02fc9c7","Type":"ContainerDied","Data":"952e5f270bdd4b4788a7b2320b72352f38f5c7a9b182c66cbb1130d7c5dcaba6"} Mar 18 16:50:05 crc kubenswrapper[4939]: I0318 16:50:05.061559 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952e5f270bdd4b4788a7b2320b72352f38f5c7a9b182c66cbb1130d7c5dcaba6" Mar 18 16:50:05 crc kubenswrapper[4939]: I0318 16:50:05.061579 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-8dmdc" Mar 18 16:50:05 crc kubenswrapper[4939]: I0318 16:50:05.377420 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-jnrpf"] Mar 18 16:50:05 crc kubenswrapper[4939]: I0318 16:50:05.382320 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-jnrpf"] Mar 18 16:50:06 crc kubenswrapper[4939]: I0318 16:50:06.143267 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4fe94a-9397-4a66-86ae-402dbb5bc5bf" path="/var/lib/kubelet/pods/ba4fe94a-9397-4a66-86ae-402dbb5bc5bf/volumes" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.506399 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v8whh"] Mar 18 16:50:32 crc kubenswrapper[4939]: E0318 16:50:32.507372 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a12717b-5406-4aaa-989a-1de8b02fc9c7" containerName="oc" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.507392 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a12717b-5406-4aaa-989a-1de8b02fc9c7" containerName="oc" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.507616 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a12717b-5406-4aaa-989a-1de8b02fc9c7" containerName="oc" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.508926 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.530400 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8whh"] Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.605106 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj4w6\" (UniqueName: \"kubernetes.io/projected/2d1f1174-38d5-47cb-ae10-228f67e5a825-kube-api-access-qj4w6\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.605160 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-catalog-content\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.605188 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-utilities\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.707261 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj4w6\" (UniqueName: \"kubernetes.io/projected/2d1f1174-38d5-47cb-ae10-228f67e5a825-kube-api-access-qj4w6\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.707318 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-catalog-content\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.707348 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-utilities\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.707875 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-utilities\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.707936 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-catalog-content\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.734437 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qj4w6\" (UniqueName: \"kubernetes.io/projected/2d1f1174-38d5-47cb-ae10-228f67e5a825-kube-api-access-qj4w6\") pod \"redhat-operators-v8whh\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") " pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:32 crc kubenswrapper[4939]: I0318 16:50:32.837601 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:33 crc kubenswrapper[4939]: I0318 16:50:33.275980 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8whh"] Mar 18 16:50:34 crc kubenswrapper[4939]: I0318 16:50:34.268868 4939 generic.go:334] "Generic (PLEG): container finished" podID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerID="9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8" exitCode=0 Mar 18 16:50:34 crc kubenswrapper[4939]: I0318 16:50:34.269005 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8whh" event={"ID":"2d1f1174-38d5-47cb-ae10-228f67e5a825","Type":"ContainerDied","Data":"9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8"} Mar 18 16:50:34 crc kubenswrapper[4939]: I0318 16:50:34.269239 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8whh" event={"ID":"2d1f1174-38d5-47cb-ae10-228f67e5a825","Type":"ContainerStarted","Data":"09916171844aa5bc7dafbacad75a5fcb1c479d537060066a7dd79ad1b7db2d91"} Mar 18 16:50:35 crc kubenswrapper[4939]: I0318 16:50:35.277458 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8whh" event={"ID":"2d1f1174-38d5-47cb-ae10-228f67e5a825","Type":"ContainerStarted","Data":"84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b"} Mar 18 16:50:36 crc kubenswrapper[4939]: I0318 16:50:36.291256 4939 generic.go:334] "Generic (PLEG): container finished" podID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerID="84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b" exitCode=0 Mar 18 16:50:36 crc kubenswrapper[4939]: I0318 16:50:36.291389 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8whh" event={"ID":"2d1f1174-38d5-47cb-ae10-228f67e5a825","Type":"ContainerDied","Data":"84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b"} Mar 18 16:50:37 crc kubenswrapper[4939]: I0318 16:50:37.302348 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8whh" event={"ID":"2d1f1174-38d5-47cb-ae10-228f67e5a825","Type":"ContainerStarted","Data":"31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf"} Mar 18 16:50:37 crc kubenswrapper[4939]: I0318 16:50:37.324843 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v8whh" podStartSLOduration=2.736147989 podStartE2EDuration="5.324818833s" podCreationTimestamp="2026-03-18 16:50:32 +0000 UTC" firstStartedPulling="2026-03-18 16:50:34.270863177 +0000 UTC m=+4398.870050798" lastFinishedPulling="2026-03-18 16:50:36.859534021 +0000 UTC m=+4401.458721642" observedRunningTime="2026-03-18 16:50:37.321817388 +0000 UTC m=+4401.921004999" watchObservedRunningTime="2026-03-18 16:50:37.324818833 +0000 UTC m=+4401.924006454" Mar 18 16:50:42 crc kubenswrapper[4939]: I0318 16:50:42.838778 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v8whh" 
Mar 18 16:50:42 crc kubenswrapper[4939]: I0318 16:50:42.839355 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:43 crc kubenswrapper[4939]: I0318 16:50:43.876242 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8whh" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="registry-server" probeResult="failure" output=< Mar 18 16:50:43 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 16:50:43 crc kubenswrapper[4939]: > Mar 18 16:50:52 crc kubenswrapper[4939]: I0318 16:50:52.884416 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:52 crc kubenswrapper[4939]: I0318 16:50:52.939349 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:53 crc kubenswrapper[4939]: I0318 16:50:53.122839 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8whh"] Mar 18 16:50:54 crc kubenswrapper[4939]: I0318 16:50:54.438427 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v8whh" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="registry-server" containerID="cri-o://31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf" gracePeriod=2 Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.326640 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.446668 4939 generic.go:334] "Generic (PLEG): container finished" podID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerID="31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf" exitCode=0 Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.446718 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8whh" event={"ID":"2d1f1174-38d5-47cb-ae10-228f67e5a825","Type":"ContainerDied","Data":"31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf"} Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.446749 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8whh" event={"ID":"2d1f1174-38d5-47cb-ae10-228f67e5a825","Type":"ContainerDied","Data":"09916171844aa5bc7dafbacad75a5fcb1c479d537060066a7dd79ad1b7db2d91"} Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.446771 4939 scope.go:117] "RemoveContainer" containerID="31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.446917 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8whh" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.468769 4939 scope.go:117] "RemoveContainer" containerID="84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.490560 4939 scope.go:117] "RemoveContainer" containerID="9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.512925 4939 scope.go:117] "RemoveContainer" containerID="31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf" Mar 18 16:50:55 crc kubenswrapper[4939]: E0318 16:50:55.513411 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf\": container with ID starting with 31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf not found: ID does not exist" containerID="31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.513457 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf"} err="failed to get container status \"31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf\": rpc error: code = NotFound desc = could not find container \"31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf\": container with ID starting with 31a3be24b7ba893c6402f115f0b48e9ff7da04753fc509df5be59bca1720debf not found: ID does not exist" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.513485 4939 scope.go:117] "RemoveContainer" containerID="84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b" Mar 18 16:50:55 crc kubenswrapper[4939]: E0318 16:50:55.513907 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b\": container with ID starting with 84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b not found: ID does not exist" containerID="84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.513929 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b"} err="failed to get container status \"84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b\": rpc error: code = NotFound desc = could not find container \"84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b\": container with ID starting with 84ae10d03403bdb97787292dbd73a12ea10053824e4483688a230e07ce6bb85b not found: ID does not exist" Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.513940 4939 scope.go:117] "RemoveContainer" containerID="9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8" Mar 18 16:50:55 crc kubenswrapper[4939]: E0318 16:50:55.514350 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8\": container with ID starting with 9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8 not found: ID does not exist" containerID="9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8" 
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.514376 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8"} err="failed to get container status \"9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8\": rpc error: code = NotFound desc = could not find container \"9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8\": container with ID starting with 9e860e5af95b7da3480b79055687782aa856f7fb9500d35c27ef5c2f456f16e8 not found: ID does not exist"
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.520924 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj4w6\" (UniqueName: \"kubernetes.io/projected/2d1f1174-38d5-47cb-ae10-228f67e5a825-kube-api-access-qj4w6\") pod \"2d1f1174-38d5-47cb-ae10-228f67e5a825\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") "
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.520971 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-utilities\") pod \"2d1f1174-38d5-47cb-ae10-228f67e5a825\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") "
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.521014 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-catalog-content\") pod \"2d1f1174-38d5-47cb-ae10-228f67e5a825\" (UID: \"2d1f1174-38d5-47cb-ae10-228f67e5a825\") "
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.522646 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-utilities" (OuterVolumeSpecName: "utilities") pod "2d1f1174-38d5-47cb-ae10-228f67e5a825" (UID: "2d1f1174-38d5-47cb-ae10-228f67e5a825"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.527861 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1f1174-38d5-47cb-ae10-228f67e5a825-kube-api-access-qj4w6" (OuterVolumeSpecName: "kube-api-access-qj4w6") pod "2d1f1174-38d5-47cb-ae10-228f67e5a825" (UID: "2d1f1174-38d5-47cb-ae10-228f67e5a825"). InnerVolumeSpecName "kube-api-access-qj4w6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.623199 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj4w6\" (UniqueName: \"kubernetes.io/projected/2d1f1174-38d5-47cb-ae10-228f67e5a825-kube-api-access-qj4w6\") on node \"crc\" DevicePath \"\""
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.623270 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.665159 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d1f1174-38d5-47cb-ae10-228f67e5a825" (UID: "2d1f1174-38d5-47cb-ae10-228f67e5a825"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.724846 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d1f1174-38d5-47cb-ae10-228f67e5a825-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.783605 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8whh"]
Mar 18 16:50:55 crc kubenswrapper[4939]: I0318 16:50:55.791034 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v8whh"]
Mar 18 16:50:56 crc kubenswrapper[4939]: I0318 16:50:56.143390 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" path="/var/lib/kubelet/pods/2d1f1174-38d5-47cb-ae10-228f67e5a825/volumes"
Mar 18 16:50:59 crc kubenswrapper[4939]: I0318 16:50:59.521532 4939 scope.go:117] "RemoveContainer" containerID="b61adc5cdaf45d57a6677a675b78b1e3b5555ae80ef398f56c02a22cdb568fc0"
Mar 18 16:51:53 crc kubenswrapper[4939]: I0318 16:51:53.687798 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:51:53 crc kubenswrapper[4939]: I0318 16:51:53.688386 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.143629 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564212-z29g9"]
Mar 18 16:52:00 crc kubenswrapper[4939]: E0318 16:52:00.146666 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="extract-utilities"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.146952 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="extract-utilities"
Mar 18 16:52:00 crc kubenswrapper[4939]: E0318 16:52:00.147039 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="extract-content"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.147112 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="extract-content"
Mar 18 16:52:00 crc kubenswrapper[4939]: E0318 16:52:00.147196 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="registry-server"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.147266 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="registry-server"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.147534 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1f1174-38d5-47cb-ae10-228f67e5a825" containerName="registry-server"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.148196 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-z29g9"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.150965 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.151006 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.151068 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-z29g9"]
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.151487 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.328273 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sfwn\" (UniqueName: \"kubernetes.io/projected/fc634fd1-df6f-4086-95b7-d1b644c048d0-kube-api-access-7sfwn\") pod \"auto-csr-approver-29564212-z29g9\" (UID: \"fc634fd1-df6f-4086-95b7-d1b644c048d0\") " pod="openshift-infra/auto-csr-approver-29564212-z29g9"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.430096 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sfwn\" (UniqueName: \"kubernetes.io/projected/fc634fd1-df6f-4086-95b7-d1b644c048d0-kube-api-access-7sfwn\") pod \"auto-csr-approver-29564212-z29g9\" (UID: \"fc634fd1-df6f-4086-95b7-d1b644c048d0\") " pod="openshift-infra/auto-csr-approver-29564212-z29g9"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.447882 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sfwn\" (UniqueName: \"kubernetes.io/projected/fc634fd1-df6f-4086-95b7-d1b644c048d0-kube-api-access-7sfwn\") pod \"auto-csr-approver-29564212-z29g9\" (UID: \"fc634fd1-df6f-4086-95b7-d1b644c048d0\") " pod="openshift-infra/auto-csr-approver-29564212-z29g9"
Mar 18 16:52:00 crc kubenswrapper[4939]: I0318 16:52:00.504883 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-z29g9"
Mar 18 16:52:01 crc kubenswrapper[4939]: I0318 16:52:01.340432 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-z29g9"]
Mar 18 16:52:01 crc kubenswrapper[4939]: I0318 16:52:01.977469 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-z29g9" event={"ID":"fc634fd1-df6f-4086-95b7-d1b644c048d0","Type":"ContainerStarted","Data":"12957c2c7ab02721f8bbf124ebeba89af67a4ed33289311a257dd08083761fdb"}
Mar 18 16:52:02 crc kubenswrapper[4939]: I0318 16:52:02.984349 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-z29g9" event={"ID":"fc634fd1-df6f-4086-95b7-d1b644c048d0","Type":"ContainerStarted","Data":"1adf7f728affe4d8b9fda3dc805ce891fed174baab991b3882d96b94bd54e10a"}
Mar 18 16:52:03 crc kubenswrapper[4939]: I0318 16:52:03.002100 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564212-z29g9" podStartSLOduration=1.8092796519999998 podStartE2EDuration="3.002080667s" podCreationTimestamp="2026-03-18 16:52:00 +0000 UTC" firstStartedPulling="2026-03-18 16:52:01.346121638 +0000 UTC m=+4485.945309259" lastFinishedPulling="2026-03-18 16:52:02.538922653 +0000 UTC m=+4487.138110274" observedRunningTime="2026-03-18 16:52:03.0004193 +0000 UTC m=+4487.599606941" watchObservedRunningTime="2026-03-18 16:52:03.002080667 +0000 UTC m=+4487.601268288"
Mar 18 16:52:03 crc kubenswrapper[4939]: I0318 16:52:03.993463 4939 generic.go:334] "Generic (PLEG): container finished" podID="fc634fd1-df6f-4086-95b7-d1b644c048d0" containerID="1adf7f728affe4d8b9fda3dc805ce891fed174baab991b3882d96b94bd54e10a" exitCode=0
Mar 18 16:52:03 crc kubenswrapper[4939]: I0318 16:52:03.993566 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-z29g9" event={"ID":"fc634fd1-df6f-4086-95b7-d1b644c048d0","Type":"ContainerDied","Data":"1adf7f728affe4d8b9fda3dc805ce891fed174baab991b3882d96b94bd54e10a"}
Mar 18 16:52:05 crc kubenswrapper[4939]: I0318 16:52:05.259034 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-z29g9"
Mar 18 16:52:05 crc kubenswrapper[4939]: I0318 16:52:05.409244 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sfwn\" (UniqueName: \"kubernetes.io/projected/fc634fd1-df6f-4086-95b7-d1b644c048d0-kube-api-access-7sfwn\") pod \"fc634fd1-df6f-4086-95b7-d1b644c048d0\" (UID: \"fc634fd1-df6f-4086-95b7-d1b644c048d0\") "
Mar 18 16:52:05 crc kubenswrapper[4939]: I0318 16:52:05.415391 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc634fd1-df6f-4086-95b7-d1b644c048d0-kube-api-access-7sfwn" (OuterVolumeSpecName: "kube-api-access-7sfwn") pod "fc634fd1-df6f-4086-95b7-d1b644c048d0" (UID: "fc634fd1-df6f-4086-95b7-d1b644c048d0"). InnerVolumeSpecName "kube-api-access-7sfwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:52:05 crc kubenswrapper[4939]: I0318 16:52:05.511412 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sfwn\" (UniqueName: \"kubernetes.io/projected/fc634fd1-df6f-4086-95b7-d1b644c048d0-kube-api-access-7sfwn\") on node \"crc\" DevicePath \"\""
Mar 18 16:52:06 crc kubenswrapper[4939]: I0318 16:52:06.009813 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-z29g9" event={"ID":"fc634fd1-df6f-4086-95b7-d1b644c048d0","Type":"ContainerDied","Data":"12957c2c7ab02721f8bbf124ebeba89af67a4ed33289311a257dd08083761fdb"}
Mar 18 16:52:06 crc kubenswrapper[4939]: I0318 16:52:06.009908 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12957c2c7ab02721f8bbf124ebeba89af67a4ed33289311a257dd08083761fdb"
Mar 18 16:52:06 crc kubenswrapper[4939]: I0318 16:52:06.009835 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-z29g9"
Mar 18 16:52:06 crc kubenswrapper[4939]: I0318 16:52:06.070451 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-xhrtj"]
Mar 18 16:52:06 crc kubenswrapper[4939]: I0318 16:52:06.080681 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-xhrtj"]
Mar 18 16:52:06 crc kubenswrapper[4939]: I0318 16:52:06.141770 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988b8871-92e7-4aeb-b82d-e698df53b4c1" path="/var/lib/kubelet/pods/988b8871-92e7-4aeb-b82d-e698df53b4c1/volumes"
Mar 18 16:52:23 crc kubenswrapper[4939]: I0318 16:52:23.687794 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:52:23 crc kubenswrapper[4939]: I0318 16:52:23.688787 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.288917 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7mffb"]
Mar 18 16:52:46 crc kubenswrapper[4939]: E0318 16:52:46.289635 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc634fd1-df6f-4086-95b7-d1b644c048d0" containerName="oc"
Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.289648 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc634fd1-df6f-4086-95b7-d1b644c048d0" containerName="oc"
Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.292218 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc634fd1-df6f-4086-95b7-d1b644c048d0" containerName="oc"
Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.294875 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mffb"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.301716 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mffb"] Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.334567 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-utilities\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.334644 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-catalog-content\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.334736 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rd22\" (UniqueName: \"kubernetes.io/projected/35eb2487-57fa-44bb-be83-07b8f1688300-kube-api-access-4rd22\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.435885 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-utilities\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.435940 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-catalog-content\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.436017 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rd22\" (UniqueName: \"kubernetes.io/projected/35eb2487-57fa-44bb-be83-07b8f1688300-kube-api-access-4rd22\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.437206 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-utilities\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.437210 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-catalog-content\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.461490 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4rd22\" (UniqueName: \"kubernetes.io/projected/35eb2487-57fa-44bb-be83-07b8f1688300-kube-api-access-4rd22\") pod \"redhat-marketplace-7mffb\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:46 crc kubenswrapper[4939]: I0318 16:52:46.622559 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:47 crc kubenswrapper[4939]: I0318 16:52:47.085039 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mffb"] Mar 18 16:52:47 crc kubenswrapper[4939]: I0318 16:52:47.326346 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mffb" event={"ID":"35eb2487-57fa-44bb-be83-07b8f1688300","Type":"ContainerStarted","Data":"14bb4e241bef482d2226c1837affd0df5fc201ec0c1246a7fa4f60f2ae4e3216"} Mar 18 16:52:48 crc kubenswrapper[4939]: I0318 16:52:48.335375 4939 generic.go:334] "Generic (PLEG): container finished" podID="35eb2487-57fa-44bb-be83-07b8f1688300" containerID="7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5" exitCode=0 Mar 18 16:52:48 crc kubenswrapper[4939]: I0318 16:52:48.335485 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mffb" event={"ID":"35eb2487-57fa-44bb-be83-07b8f1688300","Type":"ContainerDied","Data":"7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5"} Mar 18 16:52:49 crc kubenswrapper[4939]: I0318 16:52:49.350495 4939 generic.go:334] "Generic (PLEG): container finished" podID="35eb2487-57fa-44bb-be83-07b8f1688300" containerID="b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252" exitCode=0 Mar 18 16:52:49 crc kubenswrapper[4939]: I0318 16:52:49.350652 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mffb" event={"ID":"35eb2487-57fa-44bb-be83-07b8f1688300","Type":"ContainerDied","Data":"b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252"} Mar 18 16:52:50 crc kubenswrapper[4939]: I0318 16:52:50.359723 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mffb" event={"ID":"35eb2487-57fa-44bb-be83-07b8f1688300","Type":"ContainerStarted","Data":"45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2"} Mar 18 16:52:50 crc kubenswrapper[4939]: I0318 16:52:50.380566 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7mffb" podStartSLOduration=2.983433029 podStartE2EDuration="4.380547118s" podCreationTimestamp="2026-03-18 16:52:46 +0000 UTC" firstStartedPulling="2026-03-18 16:52:48.336922012 +0000 UTC m=+4532.936109633" lastFinishedPulling="2026-03-18 16:52:49.734036101 +0000 UTC m=+4534.333223722" observedRunningTime="2026-03-18 16:52:50.37531828 +0000 UTC m=+4534.974505911" watchObservedRunningTime="2026-03-18 16:52:50.380547118 +0000 UTC m=+4534.979734729" Mar 18 16:52:53 crc kubenswrapper[4939]: I0318 16:52:53.688127 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:52:53 crc kubenswrapper[4939]: I0318 16:52:53.688528 4939 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:52:53 crc kubenswrapper[4939]: I0318 16:52:53.688581 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 16:52:53 crc kubenswrapper[4939]: I0318 16:52:53.689364 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:52:53 crc kubenswrapper[4939]: I0318 16:52:53.689457 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" gracePeriod=600 Mar 18 16:52:53 crc kubenswrapper[4939]: E0318 16:52:53.814012 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:52:54 crc kubenswrapper[4939]: I0318 16:52:54.392572 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" exitCode=0 Mar 18 16:52:54 crc kubenswrapper[4939]: I0318 16:52:54.392622 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be"} Mar 18 16:52:54 crc kubenswrapper[4939]: I0318 16:52:54.392847 4939 scope.go:117] "RemoveContainer" containerID="c1529535b04797f3170ae75a75a13f6e75bee36d6cb1f01ed3ee7ded473eae36" Mar 18 16:52:54 crc kubenswrapper[4939]: I0318 16:52:54.393535 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:52:54 crc kubenswrapper[4939]: E0318 16:52:54.393880 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:52:56 crc kubenswrapper[4939]: I0318 16:52:56.622794 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:56 crc kubenswrapper[4939]: I0318 16:52:56.623285 4939 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:56 crc kubenswrapper[4939]: I0318 16:52:56.667599 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:57 crc kubenswrapper[4939]: I0318 16:52:57.496176 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:52:57 crc kubenswrapper[4939]: I0318 16:52:57.550185 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mffb"] Mar 18 16:52:59 crc kubenswrapper[4939]: I0318 16:52:59.432371 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7mffb" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="registry-server" containerID="cri-o://45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2" gracePeriod=2 Mar 18 16:52:59 crc kubenswrapper[4939]: I0318 16:52:59.619113 4939 scope.go:117] "RemoveContainer" containerID="9c912e14a8661e595b7f849b34f4612b48f1eb5750ec72afd245a2c91df96c97" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.324147 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.437218 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rd22\" (UniqueName: \"kubernetes.io/projected/35eb2487-57fa-44bb-be83-07b8f1688300-kube-api-access-4rd22\") pod \"35eb2487-57fa-44bb-be83-07b8f1688300\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.438381 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-catalog-content\") pod \"35eb2487-57fa-44bb-be83-07b8f1688300\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.438425 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-utilities\") pod \"35eb2487-57fa-44bb-be83-07b8f1688300\" (UID: \"35eb2487-57fa-44bb-be83-07b8f1688300\") " Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.439566 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-utilities" (OuterVolumeSpecName: "utilities") pod "35eb2487-57fa-44bb-be83-07b8f1688300" (UID: "35eb2487-57fa-44bb-be83-07b8f1688300"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.441835 4939 generic.go:334] "Generic (PLEG): container finished" podID="35eb2487-57fa-44bb-be83-07b8f1688300" containerID="45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2" exitCode=0 Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.441874 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mffb" event={"ID":"35eb2487-57fa-44bb-be83-07b8f1688300","Type":"ContainerDied","Data":"45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2"} Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.441902 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mffb" event={"ID":"35eb2487-57fa-44bb-be83-07b8f1688300","Type":"ContainerDied","Data":"14bb4e241bef482d2226c1837affd0df5fc201ec0c1246a7fa4f60f2ae4e3216"} Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.441918 4939 scope.go:117] "RemoveContainer" containerID="45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.442153 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mffb" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.444885 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35eb2487-57fa-44bb-be83-07b8f1688300-kube-api-access-4rd22" (OuterVolumeSpecName: "kube-api-access-4rd22") pod "35eb2487-57fa-44bb-be83-07b8f1688300" (UID: "35eb2487-57fa-44bb-be83-07b8f1688300"). InnerVolumeSpecName "kube-api-access-4rd22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.476868 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35eb2487-57fa-44bb-be83-07b8f1688300" (UID: "35eb2487-57fa-44bb-be83-07b8f1688300"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.479543 4939 scope.go:117] "RemoveContainer" containerID="b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.501756 4939 scope.go:117] "RemoveContainer" containerID="7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.519910 4939 scope.go:117] "RemoveContainer" containerID="45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2" Mar 18 16:53:00 crc kubenswrapper[4939]: E0318 16:53:00.520339 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2\": container with ID starting with 45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2 not found: ID does not exist" containerID="45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.520419 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2"} err="failed to get container status \"45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2\": rpc error: code = NotFound desc = could not find container \"45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2\": container with ID starting with 45e497c01506eda35378734e39b0415119e440609c221a9a05a72e47584285d2 not found: ID does not exist" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.520494 4939 scope.go:117] "RemoveContainer" containerID="b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252" Mar 18 16:53:00 crc kubenswrapper[4939]: E0318 16:53:00.521421 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252\": container with ID starting with b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252 not found: ID does not exist" containerID="b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.521458 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252"} err="failed to get container status \"b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252\": rpc error: code = NotFound desc = could not find container \"b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252\": container with ID starting with b077eca4353c8b608997b4df834d2f63413f539d696e6eaed02da36c9dfeb252 not found: ID does not exist" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.521496 4939 scope.go:117] "RemoveContainer" containerID="7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5" Mar 18 16:53:00 crc kubenswrapper[4939]: E0318 16:53:00.522225 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5\": container with ID starting with 7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5 not found: ID does not exist" containerID="7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5" Mar 18 16:53:00 crc 
kubenswrapper[4939]: I0318 16:53:00.522275 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5"} err="failed to get container status \"7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5\": rpc error: code = NotFound desc = could not find container \"7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5\": container with ID starting with 7b023e347d5ab6dd30d73f1659e955a993a2d1ba29e782d22ed15e7fe6639ca5 not found: ID does not exist" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.540577 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.540624 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35eb2487-57fa-44bb-be83-07b8f1688300-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.540656 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rd22\" (UniqueName: \"kubernetes.io/projected/35eb2487-57fa-44bb-be83-07b8f1688300-kube-api-access-4rd22\") on node \"crc\" DevicePath \"\"" Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.772383 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mffb"] Mar 18 16:53:00 crc kubenswrapper[4939]: I0318 16:53:00.779924 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mffb"] Mar 18 16:53:02 crc kubenswrapper[4939]: I0318 16:53:02.146005 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" path="/var/lib/kubelet/pods/35eb2487-57fa-44bb-be83-07b8f1688300/volumes" Mar 18 16:53:07 crc kubenswrapper[4939]: I0318 16:53:07.133176 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:53:07 crc kubenswrapper[4939]: E0318 16:53:07.134228 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:53:21 crc kubenswrapper[4939]: I0318 16:53:21.132691 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:53:21 crc kubenswrapper[4939]: E0318 16:53:21.133337 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:53:36 crc kubenswrapper[4939]: I0318 16:53:36.139431 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:53:36 crc kubenswrapper[4939]: E0318 
16:53:36.140284 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:53:50 crc kubenswrapper[4939]: I0318 16:53:50.133610 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:53:50 crc kubenswrapper[4939]: E0318 16:53:50.134234 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.146472 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564214-n4cml"] Mar 18 16:54:00 crc kubenswrapper[4939]: E0318 16:54:00.147255 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="registry-server" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.147269 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="registry-server" Mar 18 16:54:00 crc kubenswrapper[4939]: E0318 16:54:00.147294 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="extract-content" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.147303 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="extract-content" Mar 18 16:54:00 crc kubenswrapper[4939]: E0318 16:54:00.147319 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="extract-utilities" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.147327 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="extract-utilities" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.147488 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="35eb2487-57fa-44bb-be83-07b8f1688300" containerName="registry-server" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.148019 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-n4cml" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.149157 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-n4cml"] Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.150097 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.150201 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.150334 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.331703 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbzb\" (UniqueName: \"kubernetes.io/projected/9b025d40-8c7d-437c-9068-f2a9709185e1-kube-api-access-4wbzb\") pod \"auto-csr-approver-29564214-n4cml\" (UID: \"9b025d40-8c7d-437c-9068-f2a9709185e1\") " pod="openshift-infra/auto-csr-approver-29564214-n4cml" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.433118 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbzb\" (UniqueName: \"kubernetes.io/projected/9b025d40-8c7d-437c-9068-f2a9709185e1-kube-api-access-4wbzb\") pod \"auto-csr-approver-29564214-n4cml\" (UID: \"9b025d40-8c7d-437c-9068-f2a9709185e1\") " pod="openshift-infra/auto-csr-approver-29564214-n4cml" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.465661 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbzb\" (UniqueName: \"kubernetes.io/projected/9b025d40-8c7d-437c-9068-f2a9709185e1-kube-api-access-4wbzb\") pod \"auto-csr-approver-29564214-n4cml\" (UID: \"9b025d40-8c7d-437c-9068-f2a9709185e1\") " pod="openshift-infra/auto-csr-approver-29564214-n4cml" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.492735 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-n4cml" Mar 18 16:54:00 crc kubenswrapper[4939]: I0318 16:54:00.892701 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-n4cml"] Mar 18 16:54:01 crc kubenswrapper[4939]: I0318 16:54:01.450816 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564214-n4cml" event={"ID":"9b025d40-8c7d-437c-9068-f2a9709185e1","Type":"ContainerStarted","Data":"b43cb998fdebc00805d65faea8bb1dc2425fc94b179ba7ae4bd6247c4ea95e0b"} Mar 18 16:54:03 crc kubenswrapper[4939]: I0318 16:54:03.133430 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:54:03 crc kubenswrapper[4939]: E0318 16:54:03.134260 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:54:03 crc kubenswrapper[4939]: I0318 16:54:03.467414 4939 generic.go:334] "Generic (PLEG): container finished" podID="9b025d40-8c7d-437c-9068-f2a9709185e1" containerID="1e2c55e7cc1e982c0c6586f59c782e5736028a183a4b700ff0c424646dc97d6d" exitCode=0 Mar 18 16:54:03 crc kubenswrapper[4939]: I0318 16:54:03.467472 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564214-n4cml" event={"ID":"9b025d40-8c7d-437c-9068-f2a9709185e1","Type":"ContainerDied","Data":"1e2c55e7cc1e982c0c6586f59c782e5736028a183a4b700ff0c424646dc97d6d"} Mar 18 16:54:04 crc kubenswrapper[4939]: I0318 16:54:04.739344 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-n4cml" Mar 18 16:54:04 crc kubenswrapper[4939]: I0318 16:54:04.908389 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wbzb\" (UniqueName: \"kubernetes.io/projected/9b025d40-8c7d-437c-9068-f2a9709185e1-kube-api-access-4wbzb\") pod \"9b025d40-8c7d-437c-9068-f2a9709185e1\" (UID: \"9b025d40-8c7d-437c-9068-f2a9709185e1\") " Mar 18 16:54:04 crc kubenswrapper[4939]: I0318 16:54:04.915701 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b025d40-8c7d-437c-9068-f2a9709185e1-kube-api-access-4wbzb" (OuterVolumeSpecName: "kube-api-access-4wbzb") pod "9b025d40-8c7d-437c-9068-f2a9709185e1" (UID: "9b025d40-8c7d-437c-9068-f2a9709185e1"). InnerVolumeSpecName "kube-api-access-4wbzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:54:05 crc kubenswrapper[4939]: I0318 16:54:05.010563 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wbzb\" (UniqueName: \"kubernetes.io/projected/9b025d40-8c7d-437c-9068-f2a9709185e1-kube-api-access-4wbzb\") on node \"crc\" DevicePath \"\"" Mar 18 16:54:05 crc kubenswrapper[4939]: I0318 16:54:05.481987 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564214-n4cml" event={"ID":"9b025d40-8c7d-437c-9068-f2a9709185e1","Type":"ContainerDied","Data":"b43cb998fdebc00805d65faea8bb1dc2425fc94b179ba7ae4bd6247c4ea95e0b"} Mar 18 16:54:05 crc kubenswrapper[4939]: I0318 16:54:05.482363 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b43cb998fdebc00805d65faea8bb1dc2425fc94b179ba7ae4bd6247c4ea95e0b" Mar 18 16:54:05 crc kubenswrapper[4939]: I0318 16:54:05.482046 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-n4cml" Mar 18 16:54:05 crc kubenswrapper[4939]: I0318 16:54:05.810969 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-bmdlz"] Mar 18 16:54:05 crc kubenswrapper[4939]: I0318 16:54:05.817825 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-bmdlz"] Mar 18 16:54:06 crc kubenswrapper[4939]: I0318 16:54:06.141173 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022" path="/var/lib/kubelet/pods/44dcbd2e-c2b1-4ebc-b2cf-c331c82e3022/volumes" Mar 18 16:54:16 crc kubenswrapper[4939]: I0318 16:54:16.139555 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:54:16 crc kubenswrapper[4939]: E0318 16:54:16.140574 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:54:27 crc kubenswrapper[4939]: I0318 16:54:27.133731 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:54:27 crc kubenswrapper[4939]: E0318 16:54:27.134447 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:54:38 crc kubenswrapper[4939]: I0318 16:54:38.133750 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:54:38 crc kubenswrapper[4939]: E0318 16:54:38.134320 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Mar 18 16:54:52 crc kubenswrapper[4939]: I0318 16:54:52.135476 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be"
Mar 18 16:54:52 crc kubenswrapper[4939]: E0318 16:54:52.138207 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:54:59 crc kubenswrapper[4939]: I0318 16:54:59.720607 4939 scope.go:117] "RemoveContainer" containerID="61ac5428e779bcc638d6d867bab1a375dde19fc40407e3db24b707bf62f08f9b"
Mar 18 16:55:06 crc kubenswrapper[4939]: I0318 16:55:06.139790 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be"
Mar 18 16:55:06 crc kubenswrapper[4939]: E0318 16:55:06.141045 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:55:18 crc kubenswrapper[4939]: I0318 16:55:18.133382 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be"
Mar 18 16:55:18 crc kubenswrapper[4939]: E0318 16:55:18.134126 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.486989 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2r4tz"]
Mar 18 16:55:20 crc kubenswrapper[4939]: E0318 16:55:20.487350 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b025d40-8c7d-437c-9068-f2a9709185e1" containerName="oc"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.487365 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b025d40-8c7d-437c-9068-f2a9709185e1" containerName="oc"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.487545 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b025d40-8c7d-437c-9068-f2a9709185e1" containerName="oc"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.488725 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.499933 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2r4tz"]
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.593708 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkq5p\" (UniqueName: \"kubernetes.io/projected/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-kube-api-access-vkq5p\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.594126 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-catalog-content\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.594248 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-utilities\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.695267 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-catalog-content\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.695359 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-utilities\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.695392 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkq5p\" (UniqueName: \"kubernetes.io/projected/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-kube-api-access-vkq5p\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.695755 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-catalog-content\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.695791 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-utilities\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:20 crc kubenswrapper[4939]: I0318 16:55:20.972709 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkq5p\" (UniqueName: \"kubernetes.io/projected/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-kube-api-access-vkq5p\") pod \"certified-operators-2r4tz\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") " pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:21 crc kubenswrapper[4939]: I0318 16:55:21.106224 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:21 crc kubenswrapper[4939]: I0318 16:55:21.527441 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2r4tz"]
Mar 18 16:55:21 crc kubenswrapper[4939]: W0318 16:55:21.550856 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab8d2b65_6b99_4f6b_af6d_c94b9c392be5.slice/crio-7b75a7ee4293b5341d7198c99ce94d3d23e0ceb03775bb24cf7c373a496677c1 WatchSource:0}: Error finding container 7b75a7ee4293b5341d7198c99ce94d3d23e0ceb03775bb24cf7c373a496677c1: Status 404 returned error can't find the container with id 7b75a7ee4293b5341d7198c99ce94d3d23e0ceb03775bb24cf7c373a496677c1
Mar 18 16:55:22 crc kubenswrapper[4939]: I0318 16:55:22.086936 4939 generic.go:334] "Generic (PLEG): container finished" podID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerID="85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a" exitCode=0
Mar 18 16:55:22 crc kubenswrapper[4939]: I0318 16:55:22.087029 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r4tz" event={"ID":"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5","Type":"ContainerDied","Data":"85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a"}
Mar 18 16:55:22 crc kubenswrapper[4939]: I0318 16:55:22.087220 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r4tz" event={"ID":"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5","Type":"ContainerStarted","Data":"7b75a7ee4293b5341d7198c99ce94d3d23e0ceb03775bb24cf7c373a496677c1"}
Mar 18 16:55:22 crc kubenswrapper[4939]: I0318 16:55:22.090120 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:55:24 crc kubenswrapper[4939]: I0318 16:55:24.104493 4939 generic.go:334] "Generic (PLEG): container finished" podID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerID="59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50" exitCode=0
Mar 18 16:55:24 crc kubenswrapper[4939]: I0318 16:55:24.104593 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r4tz" event={"ID":"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5","Type":"ContainerDied","Data":"59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50"}
Mar 18 16:55:25 crc kubenswrapper[4939]: I0318 16:55:25.114662 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r4tz" event={"ID":"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5","Type":"ContainerStarted","Data":"de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90"}
Mar 18 16:55:25 crc kubenswrapper[4939]: I0318 16:55:25.140875 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2r4tz" podStartSLOduration=2.678596106 podStartE2EDuration="5.140854656s" podCreationTimestamp="2026-03-18 16:55:20 +0000 UTC" firstStartedPulling="2026-03-18 16:55:22.089756623 +0000 UTC m=+4686.688944274" lastFinishedPulling="2026-03-18 16:55:24.552015203 +0000 UTC m=+4689.151202824" observedRunningTime="2026-03-18 16:55:25.138447498 +0000 UTC m=+4689.737635129" watchObservedRunningTime="2026-03-18 16:55:25.140854656 +0000 UTC m=+4689.740042277"
Mar 18 16:55:31 crc kubenswrapper[4939]: I0318 16:55:31.107459 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:31 crc kubenswrapper[4939]: I0318 16:55:31.107987 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:31 crc kubenswrapper[4939]: I0318 16:55:31.168702 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:31 crc kubenswrapper[4939]: I0318 16:55:31.715871 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:31 crc kubenswrapper[4939]: I0318 16:55:31.757045 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2r4tz"]
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.133465 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be"
Mar 18 16:55:33 crc kubenswrapper[4939]: E0318 16:55:33.134028 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.176817 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2r4tz" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="registry-server" containerID="cri-o://de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90" gracePeriod=2
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.556262 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2r4tz"
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.691423 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-utilities\") pod \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") "
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.691775 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkq5p\" (UniqueName: \"kubernetes.io/projected/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-kube-api-access-vkq5p\") pod \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") "
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.691932 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-catalog-content\") pod \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\" (UID: \"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5\") "
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.692399 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-utilities" (OuterVolumeSpecName: "utilities") pod "ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" (UID: "ab8d2b65-6b99-4f6b-af6d-c94b9c392be5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.696637 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-kube-api-access-vkq5p" (OuterVolumeSpecName: "kube-api-access-vkq5p") pod "ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" (UID: "ab8d2b65-6b99-4f6b-af6d-c94b9c392be5"). InnerVolumeSpecName "kube-api-access-vkq5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.793601 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:55:33 crc kubenswrapper[4939]: I0318 16:55:33.793642 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkq5p\" (UniqueName: \"kubernetes.io/projected/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-kube-api-access-vkq5p\") on node \"crc\" DevicePath \"\""
Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.187007 4939 generic.go:334] "Generic (PLEG): container finished" podID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerID="de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90" exitCode=0
Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.187065 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r4tz" event={"ID":"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5","Type":"ContainerDied","Data":"de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90"}
Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.187134 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2r4tz" event={"ID":"ab8d2b65-6b99-4f6b-af6d-c94b9c392be5","Type":"ContainerDied","Data":"7b75a7ee4293b5341d7198c99ce94d3d23e0ceb03775bb24cf7c373a496677c1"}
Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.187157 4939 scope.go:117] "RemoveContainer" containerID="de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90"
Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.187185 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2r4tz" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.208487 4939 scope.go:117] "RemoveContainer" containerID="59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.227489 4939 scope.go:117] "RemoveContainer" containerID="85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.248966 4939 scope.go:117] "RemoveContainer" containerID="de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90" Mar 18 16:55:34 crc kubenswrapper[4939]: E0318 16:55:34.249479 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90\": container with ID starting with de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90 not found: ID does not exist" containerID="de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.249523 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90"} err="failed to get container status \"de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90\": rpc error: code = NotFound desc = could not find container \"de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90\": container with ID starting with de4d44b4984a2cde568735c76e060a94793822c8be126424dd252c546a0a6a90 not found: ID does not exist" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.249544 4939 scope.go:117] "RemoveContainer" containerID="59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50" Mar 18 16:55:34 crc kubenswrapper[4939]: E0318 16:55:34.250130 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50\": container with ID starting with 59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50 not found: ID does not exist" containerID="59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.250166 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50"} err="failed to get container status \"59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50\": rpc error: code = NotFound desc = could not find container \"59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50\": container with ID starting with 59e6977bd568bcc318966f589a598a0b49934da25d9d27eaf528edeec2a98e50 not found: ID does not exist" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.250181 4939 scope.go:117] "RemoveContainer" containerID="85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a" Mar 18 16:55:34 crc kubenswrapper[4939]: E0318 16:55:34.250394 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a\": container with ID starting with 85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a not found: ID does not exist" containerID="85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a" 
Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.250412 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a"} err="failed to get container status \"85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a\": rpc error: code = NotFound desc = could not find container \"85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a\": container with ID starting with 85cc9c3ae4d676f41f651a6db2abeb21db129a7b92d3e7631805fc0b3ae3238a not found: ID does not exist" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.540468 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" (UID: "ab8d2b65-6b99-4f6b-af6d-c94b9c392be5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.602793 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.839035 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2r4tz"] Mar 18 16:55:34 crc kubenswrapper[4939]: I0318 16:55:34.847485 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2r4tz"] Mar 18 16:55:36 crc kubenswrapper[4939]: I0318 16:55:36.141085 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" path="/var/lib/kubelet/pods/ab8d2b65-6b99-4f6b-af6d-c94b9c392be5/volumes" Mar 18 16:55:47 crc kubenswrapper[4939]: I0318 16:55:47.133582 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:55:47 crc kubenswrapper[4939]: E0318 16:55:47.134165 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.146389 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564216-5pfct"] Mar 18 16:56:00 crc kubenswrapper[4939]: E0318 16:56:00.147213 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="registry-server" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.147228 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="registry-server" Mar 18 16:56:00 crc kubenswrapper[4939]: E0318 16:56:00.147247 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="extract-content" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.147257 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="extract-content" Mar 18 
16:56:00 crc kubenswrapper[4939]: E0318 16:56:00.147275 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="extract-utilities" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.147283 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="extract-utilities" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.147458 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab8d2b65-6b99-4f6b-af6d-c94b9c392be5" containerName="registry-server" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.147996 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-5pfct" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.153070 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.153441 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.153552 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.155863 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-5pfct"] Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.175650 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwms6\" (UniqueName: \"kubernetes.io/projected/d9a1a929-ec59-4086-aa65-2463bb2737f6-kube-api-access-jwms6\") pod \"auto-csr-approver-29564216-5pfct\" (UID: \"d9a1a929-ec59-4086-aa65-2463bb2737f6\") " pod="openshift-infra/auto-csr-approver-29564216-5pfct" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.278546 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwms6\" (UniqueName: \"kubernetes.io/projected/d9a1a929-ec59-4086-aa65-2463bb2737f6-kube-api-access-jwms6\") pod \"auto-csr-approver-29564216-5pfct\" (UID: \"d9a1a929-ec59-4086-aa65-2463bb2737f6\") " pod="openshift-infra/auto-csr-approver-29564216-5pfct" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.317452 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwms6\" (UniqueName: \"kubernetes.io/projected/d9a1a929-ec59-4086-aa65-2463bb2737f6-kube-api-access-jwms6\") pod \"auto-csr-approver-29564216-5pfct\" (UID: \"d9a1a929-ec59-4086-aa65-2463bb2737f6\") " pod="openshift-infra/auto-csr-approver-29564216-5pfct" Mar 18 16:56:00 crc kubenswrapper[4939]: I0318 16:56:00.486520 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-5pfct" Mar 18 16:56:01 crc kubenswrapper[4939]: I0318 16:56:01.133277 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:56:01 crc kubenswrapper[4939]: E0318 16:56:01.133862 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:56:01 crc kubenswrapper[4939]: I0318 16:56:01.166797 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-5pfct"] Mar 18 16:56:01 crc kubenswrapper[4939]: I0318 16:56:01.371640 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564216-5pfct" event={"ID":"d9a1a929-ec59-4086-aa65-2463bb2737f6","Type":"ContainerStarted","Data":"372b0faa47bff25b250e8c41677b7f95e374d34d4d60674eccee6b06ef958b52"} Mar 18 16:56:03 crc kubenswrapper[4939]: I0318 16:56:03.389055 4939 generic.go:334] "Generic (PLEG): container finished" podID="d9a1a929-ec59-4086-aa65-2463bb2737f6" containerID="233f294246630890dae0d42b33a0b5295fa58cf9e1ced9dc22bdb392c649c8fe" exitCode=0 Mar 18 16:56:03 crc kubenswrapper[4939]: I0318 16:56:03.389163 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564216-5pfct" event={"ID":"d9a1a929-ec59-4086-aa65-2463bb2737f6","Type":"ContainerDied","Data":"233f294246630890dae0d42b33a0b5295fa58cf9e1ced9dc22bdb392c649c8fe"} Mar 18 16:56:04 crc kubenswrapper[4939]: I0318 16:56:04.720038 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-5pfct" Mar 18 16:56:04 crc kubenswrapper[4939]: I0318 16:56:04.853397 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwms6\" (UniqueName: \"kubernetes.io/projected/d9a1a929-ec59-4086-aa65-2463bb2737f6-kube-api-access-jwms6\") pod \"d9a1a929-ec59-4086-aa65-2463bb2737f6\" (UID: \"d9a1a929-ec59-4086-aa65-2463bb2737f6\") " Mar 18 16:56:04 crc kubenswrapper[4939]: I0318 16:56:04.859144 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a1a929-ec59-4086-aa65-2463bb2737f6-kube-api-access-jwms6" (OuterVolumeSpecName: "kube-api-access-jwms6") pod "d9a1a929-ec59-4086-aa65-2463bb2737f6" (UID: "d9a1a929-ec59-4086-aa65-2463bb2737f6"). InnerVolumeSpecName "kube-api-access-jwms6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:56:04 crc kubenswrapper[4939]: I0318 16:56:04.955899 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwms6\" (UniqueName: \"kubernetes.io/projected/d9a1a929-ec59-4086-aa65-2463bb2737f6-kube-api-access-jwms6\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:05 crc kubenswrapper[4939]: I0318 16:56:05.408462 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564216-5pfct" event={"ID":"d9a1a929-ec59-4086-aa65-2463bb2737f6","Type":"ContainerDied","Data":"372b0faa47bff25b250e8c41677b7f95e374d34d4d60674eccee6b06ef958b52"} Mar 18 16:56:05 crc kubenswrapper[4939]: I0318 16:56:05.408554 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372b0faa47bff25b250e8c41677b7f95e374d34d4d60674eccee6b06ef958b52" Mar 18 16:56:05 crc kubenswrapper[4939]: I0318 16:56:05.408580 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-5pfct" Mar 18 16:56:05 crc kubenswrapper[4939]: I0318 16:56:05.802097 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-8dmdc"] Mar 18 16:56:05 crc kubenswrapper[4939]: I0318 16:56:05.808733 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-8dmdc"] Mar 18 16:56:06 crc kubenswrapper[4939]: I0318 16:56:06.143861 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a12717b-5406-4aaa-989a-1de8b02fc9c7" path="/var/lib/kubelet/pods/5a12717b-5406-4aaa-989a-1de8b02fc9c7/volumes" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.063748 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-xgz5n"] Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.070759 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-xgz5n"] Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.192214 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-5lfd2"] Mar 18 16:56:13 crc kubenswrapper[4939]: E0318 16:56:13.192612 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a1a929-ec59-4086-aa65-2463bb2737f6" containerName="oc" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.192637 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a1a929-ec59-4086-aa65-2463bb2737f6" containerName="oc" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.192820 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a1a929-ec59-4086-aa65-2463bb2737f6" containerName="oc" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.193423 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.195238 4939 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-44r9v" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.196524 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.198919 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.199608 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.201012 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5lfd2"] Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.373279 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/12506055-cd66-44fa-8da6-fc509ba8027c-node-mnt\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.373345 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/12506055-cd66-44fa-8da6-fc509ba8027c-crc-storage\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.373372 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2bx2\" (UniqueName: \"kubernetes.io/projected/12506055-cd66-44fa-8da6-fc509ba8027c-kube-api-access-d2bx2\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.474513 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/12506055-cd66-44fa-8da6-fc509ba8027c-node-mnt\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.474591 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/12506055-cd66-44fa-8da6-fc509ba8027c-crc-storage\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.474621 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2bx2\" (UniqueName: \"kubernetes.io/projected/12506055-cd66-44fa-8da6-fc509ba8027c-kube-api-access-d2bx2\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.474838 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/12506055-cd66-44fa-8da6-fc509ba8027c-node-mnt\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " 
pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.475834 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/12506055-cd66-44fa-8da6-fc509ba8027c-crc-storage\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.498057 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2bx2\" (UniqueName: \"kubernetes.io/projected/12506055-cd66-44fa-8da6-fc509ba8027c-kube-api-access-d2bx2\") pod \"crc-storage-crc-5lfd2\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.510710 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:13 crc kubenswrapper[4939]: I0318 16:56:13.744818 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5lfd2"] Mar 18 16:56:14 crc kubenswrapper[4939]: I0318 16:56:14.142943 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af2ff35-a774-4708-b902-00fc5dad9c9e" path="/var/lib/kubelet/pods/4af2ff35-a774-4708-b902-00fc5dad9c9e/volumes" Mar 18 16:56:14 crc kubenswrapper[4939]: I0318 16:56:14.471025 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5lfd2" event={"ID":"12506055-cd66-44fa-8da6-fc509ba8027c","Type":"ContainerStarted","Data":"d97929909fcc0f0dc326092da44637dbca63cb24eb873afd807a43c166fa87eb"} Mar 18 16:56:15 crc kubenswrapper[4939]: I0318 16:56:15.133919 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:56:15 crc kubenswrapper[4939]: E0318 16:56:15.134170 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:56:16 crc kubenswrapper[4939]: I0318 16:56:16.487305 4939 generic.go:334] "Generic (PLEG): container finished" podID="12506055-cd66-44fa-8da6-fc509ba8027c" containerID="5d6464871921ab0029547e84e26c17e0bd6ade9dd7a11453edb46747ba7a8e76" exitCode=0 Mar 18 16:56:16 crc kubenswrapper[4939]: I0318 16:56:16.487390 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5lfd2" event={"ID":"12506055-cd66-44fa-8da6-fc509ba8027c","Type":"ContainerDied","Data":"5d6464871921ab0029547e84e26c17e0bd6ade9dd7a11453edb46747ba7a8e76"} Mar 18 16:56:17 crc kubenswrapper[4939]: I0318 16:56:17.835369 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.036436 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/12506055-cd66-44fa-8da6-fc509ba8027c-crc-storage\") pod \"12506055-cd66-44fa-8da6-fc509ba8027c\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.036593 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/12506055-cd66-44fa-8da6-fc509ba8027c-node-mnt\") pod \"12506055-cd66-44fa-8da6-fc509ba8027c\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.036653 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2bx2\" (UniqueName: \"kubernetes.io/projected/12506055-cd66-44fa-8da6-fc509ba8027c-kube-api-access-d2bx2\") pod \"12506055-cd66-44fa-8da6-fc509ba8027c\" (UID: \"12506055-cd66-44fa-8da6-fc509ba8027c\") " Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.036729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12506055-cd66-44fa-8da6-fc509ba8027c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "12506055-cd66-44fa-8da6-fc509ba8027c" (UID: "12506055-cd66-44fa-8da6-fc509ba8027c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.036980 4939 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/12506055-cd66-44fa-8da6-fc509ba8027c-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.041675 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12506055-cd66-44fa-8da6-fc509ba8027c-kube-api-access-d2bx2" (OuterVolumeSpecName: "kube-api-access-d2bx2") pod "12506055-cd66-44fa-8da6-fc509ba8027c" (UID: "12506055-cd66-44fa-8da6-fc509ba8027c"). InnerVolumeSpecName "kube-api-access-d2bx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.055174 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12506055-cd66-44fa-8da6-fc509ba8027c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "12506055-cd66-44fa-8da6-fc509ba8027c" (UID: "12506055-cd66-44fa-8da6-fc509ba8027c"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.137869 4939 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/12506055-cd66-44fa-8da6-fc509ba8027c-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.137912 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2bx2\" (UniqueName: \"kubernetes.io/projected/12506055-cd66-44fa-8da6-fc509ba8027c-kube-api-access-d2bx2\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.501689 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5lfd2" event={"ID":"12506055-cd66-44fa-8da6-fc509ba8027c","Type":"ContainerDied","Data":"d97929909fcc0f0dc326092da44637dbca63cb24eb873afd807a43c166fa87eb"} Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.501742 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d97929909fcc0f0dc326092da44637dbca63cb24eb873afd807a43c166fa87eb" Mar 18 16:56:18 crc kubenswrapper[4939]: I0318 16:56:18.501707 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5lfd2" Mar 18 16:56:19 crc kubenswrapper[4939]: I0318 16:56:19.943138 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-5lfd2"] Mar 18 16:56:19 crc kubenswrapper[4939]: I0318 16:56:19.949455 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-5lfd2"] Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.099463 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-dzk9j"] Mar 18 16:56:20 crc kubenswrapper[4939]: E0318 16:56:20.099834 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12506055-cd66-44fa-8da6-fc509ba8027c" containerName="storage" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.099855 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="12506055-cd66-44fa-8da6-fc509ba8027c" containerName="storage" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.100027 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="12506055-cd66-44fa-8da6-fc509ba8027c" containerName="storage" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.100542 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.103773 4939 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-44r9v" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.104296 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.104555 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.110068 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dzk9j"] Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.116908 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.143153 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12506055-cd66-44fa-8da6-fc509ba8027c" path="/var/lib/kubelet/pods/12506055-cd66-44fa-8da6-fc509ba8027c/volumes" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.162879 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-node-mnt\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.162996 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n95q6\" (UniqueName: \"kubernetes.io/projected/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-kube-api-access-n95q6\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.163119 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-crc-storage\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.264221 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-node-mnt\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.264308 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n95q6\" (UniqueName: \"kubernetes.io/projected/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-kube-api-access-n95q6\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.264348 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-crc-storage\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.264635 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-node-mnt\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.265159 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-crc-storage\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.286042 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n95q6\" (UniqueName: \"kubernetes.io/projected/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-kube-api-access-n95q6\") pod \"crc-storage-crc-dzk9j\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.418333 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:20 crc kubenswrapper[4939]: I0318 16:56:20.832100 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dzk9j"] Mar 18 16:56:21 crc kubenswrapper[4939]: I0318 16:56:21.526083 4939 generic.go:334] "Generic (PLEG): container finished" podID="93afdd2a-fc61-4eaa-92ca-8dc224a9a52f" containerID="bb722a27c2eb8774524b6cde4c4da7e76216e8349716b1aab3c435eae1e71450" exitCode=0 Mar 18 16:56:21 crc kubenswrapper[4939]: I0318 16:56:21.526365 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dzk9j" event={"ID":"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f","Type":"ContainerDied","Data":"bb722a27c2eb8774524b6cde4c4da7e76216e8349716b1aab3c435eae1e71450"} Mar 18 16:56:21 crc kubenswrapper[4939]: I0318 16:56:21.526426 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dzk9j" event={"ID":"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f","Type":"ContainerStarted","Data":"5636fe0cf06ad271b2bc7081366e91a93dba644c9ce7f8cea3c0ec337761732f"} Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.810003 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.914948 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n95q6\" (UniqueName: \"kubernetes.io/projected/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-kube-api-access-n95q6\") pod \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.914990 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-node-mnt\") pod \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.915016 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-crc-storage\") pod \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\" (UID: \"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f\") " Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.915099 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "93afdd2a-fc61-4eaa-92ca-8dc224a9a52f" (UID: "93afdd2a-fc61-4eaa-92ca-8dc224a9a52f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.915413 4939 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.919934 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-kube-api-access-n95q6" (OuterVolumeSpecName: "kube-api-access-n95q6") pod "93afdd2a-fc61-4eaa-92ca-8dc224a9a52f" (UID: "93afdd2a-fc61-4eaa-92ca-8dc224a9a52f"). InnerVolumeSpecName "kube-api-access-n95q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:56:22 crc kubenswrapper[4939]: I0318 16:56:22.931818 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "93afdd2a-fc61-4eaa-92ca-8dc224a9a52f" (UID: "93afdd2a-fc61-4eaa-92ca-8dc224a9a52f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:56:23 crc kubenswrapper[4939]: I0318 16:56:23.016868 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n95q6\" (UniqueName: \"kubernetes.io/projected/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-kube-api-access-n95q6\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:23 crc kubenswrapper[4939]: I0318 16:56:23.016901 4939 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/93afdd2a-fc61-4eaa-92ca-8dc224a9a52f-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:23 crc kubenswrapper[4939]: I0318 16:56:23.544097 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dzk9j" event={"ID":"93afdd2a-fc61-4eaa-92ca-8dc224a9a52f","Type":"ContainerDied","Data":"5636fe0cf06ad271b2bc7081366e91a93dba644c9ce7f8cea3c0ec337761732f"} Mar 18 16:56:23 crc kubenswrapper[4939]: I0318 16:56:23.544142 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5636fe0cf06ad271b2bc7081366e91a93dba644c9ce7f8cea3c0ec337761732f" Mar 18 16:56:23 crc kubenswrapper[4939]: I0318 16:56:23.544197 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dzk9j" Mar 18 16:56:30 crc kubenswrapper[4939]: I0318 16:56:30.133821 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:56:30 crc kubenswrapper[4939]: E0318 16:56:30.134620 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:56:42 crc kubenswrapper[4939]: I0318 16:56:42.135267 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:56:42 crc kubenswrapper[4939]: E0318 16:56:42.136121 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.027430 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9hr4c"] Mar 18 16:56:52 crc kubenswrapper[4939]: E0318 16:56:52.028457 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93afdd2a-fc61-4eaa-92ca-8dc224a9a52f" containerName="storage" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.028477 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="93afdd2a-fc61-4eaa-92ca-8dc224a9a52f" containerName="storage" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.028766 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="93afdd2a-fc61-4eaa-92ca-8dc224a9a52f" containerName="storage" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.030339 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.037399 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hr4c"] Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.146754 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrcdr\" (UniqueName: \"kubernetes.io/projected/3f3dd01d-2ea4-4e96-b809-93540b14853d-kube-api-access-vrcdr\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.146796 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-utilities\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.146828 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-catalog-content\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.247972 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrcdr\" (UniqueName: \"kubernetes.io/projected/3f3dd01d-2ea4-4e96-b809-93540b14853d-kube-api-access-vrcdr\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.248013 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-utilities\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.248049 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-catalog-content\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.248550 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-catalog-content\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.248675 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-utilities\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.278489 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vrcdr\" (UniqueName: \"kubernetes.io/projected/3f3dd01d-2ea4-4e96-b809-93540b14853d-kube-api-access-vrcdr\") pod \"community-operators-9hr4c\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.350206 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:56:52 crc kubenswrapper[4939]: I0318 16:56:52.799990 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hr4c"] Mar 18 16:56:53 crc kubenswrapper[4939]: I0318 16:56:53.133283 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:56:53 crc kubenswrapper[4939]: E0318 16:56:53.133922 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:56:53 crc kubenswrapper[4939]: I0318 16:56:53.768996 4939 generic.go:334] "Generic (PLEG): container finished" podID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerID="257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641" exitCode=0 Mar 18 16:56:53 crc kubenswrapper[4939]: I0318 16:56:53.769047 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hr4c" event={"ID":"3f3dd01d-2ea4-4e96-b809-93540b14853d","Type":"ContainerDied","Data":"257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641"} Mar 18 16:56:53 crc kubenswrapper[4939]: I0318 16:56:53.769078 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hr4c" event={"ID":"3f3dd01d-2ea4-4e96-b809-93540b14853d","Type":"ContainerStarted","Data":"ee914d8d34881c1a1d1d02bfaf0512f3f12b9a72567d991f0ad3190418fce0c7"} Mar 18 16:56:54 crc kubenswrapper[4939]: I0318 16:56:54.795695 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hr4c" event={"ID":"3f3dd01d-2ea4-4e96-b809-93540b14853d","Type":"ContainerStarted","Data":"c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af"} Mar 18 16:56:55 crc kubenswrapper[4939]: I0318 16:56:55.808177 4939 generic.go:334] "Generic (PLEG): container finished" podID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerID="c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af" exitCode=0 Mar 18 16:56:55 crc kubenswrapper[4939]: I0318 16:56:55.808481 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hr4c" event={"ID":"3f3dd01d-2ea4-4e96-b809-93540b14853d","Type":"ContainerDied","Data":"c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af"} Mar 18 16:56:57 crc kubenswrapper[4939]: I0318 16:56:57.829927 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hr4c" event={"ID":"3f3dd01d-2ea4-4e96-b809-93540b14853d","Type":"ContainerStarted","Data":"32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b"} Mar 18 16:56:57 crc kubenswrapper[4939]: I0318 16:56:57.850311 4939 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9hr4c" podStartSLOduration=3.369404891 podStartE2EDuration="5.850289169s" podCreationTimestamp="2026-03-18 16:56:52 +0000 UTC" firstStartedPulling="2026-03-18 16:56:53.770528607 +0000 UTC m=+4778.369716228" lastFinishedPulling="2026-03-18 16:56:56.251412885 +0000 UTC m=+4780.850600506" observedRunningTime="2026-03-18 16:56:57.849196788 +0000 UTC m=+4782.448384419" watchObservedRunningTime="2026-03-18 16:56:57.850289169 +0000 UTC m=+4782.449476790" Mar 18 16:56:59 crc kubenswrapper[4939]: I0318 16:56:59.819133 4939 scope.go:117] "RemoveContainer" containerID="1bf5f1e0a8e3cdd552733f46ed0538c55bd5977361fa8954d8df52e1800b7f9b" Mar 18 16:56:59 crc kubenswrapper[4939]: I0318 16:56:59.842461 4939 scope.go:117] "RemoveContainer" containerID="b2f74d03993074196d50541769f03729c3d00b1010e795e87abde64a8db9448e" Mar 18 16:57:02 crc kubenswrapper[4939]: I0318 16:57:02.350530 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:57:02 crc kubenswrapper[4939]: I0318 16:57:02.350860 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:57:02 crc kubenswrapper[4939]: I0318 16:57:02.409690 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:57:02 crc kubenswrapper[4939]: I0318 16:57:02.906919 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:57:02 crc kubenswrapper[4939]: I0318 16:57:02.963739 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hr4c"] Mar 18 16:57:04 crc kubenswrapper[4939]: I0318 16:57:04.952003 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9hr4c" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="registry-server" containerID="cri-o://32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b" gracePeriod=2 Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.624086 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.764450 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-utilities\") pod \"3f3dd01d-2ea4-4e96-b809-93540b14853d\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.764622 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-catalog-content\") pod \"3f3dd01d-2ea4-4e96-b809-93540b14853d\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.764702 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrcdr\" (UniqueName: \"kubernetes.io/projected/3f3dd01d-2ea4-4e96-b809-93540b14853d-kube-api-access-vrcdr\") pod \"3f3dd01d-2ea4-4e96-b809-93540b14853d\" (UID: \"3f3dd01d-2ea4-4e96-b809-93540b14853d\") " Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.765286 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-utilities" (OuterVolumeSpecName: "utilities") pod "3f3dd01d-2ea4-4e96-b809-93540b14853d" (UID: "3f3dd01d-2ea4-4e96-b809-93540b14853d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.771040 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3dd01d-2ea4-4e96-b809-93540b14853d-kube-api-access-vrcdr" (OuterVolumeSpecName: "kube-api-access-vrcdr") pod "3f3dd01d-2ea4-4e96-b809-93540b14853d" (UID: "3f3dd01d-2ea4-4e96-b809-93540b14853d"). InnerVolumeSpecName "kube-api-access-vrcdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.817956 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f3dd01d-2ea4-4e96-b809-93540b14853d" (UID: "3f3dd01d-2ea4-4e96-b809-93540b14853d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.866648 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.866700 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrcdr\" (UniqueName: \"kubernetes.io/projected/3f3dd01d-2ea4-4e96-b809-93540b14853d-kube-api-access-vrcdr\") on node \"crc\" DevicePath \"\"" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.866728 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3dd01d-2ea4-4e96-b809-93540b14853d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.961466 4939 generic.go:334] "Generic (PLEG): container finished" podID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerID="32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b" exitCode=0 Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.961552 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hr4c" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.961537 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hr4c" event={"ID":"3f3dd01d-2ea4-4e96-b809-93540b14853d","Type":"ContainerDied","Data":"32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b"} Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.961680 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hr4c" event={"ID":"3f3dd01d-2ea4-4e96-b809-93540b14853d","Type":"ContainerDied","Data":"ee914d8d34881c1a1d1d02bfaf0512f3f12b9a72567d991f0ad3190418fce0c7"} Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.961699 4939 scope.go:117] "RemoveContainer" containerID="32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.980655 4939 scope.go:117] "RemoveContainer" containerID="c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af" Mar 18 16:57:05 crc kubenswrapper[4939]: I0318 16:57:05.997849 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hr4c"] Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.008331 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9hr4c"] Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.018400 4939 scope.go:117] "RemoveContainer" containerID="257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641" Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.033517 4939 scope.go:117] "RemoveContainer" containerID="32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b" Mar 18 16:57:06 crc kubenswrapper[4939]: E0318 16:57:06.033957 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b\": container with ID starting with 32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b not found: ID does not exist" containerID="32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b" Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.034020 
4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b"} err="failed to get container status \"32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b\": rpc error: code = NotFound desc = could not find container \"32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b\": container with ID starting with 32570a91151d7e34b683aa4cfa1e5a3ef9082f5b0467c2bffc2e5d0ba2fe444b not found: ID does not exist" Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.034053 4939 scope.go:117] "RemoveContainer" containerID="c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af" Mar 18 16:57:06 crc kubenswrapper[4939]: E0318 16:57:06.034459 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af\": container with ID starting with c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af not found: ID does not exist" containerID="c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af" Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.034533 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af"} err="failed to get container status \"c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af\": rpc error: code = NotFound desc = could not find container \"c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af\": container with ID starting with c3994cca99634cb1ec56a6dab03c95dafd9dc21984f67573a2df63fd0599d1af not found: ID does not exist" Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.034566 4939 scope.go:117] "RemoveContainer" containerID="257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641" Mar 18 16:57:06 crc kubenswrapper[4939]: E0318 16:57:06.034919 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641\": container with ID starting with 257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641 not found: ID does not exist" containerID="257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641" Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.034941 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641"} err="failed to get container status \"257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641\": rpc error: code = NotFound desc = could not find container \"257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641\": container with ID starting with 257d7233271178a008d3e0175ab51cf6a254dea5d6369fed0525cf3beb449641 not found: ID does not exist" Mar 18 16:57:06 crc kubenswrapper[4939]: I0318 16:57:06.141382 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" path="/var/lib/kubelet/pods/3f3dd01d-2ea4-4e96-b809-93540b14853d/volumes" Mar 18 16:57:08 crc kubenswrapper[4939]: I0318 16:57:08.133563 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:57:08 crc kubenswrapper[4939]: E0318 16:57:08.135094 4939 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:57:20 crc kubenswrapper[4939]: I0318 16:57:20.133696 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:57:20 crc kubenswrapper[4939]: E0318 16:57:20.134770 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:57:32 crc kubenswrapper[4939]: I0318 16:57:32.133183 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:57:32 crc kubenswrapper[4939]: E0318 16:57:32.134002 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:57:46 crc kubenswrapper[4939]: I0318 16:57:46.140382 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:57:46 crc kubenswrapper[4939]: E0318 16:57:46.141339 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.135874 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.163583 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564218-xsnw5"] Mar 18 16:58:00 crc kubenswrapper[4939]: E0318 16:58:00.163898 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="extract-utilities" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.163915 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="extract-utilities" Mar 18 16:58:00 crc kubenswrapper[4939]: E0318 16:58:00.163928 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="extract-content" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.163935 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="extract-content" Mar 18 16:58:00 
crc kubenswrapper[4939]: E0318 16:58:00.163947 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.163952 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.164108 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3dd01d-2ea4-4e96-b809-93540b14853d" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.164698 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-xsnw5" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.168196 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.168620 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.169462 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.176268 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-xsnw5"] Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.340730 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5dck\" (UniqueName: \"kubernetes.io/projected/e13c38bf-7e9d-4280-964b-0ca16f37c246-kube-api-access-q5dck\") pod \"auto-csr-approver-29564218-xsnw5\" (UID: \"e13c38bf-7e9d-4280-964b-0ca16f37c246\") " pod="openshift-infra/auto-csr-approver-29564218-xsnw5" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.379864 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"f9378b7f2afbee1d788edac685c866cfb5d75c87de68ab0c8371f674ebdf00d5"} Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.442816 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5dck\" (UniqueName: \"kubernetes.io/projected/e13c38bf-7e9d-4280-964b-0ca16f37c246-kube-api-access-q5dck\") pod \"auto-csr-approver-29564218-xsnw5\" (UID: \"e13c38bf-7e9d-4280-964b-0ca16f37c246\") " pod="openshift-infra/auto-csr-approver-29564218-xsnw5" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.464936 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5dck\" (UniqueName: \"kubernetes.io/projected/e13c38bf-7e9d-4280-964b-0ca16f37c246-kube-api-access-q5dck\") pod \"auto-csr-approver-29564218-xsnw5\" (UID: \"e13c38bf-7e9d-4280-964b-0ca16f37c246\") " pod="openshift-infra/auto-csr-approver-29564218-xsnw5" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.492781 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-xsnw5" Mar 18 16:58:00 crc kubenswrapper[4939]: I0318 16:58:00.919557 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-xsnw5"] Mar 18 16:58:01 crc kubenswrapper[4939]: I0318 16:58:01.386001 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564218-xsnw5" event={"ID":"e13c38bf-7e9d-4280-964b-0ca16f37c246","Type":"ContainerStarted","Data":"c64ba8aa0a21c83370f9abe28bf202e9bf86b4d60e4187b7a3465b0af1a862bf"} Mar 18 16:58:03 crc kubenswrapper[4939]: I0318 16:58:03.403804 4939 generic.go:334] "Generic (PLEG): container finished" podID="e13c38bf-7e9d-4280-964b-0ca16f37c246" containerID="5c1c9d470cc2cca96fc76b77ed69681366aefd62b2ab3f48372cee0aa9be214f" exitCode=0 Mar 18 16:58:03 crc kubenswrapper[4939]: I0318 16:58:03.403909 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564218-xsnw5" event={"ID":"e13c38bf-7e9d-4280-964b-0ca16f37c246","Type":"ContainerDied","Data":"5c1c9d470cc2cca96fc76b77ed69681366aefd62b2ab3f48372cee0aa9be214f"} Mar 18 16:58:04 crc kubenswrapper[4939]: I0318 16:58:04.668418 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-xsnw5" Mar 18 16:58:04 crc kubenswrapper[4939]: I0318 16:58:04.699228 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5dck\" (UniqueName: \"kubernetes.io/projected/e13c38bf-7e9d-4280-964b-0ca16f37c246-kube-api-access-q5dck\") pod \"e13c38bf-7e9d-4280-964b-0ca16f37c246\" (UID: \"e13c38bf-7e9d-4280-964b-0ca16f37c246\") " Mar 18 16:58:04 crc kubenswrapper[4939]: I0318 16:58:04.709710 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13c38bf-7e9d-4280-964b-0ca16f37c246-kube-api-access-q5dck" (OuterVolumeSpecName: "kube-api-access-q5dck") pod "e13c38bf-7e9d-4280-964b-0ca16f37c246" (UID: "e13c38bf-7e9d-4280-964b-0ca16f37c246"). InnerVolumeSpecName "kube-api-access-q5dck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:58:04 crc kubenswrapper[4939]: I0318 16:58:04.801448 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5dck\" (UniqueName: \"kubernetes.io/projected/e13c38bf-7e9d-4280-964b-0ca16f37c246-kube-api-access-q5dck\") on node \"crc\" DevicePath \"\"" Mar 18 16:58:05 crc kubenswrapper[4939]: I0318 16:58:05.418903 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564218-xsnw5" event={"ID":"e13c38bf-7e9d-4280-964b-0ca16f37c246","Type":"ContainerDied","Data":"c64ba8aa0a21c83370f9abe28bf202e9bf86b4d60e4187b7a3465b0af1a862bf"} Mar 18 16:58:05 crc kubenswrapper[4939]: I0318 16:58:05.419207 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c64ba8aa0a21c83370f9abe28bf202e9bf86b4d60e4187b7a3465b0af1a862bf" Mar 18 16:58:05 crc kubenswrapper[4939]: I0318 16:58:05.418951 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-xsnw5" Mar 18 16:58:05 crc kubenswrapper[4939]: I0318 16:58:05.743145 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-z29g9"] Mar 18 16:58:05 crc kubenswrapper[4939]: I0318 16:58:05.748376 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-z29g9"] Mar 18 16:58:06 crc kubenswrapper[4939]: I0318 16:58:06.143001 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc634fd1-df6f-4086-95b7-d1b644c048d0" path="/var/lib/kubelet/pods/fc634fd1-df6f-4086-95b7-d1b644c048d0/volumes" Mar 18 16:58:59 crc kubenswrapper[4939]: I0318 16:58:59.939747 4939 scope.go:117] "RemoveContainer" containerID="1adf7f728affe4d8b9fda3dc805ce891fed174baab991b3882d96b94bd54e10a" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.296880 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-p5tvs"] Mar 18 16:59:36 crc kubenswrapper[4939]: E0318 16:59:36.297753 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13c38bf-7e9d-4280-964b-0ca16f37c246" containerName="oc" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.297770 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13c38bf-7e9d-4280-964b-0ca16f37c246" containerName="oc" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.297928 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13c38bf-7e9d-4280-964b-0ca16f37c246" containerName="oc" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.298814 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.302672 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.302856 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.311445 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-p5tvs"] Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.315221 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.315790 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.316973 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-dpqsn" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.388223 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q472j\" (UniqueName: \"kubernetes.io/projected/5a2757a9-9f09-42f0-aa97-114639391b72-kube-api-access-q472j\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.388292 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " 
pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.388348 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-config\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.489132 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-config\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.489231 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q472j\" (UniqueName: \"kubernetes.io/projected/5a2757a9-9f09-42f0-aa97-114639391b72-kube-api-access-q472j\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.489268 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.490867 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.490997 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-config\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.534496 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q472j\" (UniqueName: \"kubernetes.io/projected/5a2757a9-9f09-42f0-aa97-114639391b72-kube-api-access-q472j\") pod \"dnsmasq-dns-5d7b5456f5-p5tvs\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.603192 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-fxzkl"] Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.604730 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.615811 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.635573 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-fxzkl"] Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.692402 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-config\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.692465 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.692590 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58sv9\" (UniqueName: \"kubernetes.io/projected/c3d05697-55c9-4f3b-af58-79afb3af5683-kube-api-access-58sv9\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.794724 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58sv9\" (UniqueName: \"kubernetes.io/projected/c3d05697-55c9-4f3b-af58-79afb3af5683-kube-api-access-58sv9\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.794869 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-config\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.794946 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.796073 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.797565 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-config\") pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.837786 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58sv9\" (UniqueName: \"kubernetes.io/projected/c3d05697-55c9-4f3b-af58-79afb3af5683-kube-api-access-58sv9\") 
pod \"dnsmasq-dns-98ddfc8f-fxzkl\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:36 crc kubenswrapper[4939]: I0318 16:59:36.927023 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.179572 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-p5tvs"] Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.417004 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-fxzkl"] Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.438562 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.440605 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.442791 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.442819 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.442837 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fjptp" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.442967 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.444301 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.456277 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.506994 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.507030 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.507048 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0353a9a1-39a6-4993-ac6e-dce5eba1373f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.507078 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0353a9a1-39a6-4993-ac6e-dce5eba1373f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 
16:59:37.507263 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.507346 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.507438 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.507487 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc6jj\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-kube-api-access-cc6jj\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.507524 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.608888 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.608973 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.609022 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0353a9a1-39a6-4993-ac6e-dce5eba1373f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.609099 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0353a9a1-39a6-4993-ac6e-dce5eba1373f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.609188 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.609251 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.609324 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.609376 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc6jj\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-kube-api-access-cc6jj\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.609842 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.610448 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.610453 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.611888 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.612683 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.613249 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " 
pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.613389 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0353a9a1-39a6-4993-ac6e-dce5eba1373f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.613567 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.613597 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6078a2acfc90786553dde1ebd9b0d0a94de325ff855324a9a16d94502664ff27/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.620946 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0353a9a1-39a6-4993-ac6e-dce5eba1373f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.639896 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc6jj\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-kube-api-access-cc6jj\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.665606 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.787678 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.788962 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.791049 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.791334 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.791631 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5njsn" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.792820 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.793805 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.812183 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.839969 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916354 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08684f42-6dc7-4e6d-8073-98c558c159b4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916731 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916773 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916852 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4wc7\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-kube-api-access-z4wc7\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916878 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916916 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916945 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08684f42-6dc7-4e6d-8073-98c558c159b4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.916978 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:37 crc kubenswrapper[4939]: I0318 16:59:37.917018 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018172 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018236 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08684f42-6dc7-4e6d-8073-98c558c159b4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018270 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018313 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018346 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08684f42-6dc7-4e6d-8073-98c558c159b4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018373 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018406 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018466 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4wc7\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-kube-api-access-z4wc7\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.018496 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.019567 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.020215 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.021676 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.022928 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.024075 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08684f42-6dc7-4e6d-8073-98c558c159b4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.024297 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08684f42-6dc7-4e6d-8073-98c558c159b4-pod-info\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.024621 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.024649 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.024679 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f58582c94789db6d268fd2ad4cad73469dce675c14a8bd7b9504d878706640d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.035652 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4wc7\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-kube-api-access-z4wc7\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.074839 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.106876 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.120282 4939 generic.go:334] "Generic (PLEG): container finished" podID="5a2757a9-9f09-42f0-aa97-114639391b72" containerID="fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11" exitCode=0 Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.120368 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" event={"ID":"5a2757a9-9f09-42f0-aa97-114639391b72","Type":"ContainerDied","Data":"fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11"} Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.120393 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" event={"ID":"5a2757a9-9f09-42f0-aa97-114639391b72","Type":"ContainerStarted","Data":"60a8315682e266497711a88b5ad1f590dc3a5c96bdd2c66f5f37cbc6a90bf6d1"} Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.126028 4939 generic.go:334] "Generic (PLEG): container finished" podID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerID="89b08e529ab7f1fb2239ef3a522cebf6ef8f3d11aa6a7e4722a2364b7baaaecf" exitCode=0 Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.126069 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" event={"ID":"c3d05697-55c9-4f3b-af58-79afb3af5683","Type":"ContainerDied","Data":"89b08e529ab7f1fb2239ef3a522cebf6ef8f3d11aa6a7e4722a2364b7baaaecf"} Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.126094 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" event={"ID":"c3d05697-55c9-4f3b-af58-79afb3af5683","Type":"ContainerStarted","Data":"86409aa3786374bebda5ce7104bace04a938fed6614cdcc95fe7e0119af8a95e"} Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.267957 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.268900 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.274047 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.274244 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2fjcd" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.280530 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.289966 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.292289 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qhmzh" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.292695 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.292942 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.294249 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.297969 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.298471 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.304397 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.329966 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t96w7\" (UniqueName: \"kubernetes.io/projected/c2e610ad-5be1-4333-b786-4883d87fedaf-kube-api-access-t96w7\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.330039 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e610ad-5be1-4333-b786-4883d87fedaf-config-data\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.330107 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2e610ad-5be1-4333-b786-4883d87fedaf-kolla-config\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.401920 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433337 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433414 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-config-data-default\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433455 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t96w7\" (UniqueName: \"kubernetes.io/projected/c2e610ad-5be1-4333-b786-4883d87fedaf-kube-api-access-t96w7\") pod 
\"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433485 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-kolla-config\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433568 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433596 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e610ad-5be1-4333-b786-4883d87fedaf-config-data\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433638 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrsnb\" (UniqueName: \"kubernetes.io/projected/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-kube-api-access-lrsnb\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433740 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433784 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433829 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2e610ad-5be1-4333-b786-4883d87fedaf-kolla-config\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.433873 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.435274 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e610ad-5be1-4333-b786-4883d87fedaf-config-data\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc 
kubenswrapper[4939]: I0318 16:59:38.435997 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2e610ad-5be1-4333-b786-4883d87fedaf-kolla-config\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.453668 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t96w7\" (UniqueName: \"kubernetes.io/projected/c2e610ad-5be1-4333-b786-4883d87fedaf-kube-api-access-t96w7\") pod \"memcached-0\" (UID: \"c2e610ad-5be1-4333-b786-4883d87fedaf\") " pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538356 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538403 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538435 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538457 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538483 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-config-data-default\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538537 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-kolla-config\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538577 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.538617 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrsnb\" (UniqueName: 
\"kubernetes.io/projected/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-kube-api-access-lrsnb\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.539428 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.540735 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-kolla-config\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.541315 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.541484 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-config-data-default\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.544158 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.549716 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68254861ec7658bc65aaa3e17dda45530068e74bf0c147c377984b04974623a2/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.550291 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.552126 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.582648 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrsnb\" (UniqueName: \"kubernetes.io/projected/a3f6a5c2-628f-4a42-b8d6-59bd4535dba5-kube-api-access-lrsnb\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.603478 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c22c0a8-0111-4797-808a-4ea09de35d00\") pod \"openstack-galera-0\" (UID: \"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5\") " pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.615483 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.619453 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:59:38 crc kubenswrapper[4939]: I0318 16:59:38.634140 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 16:59:38 crc kubenswrapper[4939]: E0318 16:59:38.774021 4939 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 18 16:59:38 crc kubenswrapper[4939]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/5a2757a9-9f09-42f0-aa97-114639391b72/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 18 16:59:38 crc kubenswrapper[4939]: > podSandboxID="60a8315682e266497711a88b5ad1f590dc3a5c96bdd2c66f5f37cbc6a90bf6d1" Mar 18 16:59:38 crc kubenswrapper[4939]: E0318 16:59:38.774584 4939 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:59:38 crc kubenswrapper[4939]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q472j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-p5tvs_openstack(5a2757a9-9f09-42f0-aa97-114639391b72): CreateContainerError: container create failed: mount 
`/var/lib/kubelet/pods/5a2757a9-9f09-42f0-aa97-114639391b72/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 18 16:59:38 crc kubenswrapper[4939]: > logger="UnhandledError" Mar 18 16:59:38 crc kubenswrapper[4939]: E0318 16:59:38.775755 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/5a2757a9-9f09-42f0-aa97-114639391b72/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.080963 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.137663 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0353a9a1-39a6-4993-ac6e-dce5eba1373f","Type":"ContainerStarted","Data":"689c480cdabd0ff64befc721469e3e2aeb3be9badf3cc9b020befe5809dd07fa"} Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.140465 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c2e610ad-5be1-4333-b786-4883d87fedaf","Type":"ContainerStarted","Data":"8f62eeec67e766e09f48a4b5850e5f47d1b8522385bfe28278146cfe4ca179dd"} Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.141728 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08684f42-6dc7-4e6d-8073-98c558c159b4","Type":"ContainerStarted","Data":"9ac5a4f6c3ab500b0872c29f72f1827fca46c565bda6de290dffd25b95fa8b83"} Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.145093 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" event={"ID":"c3d05697-55c9-4f3b-af58-79afb3af5683","Type":"ContainerStarted","Data":"93a35d0ed6476b2c59b6a8982d411ed7a96e7633345ee7a537dab200d353e489"} Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.146001 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.155480 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 16:59:39 crc kubenswrapper[4939]: W0318 16:59:39.184267 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3f6a5c2_628f_4a42_b8d6_59bd4535dba5.slice/crio-860ac90a43b1c753fd4b9e213ab1e16b7679fd0fafccdcaceac8189a2fa35d34 WatchSource:0}: Error finding container 860ac90a43b1c753fd4b9e213ab1e16b7679fd0fafccdcaceac8189a2fa35d34: Status 404 returned error can't find the container with id 860ac90a43b1c753fd4b9e213ab1e16b7679fd0fafccdcaceac8189a2fa35d34 Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.193904 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.195043 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.206384 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.211652 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7pj97" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.211864 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.218744 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.225783 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.226676 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" podStartSLOduration=3.226654218 podStartE2EDuration="3.226654218s" podCreationTimestamp="2026-03-18 16:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:59:39.187350892 +0000 UTC m=+4943.786538513" watchObservedRunningTime="2026-03-18 16:59:39.226654218 +0000 UTC m=+4943.825841849" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.352701 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.352821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7jc\" (UniqueName: \"kubernetes.io/projected/339c0a2d-5c97-4d3b-84d7-a8731c708236-kube-api-access-hz7jc\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.352882 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.352928 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339c0a2d-5c97-4d3b-84d7-a8731c708236-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.352964 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " 
pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.352984 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.353013 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/339c0a2d-5c97-4d3b-84d7-a8731c708236-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.353029 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/339c0a2d-5c97-4d3b-84d7-a8731c708236-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.455158 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339c0a2d-5c97-4d3b-84d7-a8731c708236-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.455273 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.455328 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.455400 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/339c0a2d-5c97-4d3b-84d7-a8731c708236-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.455428 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/339c0a2d-5c97-4d3b-84d7-a8731c708236-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.455480 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " 
pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.456063 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/339c0a2d-5c97-4d3b-84d7-a8731c708236-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.456565 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.456762 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.457524 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7jc\" (UniqueName: \"kubernetes.io/projected/339c0a2d-5c97-4d3b-84d7-a8731c708236-kube-api-access-hz7jc\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.457647 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.458547 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/339c0a2d-5c97-4d3b-84d7-a8731c708236-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.458750 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/339c0a2d-5c97-4d3b-84d7-a8731c708236-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.460753 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339c0a2d-5c97-4d3b-84d7-a8731c708236-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.461719 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.461747 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3aad78b59b3693c86f16c2f4da5ad117008f665e175c89fa2e3200cc952f68e6/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.570617 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7jc\" (UniqueName: \"kubernetes.io/projected/339c0a2d-5c97-4d3b-84d7-a8731c708236-kube-api-access-hz7jc\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.678156 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33ad9440-2f6c-4a29-b256-dca11f5bf1fc\") pod \"openstack-cell1-galera-0\" (UID: \"339c0a2d-5c97-4d3b-84d7-a8731c708236\") " pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:39 crc kubenswrapper[4939]: I0318 16:59:39.840585 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.155094 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" event={"ID":"5a2757a9-9f09-42f0-aa97-114639391b72","Type":"ContainerStarted","Data":"5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5"} Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.155299 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.158325 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5","Type":"ContainerStarted","Data":"714636759bfc5db41fad14982902279dfd4dada02b972bc350ca3c3b346726b4"} Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.158376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5","Type":"ContainerStarted","Data":"860ac90a43b1c753fd4b9e213ab1e16b7679fd0fafccdcaceac8189a2fa35d34"} Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.160472 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0353a9a1-39a6-4993-ac6e-dce5eba1373f","Type":"ContainerStarted","Data":"cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3"} Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.163467 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c2e610ad-5be1-4333-b786-4883d87fedaf","Type":"ContainerStarted","Data":"89ef3a5bff17a1dcbb4736b0b40245f02c5015ec0822639fa162fe16eb4904b1"} Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.163577 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.165198 4939 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08684f42-6dc7-4e6d-8073-98c558c159b4","Type":"ContainerStarted","Data":"a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809"} Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.185016 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" podStartSLOduration=4.184991996 podStartE2EDuration="4.184991996s" podCreationTimestamp="2026-03-18 16:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:59:40.176432523 +0000 UTC m=+4944.775620144" watchObservedRunningTime="2026-03-18 16:59:40.184991996 +0000 UTC m=+4944.784179617" Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.282782 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.282763141 podStartE2EDuration="2.282763141s" podCreationTimestamp="2026-03-18 16:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:59:40.280763534 +0000 UTC m=+4944.879951165" watchObservedRunningTime="2026-03-18 16:59:40.282763141 +0000 UTC m=+4944.881950762" Mar 18 16:59:40 crc kubenswrapper[4939]: I0318 16:59:40.300372 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 16:59:41 crc kubenswrapper[4939]: I0318 16:59:41.175405 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"339c0a2d-5c97-4d3b-84d7-a8731c708236","Type":"ContainerStarted","Data":"70ee65657213af002e74d8846645e5bb65e7539c8c97b8dafbcfc9c894265670"} Mar 18 16:59:41 crc kubenswrapper[4939]: I0318 16:59:41.176341 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"339c0a2d-5c97-4d3b-84d7-a8731c708236","Type":"ContainerStarted","Data":"cb9fe31ca77beb53a2a4e0a9dfdadab004fdd7bdb1ef895848a56afda2303f46"} Mar 18 16:59:43 crc kubenswrapper[4939]: I0318 16:59:43.189609 4939 generic.go:334] "Generic (PLEG): container finished" podID="a3f6a5c2-628f-4a42-b8d6-59bd4535dba5" containerID="714636759bfc5db41fad14982902279dfd4dada02b972bc350ca3c3b346726b4" exitCode=0 Mar 18 16:59:43 crc kubenswrapper[4939]: I0318 16:59:43.189688 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5","Type":"ContainerDied","Data":"714636759bfc5db41fad14982902279dfd4dada02b972bc350ca3c3b346726b4"} Mar 18 16:59:44 crc kubenswrapper[4939]: I0318 16:59:44.202260 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3f6a5c2-628f-4a42-b8d6-59bd4535dba5","Type":"ContainerStarted","Data":"cb84280ed81c7ae345a935fd8100c9ec42b30d2f902b065801c37df9964279bb"} Mar 18 16:59:44 crc kubenswrapper[4939]: I0318 16:59:44.225445 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.225423185 podStartE2EDuration="7.225423185s" podCreationTimestamp="2026-03-18 16:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:59:44.225315912 +0000 UTC m=+4948.824503533" watchObservedRunningTime="2026-03-18 16:59:44.225423185 +0000 UTC m=+4948.824610806" Mar 18 16:59:45 crc 
kubenswrapper[4939]: I0318 16:59:45.214890 4939 generic.go:334] "Generic (PLEG): container finished" podID="339c0a2d-5c97-4d3b-84d7-a8731c708236" containerID="70ee65657213af002e74d8846645e5bb65e7539c8c97b8dafbcfc9c894265670" exitCode=0 Mar 18 16:59:45 crc kubenswrapper[4939]: I0318 16:59:45.214935 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"339c0a2d-5c97-4d3b-84d7-a8731c708236","Type":"ContainerDied","Data":"70ee65657213af002e74d8846645e5bb65e7539c8c97b8dafbcfc9c894265670"} Mar 18 16:59:46 crc kubenswrapper[4939]: I0318 16:59:46.226759 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"339c0a2d-5c97-4d3b-84d7-a8731c708236","Type":"ContainerStarted","Data":"e3ce55ff65c2436469d576090ea3692c4de3fa00774ffc2b565907968cb4cd68"} Mar 18 16:59:46 crc kubenswrapper[4939]: I0318 16:59:46.256808 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.256776507 podStartE2EDuration="8.256776507s" podCreationTimestamp="2026-03-18 16:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:59:46.249082348 +0000 UTC m=+4950.848269969" watchObservedRunningTime="2026-03-18 16:59:46.256776507 +0000 UTC m=+4950.855964168" Mar 18 16:59:46 crc kubenswrapper[4939]: E0318 16:59:46.386641 4939 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.227:57322->38.102.83.227:41597: write tcp 38.102.83.227:57322->38.102.83.227:41597: write: connection reset by peer Mar 18 16:59:46 crc kubenswrapper[4939]: I0318 16:59:46.617648 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:46 crc kubenswrapper[4939]: I0318 16:59:46.928422 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 16:59:46 crc kubenswrapper[4939]: I0318 16:59:46.984301 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-p5tvs"] Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.247097 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" containerName="dnsmasq-dns" containerID="cri-o://5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5" gracePeriod=10 Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.685943 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.780405 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-dns-svc\") pod \"5a2757a9-9f09-42f0-aa97-114639391b72\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.780579 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-config\") pod \"5a2757a9-9f09-42f0-aa97-114639391b72\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.780611 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q472j\" (UniqueName: \"kubernetes.io/projected/5a2757a9-9f09-42f0-aa97-114639391b72-kube-api-access-q472j\") pod \"5a2757a9-9f09-42f0-aa97-114639391b72\" (UID: \"5a2757a9-9f09-42f0-aa97-114639391b72\") " Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.788542 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2757a9-9f09-42f0-aa97-114639391b72-kube-api-access-q472j" (OuterVolumeSpecName: "kube-api-access-q472j") pod "5a2757a9-9f09-42f0-aa97-114639391b72" (UID: "5a2757a9-9f09-42f0-aa97-114639391b72"). InnerVolumeSpecName "kube-api-access-q472j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.817080 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a2757a9-9f09-42f0-aa97-114639391b72" (UID: "5a2757a9-9f09-42f0-aa97-114639391b72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.822010 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-config" (OuterVolumeSpecName: "config") pod "5a2757a9-9f09-42f0-aa97-114639391b72" (UID: "5a2757a9-9f09-42f0-aa97-114639391b72"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.881641 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.881666 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q472j\" (UniqueName: \"kubernetes.io/projected/5a2757a9-9f09-42f0-aa97-114639391b72-kube-api-access-q472j\") on node \"crc\" DevicePath \"\"" Mar 18 16:59:47 crc kubenswrapper[4939]: I0318 16:59:47.881676 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a2757a9-9f09-42f0-aa97-114639391b72-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.255183 4939 generic.go:334] "Generic (PLEG): container finished" podID="5a2757a9-9f09-42f0-aa97-114639391b72" containerID="5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5" exitCode=0 Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.255225 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" event={"ID":"5a2757a9-9f09-42f0-aa97-114639391b72","Type":"ContainerDied","Data":"5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5"} Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.255239 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.255250 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-p5tvs" event={"ID":"5a2757a9-9f09-42f0-aa97-114639391b72","Type":"ContainerDied","Data":"60a8315682e266497711a88b5ad1f590dc3a5c96bdd2c66f5f37cbc6a90bf6d1"} Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.255265 4939 scope.go:117] "RemoveContainer" containerID="5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.272419 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-p5tvs"] Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.276723 4939 scope.go:117] "RemoveContainer" containerID="fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.282075 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-p5tvs"] Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.296405 4939 scope.go:117] "RemoveContainer" containerID="5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5" Mar 18 16:59:48 crc kubenswrapper[4939]: E0318 16:59:48.296860 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5\": container with ID starting with 5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5 not found: ID does not exist" containerID="5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.296898 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5"} err="failed to get container status 
\"5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5\": rpc error: code = NotFound desc = could not find container \"5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5\": container with ID starting with 5c44a0a4899b2c6b2c1184449628118362ee5ca59922ff0d74b894222da2b5f5 not found: ID does not exist" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.296923 4939 scope.go:117] "RemoveContainer" containerID="fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11" Mar 18 16:59:48 crc kubenswrapper[4939]: E0318 16:59:48.297230 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11\": container with ID starting with fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11 not found: ID does not exist" containerID="fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.297271 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11"} err="failed to get container status \"fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11\": rpc error: code = NotFound desc = could not find container \"fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11\": container with ID starting with fe5320a02086a8e3deb01676a0c4f1b3cfc41f6fd1277a6895b277ade7db5f11 not found: ID does not exist" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.616677 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.634863 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 16:59:48 crc kubenswrapper[4939]: I0318 16:59:48.634909 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 16:59:49 crc kubenswrapper[4939]: I0318 16:59:49.841676 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:49 crc kubenswrapper[4939]: I0318 16:59:49.842014 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:49 crc kubenswrapper[4939]: I0318 16:59:49.903570 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:50 crc kubenswrapper[4939]: I0318 16:59:50.145142 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" path="/var/lib/kubelet/pods/5a2757a9-9f09-42f0-aa97-114639391b72/volumes" Mar 18 16:59:50 crc kubenswrapper[4939]: I0318 16:59:50.338374 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 16:59:50 crc kubenswrapper[4939]: I0318 16:59:50.914226 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 16:59:51 crc kubenswrapper[4939]: I0318 16:59:50.999962 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.225179 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wlhlb"] Mar 18 16:59:57 
Mar 18 16:59:57 crc kubenswrapper[4939]: E0318 16:59:57.226724 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" containerName="init"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.226745 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" containerName="init"
Mar 18 16:59:57 crc kubenswrapper[4939]: E0318 16:59:57.226759 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" containerName="dnsmasq-dns"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.226767 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" containerName="dnsmasq-dns"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.226932 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2757a9-9f09-42f0-aa97-114639391b72" containerName="dnsmasq-dns"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.227860 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wlhlb"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.231725 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.233963 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wlhlb"]
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.278821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0fd118-7019-43ce-a555-7a0c7194c078-operator-scripts\") pod \"root-account-create-update-wlhlb\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " pod="openstack/root-account-create-update-wlhlb"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.278893 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4m97\" (UniqueName: \"kubernetes.io/projected/4a0fd118-7019-43ce-a555-7a0c7194c078-kube-api-access-x4m97\") pod \"root-account-create-update-wlhlb\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " pod="openstack/root-account-create-update-wlhlb"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.380207 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0fd118-7019-43ce-a555-7a0c7194c078-operator-scripts\") pod \"root-account-create-update-wlhlb\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " pod="openstack/root-account-create-update-wlhlb"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.380308 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4m97\" (UniqueName: \"kubernetes.io/projected/4a0fd118-7019-43ce-a555-7a0c7194c078-kube-api-access-x4m97\") pod \"root-account-create-update-wlhlb\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " pod="openstack/root-account-create-update-wlhlb"
Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.381170 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0fd118-7019-43ce-a555-7a0c7194c078-operator-scripts\") pod \"root-account-create-update-wlhlb\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " pod="openstack/root-account-create-update-wlhlb"
Mar 18 16:59:57 crc
kubenswrapper[4939]: I0318 16:59:57.414856 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4m97\" (UniqueName: \"kubernetes.io/projected/4a0fd118-7019-43ce-a555-7a0c7194c078-kube-api-access-x4m97\") pod \"root-account-create-update-wlhlb\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " pod="openstack/root-account-create-update-wlhlb" Mar 18 16:59:57 crc kubenswrapper[4939]: I0318 16:59:57.552311 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wlhlb" Mar 18 16:59:58 crc kubenswrapper[4939]: I0318 16:59:58.031911 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wlhlb"] Mar 18 16:59:58 crc kubenswrapper[4939]: I0318 16:59:58.334920 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wlhlb" event={"ID":"4a0fd118-7019-43ce-a555-7a0c7194c078","Type":"ContainerStarted","Data":"f2615f9e286ee139aefb3888552cd8ae25a15c92ae5a8cc265f570723a155518"} Mar 18 16:59:59 crc kubenswrapper[4939]: I0318 16:59:59.343122 4939 generic.go:334] "Generic (PLEG): container finished" podID="4a0fd118-7019-43ce-a555-7a0c7194c078" containerID="a49ef056d058ce9b0b6dff21d06f4cc75f7214945b51137dae003a0c8dd8d56e" exitCode=0 Mar 18 16:59:59 crc kubenswrapper[4939]: I0318 16:59:59.343187 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wlhlb" event={"ID":"4a0fd118-7019-43ce-a555-7a0c7194c078","Type":"ContainerDied","Data":"a49ef056d058ce9b0b6dff21d06f4cc75f7214945b51137dae003a0c8dd8d56e"} Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.145186 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558"] Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.146911 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564220-l5rl6"] Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.147184 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.147848 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-l5rl6" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.150953 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.151220 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.151622 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.151833 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.155344 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.159226 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-l5rl6"] Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.173394 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558"] Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.327300 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1592467f-5d46-42b1-afe7-a9173a8f4de5-secret-volume\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.327366 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-989l4\" (UniqueName: \"kubernetes.io/projected/92d6b9b7-dfb2-43a2-b271-521a40d3f8fd-kube-api-access-989l4\") pod \"auto-csr-approver-29564220-l5rl6\" (UID: \"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd\") " pod="openshift-infra/auto-csr-approver-29564220-l5rl6" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.327433 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcs92\" (UniqueName: \"kubernetes.io/projected/1592467f-5d46-42b1-afe7-a9173a8f4de5-kube-api-access-mcs92\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.327515 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1592467f-5d46-42b1-afe7-a9173a8f4de5-config-volume\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.430356 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcs92\" (UniqueName: \"kubernetes.io/projected/1592467f-5d46-42b1-afe7-a9173a8f4de5-kube-api-access-mcs92\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.430439 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1592467f-5d46-42b1-afe7-a9173a8f4de5-config-volume\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.430553 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1592467f-5d46-42b1-afe7-a9173a8f4de5-secret-volume\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.430585 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-989l4\" (UniqueName: \"kubernetes.io/projected/92d6b9b7-dfb2-43a2-b271-521a40d3f8fd-kube-api-access-989l4\") pod \"auto-csr-approver-29564220-l5rl6\" (UID: \"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd\") " pod="openshift-infra/auto-csr-approver-29564220-l5rl6" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.431626 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1592467f-5d46-42b1-afe7-a9173a8f4de5-config-volume\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.436822 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1592467f-5d46-42b1-afe7-a9173a8f4de5-secret-volume\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.454247 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcs92\" (UniqueName: \"kubernetes.io/projected/1592467f-5d46-42b1-afe7-a9173a8f4de5-kube-api-access-mcs92\") pod \"collect-profiles-29564220-gp558\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.455546 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-989l4\" (UniqueName: \"kubernetes.io/projected/92d6b9b7-dfb2-43a2-b271-521a40d3f8fd-kube-api-access-989l4\") pod \"auto-csr-approver-29564220-l5rl6\" (UID: \"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd\") " pod="openshift-infra/auto-csr-approver-29564220-l5rl6" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.481687 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.489916 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-l5rl6" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.655047 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wlhlb" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.839573 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0fd118-7019-43ce-a555-7a0c7194c078-operator-scripts\") pod \"4a0fd118-7019-43ce-a555-7a0c7194c078\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.839636 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4m97\" (UniqueName: \"kubernetes.io/projected/4a0fd118-7019-43ce-a555-7a0c7194c078-kube-api-access-x4m97\") pod \"4a0fd118-7019-43ce-a555-7a0c7194c078\" (UID: \"4a0fd118-7019-43ce-a555-7a0c7194c078\") " Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.840777 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0fd118-7019-43ce-a555-7a0c7194c078-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a0fd118-7019-43ce-a555-7a0c7194c078" (UID: "4a0fd118-7019-43ce-a555-7a0c7194c078"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.843250 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0fd118-7019-43ce-a555-7a0c7194c078-kube-api-access-x4m97" (OuterVolumeSpecName: "kube-api-access-x4m97") pod "4a0fd118-7019-43ce-a555-7a0c7194c078" (UID: "4a0fd118-7019-43ce-a555-7a0c7194c078"). InnerVolumeSpecName "kube-api-access-x4m97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.941769 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4m97\" (UniqueName: \"kubernetes.io/projected/4a0fd118-7019-43ce-a555-7a0c7194c078-kube-api-access-x4m97\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.941810 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0fd118-7019-43ce-a555-7a0c7194c078-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:00 crc kubenswrapper[4939]: I0318 17:00:00.943835 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-l5rl6"] Mar 18 17:00:00 crc kubenswrapper[4939]: W0318 17:00:00.948256 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92d6b9b7_dfb2_43a2_b271_521a40d3f8fd.slice/crio-d5ffee9c2f45efa15d691158263e55d3e28a9caec27908b8b568ece075baa361 WatchSource:0}: Error finding container d5ffee9c2f45efa15d691158263e55d3e28a9caec27908b8b568ece075baa361: Status 404 returned error can't find the container with id d5ffee9c2f45efa15d691158263e55d3e28a9caec27908b8b568ece075baa361 Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.007528 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558"] Mar 18 17:00:01 crc kubenswrapper[4939]: W0318 17:00:01.010740 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1592467f_5d46_42b1_afe7_a9173a8f4de5.slice/crio-625df2f3a71cc010b2ed4fb5f82e5810da1aa21eb813a55170daeb8991d6135c WatchSource:0}: Error finding container 
625df2f3a71cc010b2ed4fb5f82e5810da1aa21eb813a55170daeb8991d6135c: Status 404 returned error can't find the container with id 625df2f3a71cc010b2ed4fb5f82e5810da1aa21eb813a55170daeb8991d6135c Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.359750 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wlhlb" event={"ID":"4a0fd118-7019-43ce-a555-7a0c7194c078","Type":"ContainerDied","Data":"f2615f9e286ee139aefb3888552cd8ae25a15c92ae5a8cc265f570723a155518"} Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.360159 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2615f9e286ee139aefb3888552cd8ae25a15c92ae5a8cc265f570723a155518" Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.359820 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wlhlb" Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.361193 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564220-l5rl6" event={"ID":"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd","Type":"ContainerStarted","Data":"d5ffee9c2f45efa15d691158263e55d3e28a9caec27908b8b568ece075baa361"} Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.363400 4939 generic.go:334] "Generic (PLEG): container finished" podID="1592467f-5d46-42b1-afe7-a9173a8f4de5" containerID="7b54f82f7763ad1e4ea85883a44a6a499162fee9458eeddfd28a47840b7b1e2e" exitCode=0 Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.363432 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" event={"ID":"1592467f-5d46-42b1-afe7-a9173a8f4de5","Type":"ContainerDied","Data":"7b54f82f7763ad1e4ea85883a44a6a499162fee9458eeddfd28a47840b7b1e2e"} Mar 18 17:00:01 crc kubenswrapper[4939]: I0318 17:00:01.363449 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" event={"ID":"1592467f-5d46-42b1-afe7-a9173a8f4de5","Type":"ContainerStarted","Data":"625df2f3a71cc010b2ed4fb5f82e5810da1aa21eb813a55170daeb8991d6135c"} Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.757194 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.868125 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcs92\" (UniqueName: \"kubernetes.io/projected/1592467f-5d46-42b1-afe7-a9173a8f4de5-kube-api-access-mcs92\") pod \"1592467f-5d46-42b1-afe7-a9173a8f4de5\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.868303 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1592467f-5d46-42b1-afe7-a9173a8f4de5-secret-volume\") pod \"1592467f-5d46-42b1-afe7-a9173a8f4de5\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.868380 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1592467f-5d46-42b1-afe7-a9173a8f4de5-config-volume\") pod \"1592467f-5d46-42b1-afe7-a9173a8f4de5\" (UID: \"1592467f-5d46-42b1-afe7-a9173a8f4de5\") " Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.868972 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1592467f-5d46-42b1-afe7-a9173a8f4de5-config-volume" (OuterVolumeSpecName: "config-volume") pod "1592467f-5d46-42b1-afe7-a9173a8f4de5" (UID: "1592467f-5d46-42b1-afe7-a9173a8f4de5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.873887 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1592467f-5d46-42b1-afe7-a9173a8f4de5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1592467f-5d46-42b1-afe7-a9173a8f4de5" (UID: "1592467f-5d46-42b1-afe7-a9173a8f4de5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.874653 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1592467f-5d46-42b1-afe7-a9173a8f4de5-kube-api-access-mcs92" (OuterVolumeSpecName: "kube-api-access-mcs92") pod "1592467f-5d46-42b1-afe7-a9173a8f4de5" (UID: "1592467f-5d46-42b1-afe7-a9173a8f4de5"). InnerVolumeSpecName "kube-api-access-mcs92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.969478 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcs92\" (UniqueName: \"kubernetes.io/projected/1592467f-5d46-42b1-afe7-a9173a8f4de5-kube-api-access-mcs92\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.969786 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1592467f-5d46-42b1-afe7-a9173a8f4de5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:02 crc kubenswrapper[4939]: I0318 17:00:02.969866 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1592467f-5d46-42b1-afe7-a9173a8f4de5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:03 crc kubenswrapper[4939]: I0318 17:00:03.148589 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wlhlb"] Mar 18 17:00:03 crc kubenswrapper[4939]: I0318 17:00:03.154322 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wlhlb"] Mar 18 17:00:03 crc kubenswrapper[4939]: I0318 17:00:03.382031 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" event={"ID":"1592467f-5d46-42b1-afe7-a9173a8f4de5","Type":"ContainerDied","Data":"625df2f3a71cc010b2ed4fb5f82e5810da1aa21eb813a55170daeb8991d6135c"} Mar 18 17:00:03 crc kubenswrapper[4939]: I0318 17:00:03.382067 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625df2f3a71cc010b2ed4fb5f82e5810da1aa21eb813a55170daeb8991d6135c" Mar 18 17:00:03 crc kubenswrapper[4939]: I0318 17:00:03.382115 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558" Mar 18 17:00:03 crc kubenswrapper[4939]: I0318 17:00:03.837108 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4"] Mar 18 17:00:03 crc kubenswrapper[4939]: I0318 17:00:03.845097 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-bkzb4"] Mar 18 17:00:04 crc kubenswrapper[4939]: I0318 17:00:04.149978 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a0fd118-7019-43ce-a555-7a0c7194c078" path="/var/lib/kubelet/pods/4a0fd118-7019-43ce-a555-7a0c7194c078/volumes" Mar 18 17:00:04 crc kubenswrapper[4939]: I0318 17:00:04.150909 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67eacf98-54a3-405f-8c43-e805fdce4a12" path="/var/lib/kubelet/pods/67eacf98-54a3-405f-8c43-e805fdce4a12/volumes" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.165766 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gztwr"] Mar 18 17:00:08 crc kubenswrapper[4939]: E0318 17:00:08.166419 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1592467f-5d46-42b1-afe7-a9173a8f4de5" containerName="collect-profiles" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.166432 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1592467f-5d46-42b1-afe7-a9173a8f4de5" containerName="collect-profiles" Mar 18 17:00:08 crc kubenswrapper[4939]: E0318 17:00:08.166450 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0fd118-7019-43ce-a555-7a0c7194c078" containerName="mariadb-account-create-update" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.166457 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0fd118-7019-43ce-a555-7a0c7194c078" containerName="mariadb-account-create-update" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.166617 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0fd118-7019-43ce-a555-7a0c7194c078" containerName="mariadb-account-create-update" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.166636 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1592467f-5d46-42b1-afe7-a9173a8f4de5" containerName="collect-profiles" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.167082 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.169094 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.176628 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gztwr"] Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.364629 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-operator-scripts\") pod \"root-account-create-update-gztwr\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.364872 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx4qc\" (UniqueName: \"kubernetes.io/projected/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-kube-api-access-rx4qc\") pod \"root-account-create-update-gztwr\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.427844 4939 generic.go:334] "Generic (PLEG): container finished" podID="92d6b9b7-dfb2-43a2-b271-521a40d3f8fd" containerID="cbfe860e6801aab6e2503bd8f035f75bd2b1dbd53d5fa2dada71103ad8a25066" exitCode=0 Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.427886 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564220-l5rl6" event={"ID":"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd","Type":"ContainerDied","Data":"cbfe860e6801aab6e2503bd8f035f75bd2b1dbd53d5fa2dada71103ad8a25066"} Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.466638 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx4qc\" (UniqueName: \"kubernetes.io/projected/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-kube-api-access-rx4qc\") pod \"root-account-create-update-gztwr\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.466860 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-operator-scripts\") pod \"root-account-create-update-gztwr\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.467698 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-operator-scripts\") pod \"root-account-create-update-gztwr\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.486556 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx4qc\" (UniqueName: \"kubernetes.io/projected/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-kube-api-access-rx4qc\") pod \"root-account-create-update-gztwr\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:08 crc kubenswrapper[4939]: I0318 17:00:08.785667 4939 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.261009 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gztwr"] Mar 18 17:00:09 crc kubenswrapper[4939]: W0318 17:00:09.269159 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcbdd35a_4e99_4d2a_adf1_111ced1b750a.slice/crio-3ecd71b5ad3f70f944c53de6eb8e540f87801c984e20b790bad8b50221938648 WatchSource:0}: Error finding container 3ecd71b5ad3f70f944c53de6eb8e540f87801c984e20b790bad8b50221938648: Status 404 returned error can't find the container with id 3ecd71b5ad3f70f944c53de6eb8e540f87801c984e20b790bad8b50221938648 Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.439293 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gztwr" event={"ID":"bcbdd35a-4e99-4d2a-adf1-111ced1b750a","Type":"ContainerStarted","Data":"46c790e2cab895c66874bedb067cba29b1d3ddc88748a4dcc5ff1d0bda7a98f6"} Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.439825 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gztwr" event={"ID":"bcbdd35a-4e99-4d2a-adf1-111ced1b750a","Type":"ContainerStarted","Data":"3ecd71b5ad3f70f944c53de6eb8e540f87801c984e20b790bad8b50221938648"} Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.470751 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-gztwr" podStartSLOduration=1.4707266909999999 podStartE2EDuration="1.470726691s" podCreationTimestamp="2026-03-18 17:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:00:09.452910176 +0000 UTC m=+4974.052097807" watchObservedRunningTime="2026-03-18 17:00:09.470726691 +0000 UTC m=+4974.069914312" Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.736951 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-l5rl6" Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.891897 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-989l4\" (UniqueName: \"kubernetes.io/projected/92d6b9b7-dfb2-43a2-b271-521a40d3f8fd-kube-api-access-989l4\") pod \"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd\" (UID: \"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd\") " Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.897938 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d6b9b7-dfb2-43a2-b271-521a40d3f8fd-kube-api-access-989l4" (OuterVolumeSpecName: "kube-api-access-989l4") pod "92d6b9b7-dfb2-43a2-b271-521a40d3f8fd" (UID: "92d6b9b7-dfb2-43a2-b271-521a40d3f8fd"). InnerVolumeSpecName "kube-api-access-989l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:09 crc kubenswrapper[4939]: I0318 17:00:09.993625 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-989l4\" (UniqueName: \"kubernetes.io/projected/92d6b9b7-dfb2-43a2-b271-521a40d3f8fd-kube-api-access-989l4\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:10 crc kubenswrapper[4939]: I0318 17:00:10.446762 4939 generic.go:334] "Generic (PLEG): container finished" podID="bcbdd35a-4e99-4d2a-adf1-111ced1b750a" containerID="46c790e2cab895c66874bedb067cba29b1d3ddc88748a4dcc5ff1d0bda7a98f6" exitCode=0 Mar 18 17:00:10 crc kubenswrapper[4939]: I0318 17:00:10.447178 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gztwr" event={"ID":"bcbdd35a-4e99-4d2a-adf1-111ced1b750a","Type":"ContainerDied","Data":"46c790e2cab895c66874bedb067cba29b1d3ddc88748a4dcc5ff1d0bda7a98f6"} Mar 18 17:00:10 crc kubenswrapper[4939]: I0318 17:00:10.448623 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564220-l5rl6" event={"ID":"92d6b9b7-dfb2-43a2-b271-521a40d3f8fd","Type":"ContainerDied","Data":"d5ffee9c2f45efa15d691158263e55d3e28a9caec27908b8b568ece075baa361"} Mar 18 17:00:10 crc kubenswrapper[4939]: I0318 17:00:10.448656 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5ffee9c2f45efa15d691158263e55d3e28a9caec27908b8b568ece075baa361" Mar 18 17:00:10 crc kubenswrapper[4939]: I0318 17:00:10.448700 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-l5rl6" Mar 18 17:00:10 crc kubenswrapper[4939]: I0318 17:00:10.804124 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-n4cml"] Mar 18 17:00:10 crc kubenswrapper[4939]: I0318 17:00:10.814320 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-n4cml"] Mar 18 17:00:11 crc kubenswrapper[4939]: I0318 17:00:11.774457 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:11 crc kubenswrapper[4939]: I0318 17:00:11.918795 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-operator-scripts\") pod \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " Mar 18 17:00:11 crc kubenswrapper[4939]: I0318 17:00:11.918893 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx4qc\" (UniqueName: \"kubernetes.io/projected/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-kube-api-access-rx4qc\") pod \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\" (UID: \"bcbdd35a-4e99-4d2a-adf1-111ced1b750a\") " Mar 18 17:00:11 crc kubenswrapper[4939]: I0318 17:00:11.919784 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bcbdd35a-4e99-4d2a-adf1-111ced1b750a" (UID: "bcbdd35a-4e99-4d2a-adf1-111ced1b750a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:11 crc kubenswrapper[4939]: I0318 17:00:11.923689 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-kube-api-access-rx4qc" (OuterVolumeSpecName: "kube-api-access-rx4qc") pod "bcbdd35a-4e99-4d2a-adf1-111ced1b750a" (UID: "bcbdd35a-4e99-4d2a-adf1-111ced1b750a"). InnerVolumeSpecName "kube-api-access-rx4qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.020747 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.020786 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx4qc\" (UniqueName: \"kubernetes.io/projected/bcbdd35a-4e99-4d2a-adf1-111ced1b750a-kube-api-access-rx4qc\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.146426 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b025d40-8c7d-437c-9068-f2a9709185e1" path="/var/lib/kubelet/pods/9b025d40-8c7d-437c-9068-f2a9709185e1/volumes" Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.463207 4939 generic.go:334] "Generic (PLEG): container finished" podID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerID="cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3" exitCode=0 Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.463306 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0353a9a1-39a6-4993-ac6e-dce5eba1373f","Type":"ContainerDied","Data":"cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3"} Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.465389 4939 generic.go:334] "Generic (PLEG): container finished" podID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerID="a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809" exitCode=0 Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.465494 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08684f42-6dc7-4e6d-8073-98c558c159b4","Type":"ContainerDied","Data":"a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809"} Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.468276 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gztwr" event={"ID":"bcbdd35a-4e99-4d2a-adf1-111ced1b750a","Type":"ContainerDied","Data":"3ecd71b5ad3f70f944c53de6eb8e540f87801c984e20b790bad8b50221938648"} Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.468322 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ecd71b5ad3f70f944c53de6eb8e540f87801c984e20b790bad8b50221938648" Mar 18 17:00:12 crc kubenswrapper[4939]: I0318 17:00:12.468543 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gztwr" Mar 18 17:00:13 crc kubenswrapper[4939]: I0318 17:00:13.479351 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0353a9a1-39a6-4993-ac6e-dce5eba1373f","Type":"ContainerStarted","Data":"2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1"} Mar 18 17:00:13 crc kubenswrapper[4939]: I0318 17:00:13.480826 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 17:00:13 crc kubenswrapper[4939]: I0318 17:00:13.483443 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08684f42-6dc7-4e6d-8073-98c558c159b4","Type":"ContainerStarted","Data":"a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d"} Mar 18 17:00:13 crc kubenswrapper[4939]: I0318 17:00:13.483798 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:13 crc kubenswrapper[4939]: I0318 17:00:13.505399 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.505379067 podStartE2EDuration="37.505379067s" podCreationTimestamp="2026-03-18 16:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:00:13.499936083 +0000 UTC m=+4978.099123704" watchObservedRunningTime="2026-03-18 17:00:13.505379067 +0000 UTC m=+4978.104566688" Mar 18 17:00:13 crc kubenswrapper[4939]: I0318 17:00:13.525836 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.525813617 podStartE2EDuration="37.525813617s" podCreationTimestamp="2026-03-18 16:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:00:13.520869647 +0000 UTC m=+4978.120057268" watchObservedRunningTime="2026-03-18 17:00:13.525813617 +0000 UTC m=+4978.125001248" Mar 18 17:00:23 crc kubenswrapper[4939]: I0318 17:00:23.687545 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:00:23 crc kubenswrapper[4939]: I0318 17:00:23.688141 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:00:27 crc kubenswrapper[4939]: I0318 17:00:27.842878 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 17:00:28 crc kubenswrapper[4939]: I0318 17:00:28.110691 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:30 crc kubenswrapper[4939]: I0318 17:00:30.911354 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-t4ggw"] Mar 18 17:00:30 crc kubenswrapper[4939]: E0318 17:00:30.912081 4939 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bcbdd35a-4e99-4d2a-adf1-111ced1b750a" containerName="mariadb-account-create-update" Mar 18 17:00:30 crc kubenswrapper[4939]: I0318 17:00:30.912099 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcbdd35a-4e99-4d2a-adf1-111ced1b750a" containerName="mariadb-account-create-update" Mar 18 17:00:30 crc kubenswrapper[4939]: E0318 17:00:30.912125 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d6b9b7-dfb2-43a2-b271-521a40d3f8fd" containerName="oc" Mar 18 17:00:30 crc kubenswrapper[4939]: I0318 17:00:30.912133 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d6b9b7-dfb2-43a2-b271-521a40d3f8fd" containerName="oc" Mar 18 17:00:30 crc kubenswrapper[4939]: I0318 17:00:30.912336 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d6b9b7-dfb2-43a2-b271-521a40d3f8fd" containerName="oc" Mar 18 17:00:30 crc kubenswrapper[4939]: I0318 17:00:30.912364 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcbdd35a-4e99-4d2a-adf1-111ced1b750a" containerName="mariadb-account-create-update" Mar 18 17:00:30 crc kubenswrapper[4939]: I0318 17:00:30.913362 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:30 crc kubenswrapper[4939]: I0318 17:00:30.920731 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-t4ggw"] Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.005752 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-config\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.006052 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxxz\" (UniqueName: \"kubernetes.io/projected/b8031ced-8ac8-4318-87b1-2f22da0b05f1-kube-api-access-wpxxz\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.006258 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.108077 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxxz\" (UniqueName: \"kubernetes.io/projected/b8031ced-8ac8-4318-87b1-2f22da0b05f1-kube-api-access-wpxxz\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.108158 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.108198 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-config\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.109104 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-config\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.109248 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.129067 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxxz\" (UniqueName: \"kubernetes.io/projected/b8031ced-8ac8-4318-87b1-2f22da0b05f1-kube-api-access-wpxxz\") pod \"dnsmasq-dns-5b7946d7b9-t4ggw\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.233206 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.690485 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-t4ggw"] Mar 18 17:00:31 crc kubenswrapper[4939]: I0318 17:00:31.802029 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 17:00:32 crc kubenswrapper[4939]: I0318 17:00:32.611768 4939 generic.go:334] "Generic (PLEG): container finished" podID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerID="da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff" exitCode=0 Mar 18 17:00:32 crc kubenswrapper[4939]: I0318 17:00:32.611821 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" event={"ID":"b8031ced-8ac8-4318-87b1-2f22da0b05f1","Type":"ContainerDied","Data":"da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff"} Mar 18 17:00:32 crc kubenswrapper[4939]: I0318 17:00:32.612093 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" event={"ID":"b8031ced-8ac8-4318-87b1-2f22da0b05f1","Type":"ContainerStarted","Data":"782ca91aa2991a34c0dba0c2cff85dee1db31ae22263875f67a89c502b77990b"} Mar 18 17:00:32 crc kubenswrapper[4939]: I0318 17:00:32.836269 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 17:00:33 crc kubenswrapper[4939]: I0318 17:00:33.620388 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" event={"ID":"b8031ced-8ac8-4318-87b1-2f22da0b05f1","Type":"ContainerStarted","Data":"fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d"} Mar 18 17:00:33 crc kubenswrapper[4939]: I0318 17:00:33.620823 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:33 crc kubenswrapper[4939]: I0318 17:00:33.654054 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" podStartSLOduration=3.654025558 podStartE2EDuration="3.654025558s" podCreationTimestamp="2026-03-18 17:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:00:33.647315897 +0000 UTC m=+4998.246503528" watchObservedRunningTime="2026-03-18 17:00:33.654025558 +0000 UTC m=+4998.253213179" Mar 18 17:00:33 crc kubenswrapper[4939]: I0318 17:00:33.710611 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerName="rabbitmq" containerID="cri-o://2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1" gracePeriod=604799 Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.708606 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v78h2"] Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.710071 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.720692 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v78h2"] Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.819380 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerName="rabbitmq" containerID="cri-o://a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d" gracePeriod=604799 Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.864382 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-catalog-content\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.864434 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtb5\" (UniqueName: \"kubernetes.io/projected/90154295-728d-471c-a815-d34867ba02fb-kube-api-access-zvtb5\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.864579 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-utilities\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.965647 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-catalog-content\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.965708 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtb5\" (UniqueName: 
\"kubernetes.io/projected/90154295-728d-471c-a815-d34867ba02fb-kube-api-access-zvtb5\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.965810 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-utilities\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.966263 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-catalog-content\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.966315 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-utilities\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:34 crc kubenswrapper[4939]: I0318 17:00:34.985331 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtb5\" (UniqueName: \"kubernetes.io/projected/90154295-728d-471c-a815-d34867ba02fb-kube-api-access-zvtb5\") pod \"redhat-operators-v78h2\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:35 crc kubenswrapper[4939]: I0318 17:00:35.035096 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:35 crc kubenswrapper[4939]: I0318 17:00:35.251857 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v78h2"] Mar 18 17:00:35 crc kubenswrapper[4939]: I0318 17:00:35.638170 4939 generic.go:334] "Generic (PLEG): container finished" podID="90154295-728d-471c-a815-d34867ba02fb" containerID="f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07" exitCode=0 Mar 18 17:00:35 crc kubenswrapper[4939]: I0318 17:00:35.638222 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v78h2" event={"ID":"90154295-728d-471c-a815-d34867ba02fb","Type":"ContainerDied","Data":"f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07"} Mar 18 17:00:35 crc kubenswrapper[4939]: I0318 17:00:35.638263 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v78h2" event={"ID":"90154295-728d-471c-a815-d34867ba02fb","Type":"ContainerStarted","Data":"996d45a72dfab31f59313a8a0986c0d4ded85924af7ed65bc59c50dea54028c1"} Mar 18 17:00:35 crc kubenswrapper[4939]: I0318 17:00:35.640062 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:00:37 crc kubenswrapper[4939]: I0318 17:00:37.653441 4939 generic.go:334] "Generic (PLEG): container finished" podID="90154295-728d-471c-a815-d34867ba02fb" containerID="501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c" exitCode=0 Mar 18 17:00:37 crc kubenswrapper[4939]: I0318 17:00:37.653707 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v78h2" event={"ID":"90154295-728d-471c-a815-d34867ba02fb","Type":"ContainerDied","Data":"501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c"} Mar 18 17:00:37 crc kubenswrapper[4939]: I0318 17:00:37.841082 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.20:5672: connect: connection refused" Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.110491 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.21:5672: connect: connection refused" Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.667214 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v78h2" event={"ID":"90154295-728d-471c-a815-d34867ba02fb","Type":"ContainerStarted","Data":"7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce"} Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.686825 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.688016 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.689607 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lhwrp" Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.695872 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.697830 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v78h2" podStartSLOduration=2.262125877 podStartE2EDuration="4.697804243s" podCreationTimestamp="2026-03-18 17:00:34 +0000 UTC" firstStartedPulling="2026-03-18 17:00:35.639816405 +0000 UTC m=+5000.239004016" lastFinishedPulling="2026-03-18 17:00:38.075494761 +0000 UTC m=+5002.674682382" observedRunningTime="2026-03-18 17:00:38.688660383 +0000 UTC m=+5003.287848004" watchObservedRunningTime="2026-03-18 17:00:38.697804243 +0000 UTC m=+5003.296991864" Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.831900 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpwfd\" (UniqueName: \"kubernetes.io/projected/ee1b99f6-29aa-470b-9bc2-f0ef382d4eff-kube-api-access-mpwfd\") pod \"mariadb-client\" (UID: \"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff\") " pod="openstack/mariadb-client" Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.932970 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpwfd\" (UniqueName: \"kubernetes.io/projected/ee1b99f6-29aa-470b-9bc2-f0ef382d4eff-kube-api-access-mpwfd\") pod \"mariadb-client\" (UID: \"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff\") " pod="openstack/mariadb-client" Mar 18 17:00:38 crc kubenswrapper[4939]: I0318 17:00:38.951856 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpwfd\" (UniqueName: \"kubernetes.io/projected/ee1b99f6-29aa-470b-9bc2-f0ef382d4eff-kube-api-access-mpwfd\") pod \"mariadb-client\" (UID: \"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff\") " pod="openstack/mariadb-client" Mar 18 17:00:39 crc kubenswrapper[4939]: I0318 17:00:39.004950 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:00:39 crc kubenswrapper[4939]: I0318 17:00:39.513392 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:00:39 crc kubenswrapper[4939]: W0318 17:00:39.517382 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1b99f6_29aa_470b_9bc2_f0ef382d4eff.slice/crio-e7e272c3a054994d3c794767f8c79b8ebc95fa88b5eef8bc5cd0914abf5e1f04 WatchSource:0}: Error finding container e7e272c3a054994d3c794767f8c79b8ebc95fa88b5eef8bc5cd0914abf5e1f04: Status 404 returned error can't find the container with id e7e272c3a054994d3c794767f8c79b8ebc95fa88b5eef8bc5cd0914abf5e1f04 Mar 18 17:00:39 crc kubenswrapper[4939]: I0318 17:00:39.678707 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff","Type":"ContainerStarted","Data":"62cec8a322b7da53a06b2b760567fe94f901ca0eef832c7f500277e1d9fb4a4d"} Mar 18 17:00:39 crc kubenswrapper[4939]: I0318 17:00:39.678761 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff","Type":"ContainerStarted","Data":"e7e272c3a054994d3c794767f8c79b8ebc95fa88b5eef8bc5cd0914abf5e1f04"} Mar 18 17:00:39 crc kubenswrapper[4939]: I0318 17:00:39.691918 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.691877055 podStartE2EDuration="1.691877055s" podCreationTimestamp="2026-03-18 17:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:00:39.690784034 +0000 UTC m=+5004.289971675" watchObservedRunningTime="2026-03-18 17:00:39.691877055 +0000 UTC m=+5004.291064676" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.287294 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.363904 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-erlang-cookie\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364262 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364294 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0353a9a1-39a6-4993-ac6e-dce5eba1373f-pod-info\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364326 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-plugins\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364345 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc6jj\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-kube-api-access-cc6jj\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364403 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-plugins-conf\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364422 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0353a9a1-39a6-4993-ac6e-dce5eba1373f-erlang-cookie-secret\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364458 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-confd\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364545 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-server-conf\") pod \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\" (UID: \"0353a9a1-39a6-4993-ac6e-dce5eba1373f\") " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364765 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod 
"0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364900 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.364916 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.365213 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.372403 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0353a9a1-39a6-4993-ac6e-dce5eba1373f-pod-info" (OuterVolumeSpecName: "pod-info") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.373093 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353a9a1-39a6-4993-ac6e-dce5eba1373f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.373815 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-kube-api-access-cc6jj" (OuterVolumeSpecName: "kube-api-access-cc6jj") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "kube-api-access-cc6jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.391005 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-server-conf" (OuterVolumeSpecName: "server-conf") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.391227 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13" (OuterVolumeSpecName: "persistence") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). 
InnerVolumeSpecName "pvc-769407f2-5c79-4663-8608-aa43074dfd13". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.443315 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0353a9a1-39a6-4993-ac6e-dce5eba1373f" (UID: "0353a9a1-39a6-4993-ac6e-dce5eba1373f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.466723 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.466806 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") on node \"crc\" " Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.474679 4939 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0353a9a1-39a6-4993-ac6e-dce5eba1373f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.474736 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc6jj\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-kube-api-access-cc6jj\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.474749 4939 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.474758 4939 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0353a9a1-39a6-4993-ac6e-dce5eba1373f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.474768 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0353a9a1-39a6-4993-ac6e-dce5eba1373f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.474781 4939 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0353a9a1-39a6-4993-ac6e-dce5eba1373f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.485176 4939 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.485321 4939 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-769407f2-5c79-4663-8608-aa43074dfd13" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13") on node "crc" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.576139 4939 reconciler_common.go:293] "Volume detached for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.686675 4939 generic.go:334] "Generic (PLEG): container finished" podID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerID="2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1" exitCode=0 Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.686731 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0353a9a1-39a6-4993-ac6e-dce5eba1373f","Type":"ContainerDied","Data":"2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1"} Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.686761 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.686795 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0353a9a1-39a6-4993-ac6e-dce5eba1373f","Type":"ContainerDied","Data":"689c480cdabd0ff64befc721469e3e2aeb3be9badf3cc9b020befe5809dd07fa"} Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.686818 4939 scope.go:117] "RemoveContainer" containerID="2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.707439 4939 scope.go:117] "RemoveContainer" containerID="cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.723576 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.731612 4939 scope.go:117] "RemoveContainer" containerID="2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.731740 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 17:00:40 crc kubenswrapper[4939]: E0318 17:00:40.732137 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1\": container with ID starting with 2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1 not found: ID does not exist" containerID="2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.732185 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1"} err="failed to get container status \"2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1\": rpc error: code = NotFound desc = could not find container \"2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1\": container with ID starting with 2a380452b0a5732e82bfb463ebb59d10c1b921590ab375b3a79a5960697786e1 not found: ID does not exist" Mar 18 
17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.732220 4939 scope.go:117] "RemoveContainer" containerID="cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3" Mar 18 17:00:40 crc kubenswrapper[4939]: E0318 17:00:40.732815 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3\": container with ID starting with cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3 not found: ID does not exist" containerID="cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.732845 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3"} err="failed to get container status \"cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3\": rpc error: code = NotFound desc = could not find container \"cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3\": container with ID starting with cf978d39169ba13fee515ded5d685cbbde8ad799692f55de38df1923f90833c3 not found: ID does not exist" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.751715 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 17:00:40 crc kubenswrapper[4939]: E0318 17:00:40.752318 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerName="rabbitmq" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.752346 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerName="rabbitmq" Mar 18 17:00:40 crc kubenswrapper[4939]: E0318 17:00:40.752387 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerName="setup-container" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.752396 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerName="setup-container" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.752584 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" containerName="rabbitmq" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.753566 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.755309 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.755583 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fjptp" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.755606 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.755641 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.755800 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.771564 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883056 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgn6\" (UniqueName: \"kubernetes.io/projected/338caa35-af66-4082-bd19-e7b9d11e42de-kube-api-access-ddgn6\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883116 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883143 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/338caa35-af66-4082-bd19-e7b9d11e42de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883169 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883208 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/338caa35-af66-4082-bd19-e7b9d11e42de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/338caa35-af66-4082-bd19-e7b9d11e42de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883404 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883442 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.883477 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/338caa35-af66-4082-bd19-e7b9d11e42de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985453 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/338caa35-af66-4082-bd19-e7b9d11e42de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985566 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985611 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985641 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/338caa35-af66-4082-bd19-e7b9d11e42de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985717 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgn6\" (UniqueName: \"kubernetes.io/projected/338caa35-af66-4082-bd19-e7b9d11e42de-kube-api-access-ddgn6\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985747 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985772 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/338caa35-af66-4082-bd19-e7b9d11e42de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985799 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.985843 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/338caa35-af66-4082-bd19-e7b9d11e42de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.989670 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.990275 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/338caa35-af66-4082-bd19-e7b9d11e42de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.990297 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:40 crc kubenswrapper[4939]: I0318 17:00:40.991805 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/338caa35-af66-4082-bd19-e7b9d11e42de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.009406 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/338caa35-af66-4082-bd19-e7b9d11e42de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.009453 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/338caa35-af66-4082-bd19-e7b9d11e42de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.010020 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.010057 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6078a2acfc90786553dde1ebd9b0d0a94de325ff855324a9a16d94502664ff27/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.014945 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgn6\" (UniqueName: \"kubernetes.io/projected/338caa35-af66-4082-bd19-e7b9d11e42de-kube-api-access-ddgn6\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.037819 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/338caa35-af66-4082-bd19-e7b9d11e42de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.089131 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-769407f2-5c79-4663-8608-aa43074dfd13\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-769407f2-5c79-4663-8608-aa43074dfd13\") pod \"rabbitmq-server-0\" (UID: \"338caa35-af66-4082-bd19-e7b9d11e42de\") " pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.235690 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.254861 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.288393 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-fxzkl"] Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.288714 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerName="dnsmasq-dns" containerID="cri-o://93a35d0ed6476b2c59b6a8982d411ed7a96e7633345ee7a537dab200d353e489" gracePeriod=10 Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.380956 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.400984 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08684f42-6dc7-4e6d-8073-98c558c159b4-pod-info\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401035 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08684f42-6dc7-4e6d-8073-98c558c159b4-erlang-cookie-secret\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401065 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-plugins\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401111 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4wc7\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-kube-api-access-z4wc7\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401190 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401211 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-plugins-conf\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401230 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-server-conf\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401267 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-erlang-cookie\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.401292 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-confd\") pod \"08684f42-6dc7-4e6d-8073-98c558c159b4\" (UID: \"08684f42-6dc7-4e6d-8073-98c558c159b4\") " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.403189 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod 
"08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.404691 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.405098 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.406065 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/08684f42-6dc7-4e6d-8073-98c558c159b4-pod-info" (OuterVolumeSpecName: "pod-info") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.407642 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-kube-api-access-z4wc7" (OuterVolumeSpecName: "kube-api-access-z4wc7") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "kube-api-access-z4wc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.408020 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08684f42-6dc7-4e6d-8073-98c558c159b4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.422193 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e" (OuterVolumeSpecName: "persistence") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.446322 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-server-conf" (OuterVolumeSpecName: "server-conf") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.488700 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "08684f42-6dc7-4e6d-8073-98c558c159b4" (UID: "08684f42-6dc7-4e6d-8073-98c558c159b4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.503922 4939 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08684f42-6dc7-4e6d-8073-98c558c159b4-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.503944 4939 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08684f42-6dc7-4e6d-8073-98c558c159b4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.503956 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.503966 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4wc7\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-kube-api-access-z4wc7\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.503994 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") on node \"crc\" " Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.504013 4939 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.504022 4939 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08684f42-6dc7-4e6d-8073-98c558c159b4-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.504031 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.504042 4939 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08684f42-6dc7-4e6d-8073-98c558c159b4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.519435 4939 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.519686 4939 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e") on node "crc" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.605713 4939 reconciler_common.go:293] "Volume detached for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.696674 4939 generic.go:334] "Generic (PLEG): container finished" podID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerID="a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d" exitCode=0 Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.696736 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08684f42-6dc7-4e6d-8073-98c558c159b4","Type":"ContainerDied","Data":"a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d"} Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.696764 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08684f42-6dc7-4e6d-8073-98c558c159b4","Type":"ContainerDied","Data":"9ac5a4f6c3ab500b0872c29f72f1827fca46c565bda6de290dffd25b95fa8b83"} Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.696787 4939 scope.go:117] "RemoveContainer" containerID="a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.696930 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.705705 4939 generic.go:334] "Generic (PLEG): container finished" podID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerID="93a35d0ed6476b2c59b6a8982d411ed7a96e7633345ee7a537dab200d353e489" exitCode=0 Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.705749 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" event={"ID":"c3d05697-55c9-4f3b-af58-79afb3af5683","Type":"ContainerDied","Data":"93a35d0ed6476b2c59b6a8982d411ed7a96e7633345ee7a537dab200d353e489"} Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.751313 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.772354 4939 scope.go:117] "RemoveContainer" containerID="a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.780338 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.788293 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 17:00:41 crc kubenswrapper[4939]: E0318 17:00:41.788764 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerName="setup-container" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.788787 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerName="setup-container" Mar 18 17:00:41 crc kubenswrapper[4939]: E0318 17:00:41.788810 4939 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerName="rabbitmq" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.788819 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerName="rabbitmq" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.789010 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" containerName="rabbitmq" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.790307 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.792530 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5njsn" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.792697 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.792995 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.793163 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.793317 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.795908 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.853905 4939 scope.go:117] "RemoveContainer" containerID="a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d" Mar 18 17:00:41 crc kubenswrapper[4939]: E0318 17:00:41.859639 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d\": container with ID starting with a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d not found: ID does not exist" containerID="a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.859678 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d"} err="failed to get container status \"a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d\": rpc error: code = NotFound desc = could not find container \"a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d\": container with ID starting with a922f6fba42375e531840c135e7d677d065262fb75a14a84c979fd58bad6c56d not found: ID does not exist" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.859702 4939 scope.go:117] "RemoveContainer" containerID="a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809" Mar 18 17:00:41 crc kubenswrapper[4939]: E0318 17:00:41.863708 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809\": container with ID starting with a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809 not found: ID does not exist" 
containerID="a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.863751 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809"} err="failed to get container status \"a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809\": rpc error: code = NotFound desc = could not find container \"a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809\": container with ID starting with a3c431df0944c5622dc414a73421e1b630e328c96353daa846897c5ebd17e809 not found: ID does not exist" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911003 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911059 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911085 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911108 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52k2\" (UniqueName: \"kubernetes.io/projected/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-kube-api-access-n52k2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911171 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911189 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911227 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc 
kubenswrapper[4939]: I0318 17:00:41.911270 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.911285 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:41 crc kubenswrapper[4939]: I0318 17:00:41.935759 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012038 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-dns-svc\") pod \"c3d05697-55c9-4f3b-af58-79afb3af5683\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012169 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58sv9\" (UniqueName: \"kubernetes.io/projected/c3d05697-55c9-4f3b-af58-79afb3af5683-kube-api-access-58sv9\") pod \"c3d05697-55c9-4f3b-af58-79afb3af5683\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012216 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-config\") pod \"c3d05697-55c9-4f3b-af58-79afb3af5683\" (UID: \"c3d05697-55c9-4f3b-af58-79afb3af5683\") " Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012430 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012470 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012548 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012581 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 
crc kubenswrapper[4939]: I0318 17:00:42.012601 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012622 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n52k2\" (UniqueName: \"kubernetes.io/projected/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-kube-api-access-n52k2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012650 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012668 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.012708 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.013163 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.013667 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.013715 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.014518 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.015699 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.015740 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f58582c94789db6d268fd2ad4cad73469dce675c14a8bd7b9504d878706640d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.017343 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.017574 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.017825 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d05697-55c9-4f3b-af58-79afb3af5683-kube-api-access-58sv9" (OuterVolumeSpecName: "kube-api-access-58sv9") pod "c3d05697-55c9-4f3b-af58-79afb3af5683" (UID: "c3d05697-55c9-4f3b-af58-79afb3af5683"). InnerVolumeSpecName "kube-api-access-58sv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.032440 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.040941 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52k2\" (UniqueName: \"kubernetes.io/projected/e1594dd1-4c71-4f2d-9a6e-658ed43d121f-kube-api-access-n52k2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.041899 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 17:00:42 crc kubenswrapper[4939]: W0318 17:00:42.048075 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod338caa35_af66_4082_bd19_e7b9d11e42de.slice/crio-880536ef58022c10ce202217ef4b9670f485f49fa7fb28457b6300697397e676 WatchSource:0}: Error finding container 880536ef58022c10ce202217ef4b9670f485f49fa7fb28457b6300697397e676: Status 404 returned error can't find the container with id 880536ef58022c10ce202217ef4b9670f485f49fa7fb28457b6300697397e676 Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.054724 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-config" (OuterVolumeSpecName: "config") pod "c3d05697-55c9-4f3b-af58-79afb3af5683" (UID: "c3d05697-55c9-4f3b-af58-79afb3af5683"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.056177 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3d05697-55c9-4f3b-af58-79afb3af5683" (UID: "c3d05697-55c9-4f3b-af58-79afb3af5683"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.077007 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caba9-b8d9-4dbe-9a1e-4a1e84e3541e\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1594dd1-4c71-4f2d-9a6e-658ed43d121f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.114569 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.115030 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58sv9\" (UniqueName: \"kubernetes.io/projected/c3d05697-55c9-4f3b-af58-79afb3af5683-kube-api-access-58sv9\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.115079 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d05697-55c9-4f3b-af58-79afb3af5683-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.155184 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0353a9a1-39a6-4993-ac6e-dce5eba1373f" path="/var/lib/kubelet/pods/0353a9a1-39a6-4993-ac6e-dce5eba1373f/volumes" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.156717 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08684f42-6dc7-4e6d-8073-98c558c159b4" path="/var/lib/kubelet/pods/08684f42-6dc7-4e6d-8073-98c558c159b4/volumes" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.225783 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.713398 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"338caa35-af66-4082-bd19-e7b9d11e42de","Type":"ContainerStarted","Data":"880536ef58022c10ce202217ef4b9670f485f49fa7fb28457b6300697397e676"} Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.715727 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" event={"ID":"c3d05697-55c9-4f3b-af58-79afb3af5683","Type":"ContainerDied","Data":"86409aa3786374bebda5ce7104bace04a938fed6614cdcc95fe7e0119af8a95e"} Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.715767 4939 scope.go:117] "RemoveContainer" containerID="93a35d0ed6476b2c59b6a8982d411ed7a96e7633345ee7a537dab200d353e489" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.715852 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.738993 4939 scope.go:117] "RemoveContainer" containerID="89b08e529ab7f1fb2239ef3a522cebf6ef8f3d11aa6a7e4722a2364b7baaaecf" Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.747295 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-fxzkl"] Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.759871 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-fxzkl"] Mar 18 17:00:42 crc kubenswrapper[4939]: I0318 17:00:42.855181 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 17:00:42 crc kubenswrapper[4939]: W0318 17:00:42.863455 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1594dd1_4c71_4f2d_9a6e_658ed43d121f.slice/crio-2120e97d16d754e990aadafeec0074b710c7ec5224b3b09cb1eef712ca7ba1e4 WatchSource:0}: Error finding container 2120e97d16d754e990aadafeec0074b710c7ec5224b3b09cb1eef712ca7ba1e4: Status 404 returned error can't find the container with id 2120e97d16d754e990aadafeec0074b710c7ec5224b3b09cb1eef712ca7ba1e4 Mar 18 17:00:43 crc kubenswrapper[4939]: I0318 17:00:43.728683 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"338caa35-af66-4082-bd19-e7b9d11e42de","Type":"ContainerStarted","Data":"3847c46a6d92389ed93a8389a38f09890f72054ddce02d41ec26dcbacf5dd138"} Mar 18 17:00:43 crc kubenswrapper[4939]: I0318 17:00:43.730840 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1594dd1-4c71-4f2d-9a6e-658ed43d121f","Type":"ContainerStarted","Data":"2120e97d16d754e990aadafeec0074b710c7ec5224b3b09cb1eef712ca7ba1e4"} Mar 18 17:00:44 crc kubenswrapper[4939]: I0318 17:00:44.143627 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" path="/var/lib/kubelet/pods/c3d05697-55c9-4f3b-af58-79afb3af5683/volumes" Mar 18 17:00:44 crc kubenswrapper[4939]: I0318 17:00:44.739667 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1594dd1-4c71-4f2d-9a6e-658ed43d121f","Type":"ContainerStarted","Data":"531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f"} Mar 18 17:00:45 crc kubenswrapper[4939]: I0318 17:00:45.036267 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:45 crc kubenswrapper[4939]: I0318 17:00:45.036362 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:46 crc kubenswrapper[4939]: I0318 17:00:46.077873 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v78h2" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="registry-server" probeResult="failure" output=< Mar 18 17:00:46 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:00:46 crc kubenswrapper[4939]: > Mar 18 17:00:46 crc kubenswrapper[4939]: I0318 17:00:46.927598 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-98ddfc8f-fxzkl" podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.19:5353: i/o timeout" Mar 18 17:00:53 
crc kubenswrapper[4939]: I0318 17:00:53.687804 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:00:53 crc kubenswrapper[4939]: I0318 17:00:53.688353 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:00:54 crc kubenswrapper[4939]: I0318 17:00:54.607051 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:00:54 crc kubenswrapper[4939]: I0318 17:00:54.607611 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="ee1b99f6-29aa-470b-9bc2-f0ef382d4eff" containerName="mariadb-client" containerID="cri-o://62cec8a322b7da53a06b2b760567fe94f901ca0eef832c7f500277e1d9fb4a4d" gracePeriod=30 Mar 18 17:00:54 crc kubenswrapper[4939]: I0318 17:00:54.823761 4939 generic.go:334] "Generic (PLEG): container finished" podID="ee1b99f6-29aa-470b-9bc2-f0ef382d4eff" containerID="62cec8a322b7da53a06b2b760567fe94f901ca0eef832c7f500277e1d9fb4a4d" exitCode=143 Mar 18 17:00:54 crc kubenswrapper[4939]: I0318 17:00:54.823805 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff","Type":"ContainerDied","Data":"62cec8a322b7da53a06b2b760567fe94f901ca0eef832c7f500277e1d9fb4a4d"} Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.088566 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.089964 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.149396 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.199759 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpwfd\" (UniqueName: \"kubernetes.io/projected/ee1b99f6-29aa-470b-9bc2-f0ef382d4eff-kube-api-access-mpwfd\") pod \"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff\" (UID: \"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff\") " Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.205946 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1b99f6-29aa-470b-9bc2-f0ef382d4eff-kube-api-access-mpwfd" (OuterVolumeSpecName: "kube-api-access-mpwfd") pod "ee1b99f6-29aa-470b-9bc2-f0ef382d4eff" (UID: "ee1b99f6-29aa-470b-9bc2-f0ef382d4eff"). InnerVolumeSpecName "kube-api-access-mpwfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.301785 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpwfd\" (UniqueName: \"kubernetes.io/projected/ee1b99f6-29aa-470b-9bc2-f0ef382d4eff-kube-api-access-mpwfd\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.350934 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v78h2"] Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.830160 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ee1b99f6-29aa-470b-9bc2-f0ef382d4eff","Type":"ContainerDied","Data":"e7e272c3a054994d3c794767f8c79b8ebc95fa88b5eef8bc5cd0914abf5e1f04"} Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.830201 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.830247 4939 scope.go:117] "RemoveContainer" containerID="62cec8a322b7da53a06b2b760567fe94f901ca0eef832c7f500277e1d9fb4a4d" Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.862918 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:00:55 crc kubenswrapper[4939]: I0318 17:00:55.869694 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:00:56 crc kubenswrapper[4939]: I0318 17:00:56.142187 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1b99f6-29aa-470b-9bc2-f0ef382d4eff" path="/var/lib/kubelet/pods/ee1b99f6-29aa-470b-9bc2-f0ef382d4eff/volumes" Mar 18 17:00:56 crc kubenswrapper[4939]: I0318 17:00:56.838410 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v78h2" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="registry-server" containerID="cri-o://7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce" gracePeriod=2 Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.747598 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.847547 4939 generic.go:334] "Generic (PLEG): container finished" podID="90154295-728d-471c-a815-d34867ba02fb" containerID="7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce" exitCode=0 Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.847585 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v78h2" event={"ID":"90154295-728d-471c-a815-d34867ba02fb","Type":"ContainerDied","Data":"7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce"} Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.847608 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v78h2" event={"ID":"90154295-728d-471c-a815-d34867ba02fb","Type":"ContainerDied","Data":"996d45a72dfab31f59313a8a0986c0d4ded85924af7ed65bc59c50dea54028c1"} Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.847624 4939 scope.go:117] "RemoveContainer" containerID="7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.847714 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v78h2" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.863839 4939 scope.go:117] "RemoveContainer" containerID="501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.883990 4939 scope.go:117] "RemoveContainer" containerID="f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.909730 4939 scope.go:117] "RemoveContainer" containerID="7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce" Mar 18 17:00:57 crc kubenswrapper[4939]: E0318 17:00:57.910165 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce\": container with ID starting with 7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce not found: ID does not exist" containerID="7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.910224 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce"} err="failed to get container status \"7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce\": rpc error: code = NotFound desc = could not find container \"7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce\": container with ID starting with 7b19a011510e993a09db5d67146a4baf9ec14e727a43cfb6bba65b3c256a18ce not found: ID does not exist" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.910264 4939 scope.go:117] "RemoveContainer" containerID="501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c" Mar 18 17:00:57 crc kubenswrapper[4939]: E0318 17:00:57.910723 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c\": container with ID starting with 501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c not found: ID does not exist" containerID="501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.910759 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c"} err="failed to get container status \"501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c\": rpc error: code = NotFound desc = could not find container \"501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c\": container with ID starting with 501b232401b0288fdaa7362d6e92babeac08963abcb09c495c50251cc4cf033c not found: ID does not exist" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.910780 4939 scope.go:117] "RemoveContainer" containerID="f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07" Mar 18 17:00:57 crc kubenswrapper[4939]: E0318 17:00:57.911112 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07\": container with ID starting with f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07 not found: ID does not exist" containerID="f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07" 
Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.911166 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07"} err="failed to get container status \"f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07\": rpc error: code = NotFound desc = could not find container \"f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07\": container with ID starting with f6657cd2adf1d0cff92adea87939df25eef1067bb045ddfff6875b16fd2ecf07 not found: ID does not exist" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.940758 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-utilities\") pod \"90154295-728d-471c-a815-d34867ba02fb\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.940827 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-catalog-content\") pod \"90154295-728d-471c-a815-d34867ba02fb\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.940930 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvtb5\" (UniqueName: \"kubernetes.io/projected/90154295-728d-471c-a815-d34867ba02fb-kube-api-access-zvtb5\") pod \"90154295-728d-471c-a815-d34867ba02fb\" (UID: \"90154295-728d-471c-a815-d34867ba02fb\") " Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.941633 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-utilities" (OuterVolumeSpecName: "utilities") pod "90154295-728d-471c-a815-d34867ba02fb" (UID: "90154295-728d-471c-a815-d34867ba02fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:00:57 crc kubenswrapper[4939]: I0318 17:00:57.946085 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90154295-728d-471c-a815-d34867ba02fb-kube-api-access-zvtb5" (OuterVolumeSpecName: "kube-api-access-zvtb5") pod "90154295-728d-471c-a815-d34867ba02fb" (UID: "90154295-728d-471c-a815-d34867ba02fb"). InnerVolumeSpecName "kube-api-access-zvtb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:58 crc kubenswrapper[4939]: I0318 17:00:58.043189 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvtb5\" (UniqueName: \"kubernetes.io/projected/90154295-728d-471c-a815-d34867ba02fb-kube-api-access-zvtb5\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:58 crc kubenswrapper[4939]: I0318 17:00:58.043220 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:58 crc kubenswrapper[4939]: I0318 17:00:58.073425 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90154295-728d-471c-a815-d34867ba02fb" (UID: "90154295-728d-471c-a815-d34867ba02fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:00:58 crc kubenswrapper[4939]: I0318 17:00:58.144156 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90154295-728d-471c-a815-d34867ba02fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:58 crc kubenswrapper[4939]: I0318 17:00:58.177619 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v78h2"] Mar 18 17:00:58 crc kubenswrapper[4939]: I0318 17:00:58.184995 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v78h2"] Mar 18 17:01:00 crc kubenswrapper[4939]: I0318 17:01:00.035566 4939 scope.go:117] "RemoveContainer" containerID="1e2c55e7cc1e982c0c6586f59c782e5736028a183a4b700ff0c424646dc97d6d" Mar 18 17:01:00 crc kubenswrapper[4939]: I0318 17:01:00.071544 4939 scope.go:117] "RemoveContainer" containerID="24763a9dfc06050bd51c0531da4c66d785c619f72e793717a9240f4c49866986" Mar 18 17:01:00 crc kubenswrapper[4939]: I0318 17:01:00.141638 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90154295-728d-471c-a815-d34867ba02fb" path="/var/lib/kubelet/pods/90154295-728d-471c-a815-d34867ba02fb/volumes" Mar 18 17:01:15 crc kubenswrapper[4939]: I0318 17:01:15.982333 4939 generic.go:334] "Generic (PLEG): container finished" podID="338caa35-af66-4082-bd19-e7b9d11e42de" containerID="3847c46a6d92389ed93a8389a38f09890f72054ddce02d41ec26dcbacf5dd138" exitCode=0 Mar 18 17:01:15 crc kubenswrapper[4939]: I0318 17:01:15.982484 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"338caa35-af66-4082-bd19-e7b9d11e42de","Type":"ContainerDied","Data":"3847c46a6d92389ed93a8389a38f09890f72054ddce02d41ec26dcbacf5dd138"} Mar 18 17:01:16 crc kubenswrapper[4939]: I0318 17:01:16.995303 4939 generic.go:334] "Generic (PLEG): container finished" podID="e1594dd1-4c71-4f2d-9a6e-658ed43d121f" containerID="531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f" exitCode=0 Mar 18 17:01:16 crc kubenswrapper[4939]: I0318 17:01:16.995403 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1594dd1-4c71-4f2d-9a6e-658ed43d121f","Type":"ContainerDied","Data":"531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f"} Mar 18 17:01:16 crc kubenswrapper[4939]: I0318 17:01:16.999120 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"338caa35-af66-4082-bd19-e7b9d11e42de","Type":"ContainerStarted","Data":"2ff99abc3af0376822a16a03d32f79387fe5096f64e6d523526e8f9c0deded71"} Mar 18 17:01:16 crc kubenswrapper[4939]: I0318 17:01:16.999712 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 17:01:18 crc kubenswrapper[4939]: I0318 17:01:18.020854 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1594dd1-4c71-4f2d-9a6e-658ed43d121f","Type":"ContainerStarted","Data":"1635e11e577daa0b27fffc33a31c9574965e3e1934f4746aeb41472a54a92bd2"} Mar 18 17:01:18 crc kubenswrapper[4939]: I0318 17:01:18.021681 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:01:18 crc kubenswrapper[4939]: I0318 17:01:18.052069 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.052042538 
podStartE2EDuration="37.052042538s" podCreationTimestamp="2026-03-18 17:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:01:18.050900355 +0000 UTC m=+5042.650087976" watchObservedRunningTime="2026-03-18 17:01:18.052042538 +0000 UTC m=+5042.651230159" Mar 18 17:01:18 crc kubenswrapper[4939]: I0318 17:01:18.054051 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.054041414 podStartE2EDuration="38.054041414s" podCreationTimestamp="2026-03-18 17:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:01:17.071966403 +0000 UTC m=+5041.671154034" watchObservedRunningTime="2026-03-18 17:01:18.054041414 +0000 UTC m=+5042.653229035" Mar 18 17:01:22 crc kubenswrapper[4939]: E0318 17:01:22.929068 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1594dd1_4c71_4f2d_9a6e_658ed43d121f.slice/crio-531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:01:23 crc kubenswrapper[4939]: I0318 17:01:23.687705 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:01:23 crc kubenswrapper[4939]: I0318 17:01:23.688180 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:01:23 crc kubenswrapper[4939]: I0318 17:01:23.688257 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:01:23 crc kubenswrapper[4939]: I0318 17:01:23.689318 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9378b7f2afbee1d788edac685c866cfb5d75c87de68ab0c8371f674ebdf00d5"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:01:23 crc kubenswrapper[4939]: I0318 17:01:23.689450 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://f9378b7f2afbee1d788edac685c866cfb5d75c87de68ab0c8371f674ebdf00d5" gracePeriod=600 Mar 18 17:01:24 crc kubenswrapper[4939]: I0318 17:01:24.066900 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="f9378b7f2afbee1d788edac685c866cfb5d75c87de68ab0c8371f674ebdf00d5" exitCode=0 Mar 18 17:01:24 crc kubenswrapper[4939]: I0318 17:01:24.066986 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" 
event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"f9378b7f2afbee1d788edac685c866cfb5d75c87de68ab0c8371f674ebdf00d5"} Mar 18 17:01:24 crc kubenswrapper[4939]: I0318 17:01:24.067314 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"} Mar 18 17:01:24 crc kubenswrapper[4939]: I0318 17:01:24.067335 4939 scope.go:117] "RemoveContainer" containerID="8eb3d811bb583641c57cbe8f04513a05f15e751cb60dfe0b0dab5f9c20cd12be" Mar 18 17:01:31 crc kubenswrapper[4939]: I0318 17:01:31.383716 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 17:01:32 crc kubenswrapper[4939]: I0318 17:01:32.229829 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 17:01:33 crc kubenswrapper[4939]: E0318 17:01:33.101461 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1594dd1_4c71_4f2d_9a6e_658ed43d121f.slice/crio-531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:01:43 crc kubenswrapper[4939]: E0318 17:01:43.284354 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1594dd1_4c71_4f2d_9a6e_658ed43d121f.slice/crio-531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:01:53 crc kubenswrapper[4939]: E0318 17:01:53.443183 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1594dd1_4c71_4f2d_9a6e_658ed43d121f.slice/crio-531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.152684 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564222-bg4gf"] Mar 18 17:02:00 crc kubenswrapper[4939]: E0318 17:02:00.160966 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="extract-utilities" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.161124 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="extract-utilities" Mar 18 17:02:00 crc kubenswrapper[4939]: E0318 17:02:00.161252 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="extract-content" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.161333 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="extract-content" Mar 18 17:02:00 crc kubenswrapper[4939]: E0318 17:02:00.161409 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerName="dnsmasq-dns" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.161889 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerName="dnsmasq-dns" Mar 18 17:02:00 crc kubenswrapper[4939]: E0318 17:02:00.170853 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1b99f6-29aa-470b-9bc2-f0ef382d4eff" containerName="mariadb-client" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.171259 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1b99f6-29aa-470b-9bc2-f0ef382d4eff" containerName="mariadb-client" Mar 18 17:02:00 crc kubenswrapper[4939]: E0318 17:02:00.171405 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="registry-server" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.171531 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="registry-server" Mar 18 17:02:00 crc kubenswrapper[4939]: E0318 17:02:00.171998 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerName="init" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.172024 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerName="init" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.172854 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="90154295-728d-471c-a815-d34867ba02fb" containerName="registry-server" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.172916 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d05697-55c9-4f3b-af58-79afb3af5683" containerName="dnsmasq-dns" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.172948 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1b99f6-29aa-470b-9bc2-f0ef382d4eff" containerName="mariadb-client" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.174184 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-bg4gf"] Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.174487 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-bg4gf" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.177208 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.178355 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.178754 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.234411 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh4s8\" (UniqueName: \"kubernetes.io/projected/2b85d9ae-d037-4350-891f-679d3c14976c-kube-api-access-zh4s8\") pod \"auto-csr-approver-29564222-bg4gf\" (UID: \"2b85d9ae-d037-4350-891f-679d3c14976c\") " pod="openshift-infra/auto-csr-approver-29564222-bg4gf" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.336505 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh4s8\" (UniqueName: \"kubernetes.io/projected/2b85d9ae-d037-4350-891f-679d3c14976c-kube-api-access-zh4s8\") pod \"auto-csr-approver-29564222-bg4gf\" (UID: \"2b85d9ae-d037-4350-891f-679d3c14976c\") " pod="openshift-infra/auto-csr-approver-29564222-bg4gf" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.356750 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh4s8\" (UniqueName: \"kubernetes.io/projected/2b85d9ae-d037-4350-891f-679d3c14976c-kube-api-access-zh4s8\") pod \"auto-csr-approver-29564222-bg4gf\" (UID: \"2b85d9ae-d037-4350-891f-679d3c14976c\") " pod="openshift-infra/auto-csr-approver-29564222-bg4gf" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.502439 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-bg4gf" Mar 18 17:02:00 crc kubenswrapper[4939]: I0318 17:02:00.918283 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-bg4gf"] Mar 18 17:02:01 crc kubenswrapper[4939]: I0318 17:02:01.367316 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564222-bg4gf" event={"ID":"2b85d9ae-d037-4350-891f-679d3c14976c","Type":"ContainerStarted","Data":"e432d27e5fc61fd0af7f4cd9887367e823edd575b434ce5306fec0f2f995c483"} Mar 18 17:02:03 crc kubenswrapper[4939]: I0318 17:02:03.383618 4939 generic.go:334] "Generic (PLEG): container finished" podID="2b85d9ae-d037-4350-891f-679d3c14976c" containerID="9c9e1a136379cebac60edec2b07b8959fc2de2c7897b29c909c1c5dbce87bd97" exitCode=0 Mar 18 17:02:03 crc kubenswrapper[4939]: I0318 17:02:03.383709 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564222-bg4gf" event={"ID":"2b85d9ae-d037-4350-891f-679d3c14976c","Type":"ContainerDied","Data":"9c9e1a136379cebac60edec2b07b8959fc2de2c7897b29c909c1c5dbce87bd97"} Mar 18 17:02:03 crc kubenswrapper[4939]: E0318 17:02:03.609156 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1594dd1_4c71_4f2d_9a6e_658ed43d121f.slice/crio-531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:02:04 crc kubenswrapper[4939]: I0318 17:02:04.657337 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-bg4gf" Mar 18 17:02:04 crc kubenswrapper[4939]: I0318 17:02:04.804034 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh4s8\" (UniqueName: \"kubernetes.io/projected/2b85d9ae-d037-4350-891f-679d3c14976c-kube-api-access-zh4s8\") pod \"2b85d9ae-d037-4350-891f-679d3c14976c\" (UID: \"2b85d9ae-d037-4350-891f-679d3c14976c\") " Mar 18 17:02:04 crc kubenswrapper[4939]: I0318 17:02:04.810785 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b85d9ae-d037-4350-891f-679d3c14976c-kube-api-access-zh4s8" (OuterVolumeSpecName: "kube-api-access-zh4s8") pod "2b85d9ae-d037-4350-891f-679d3c14976c" (UID: "2b85d9ae-d037-4350-891f-679d3c14976c"). InnerVolumeSpecName "kube-api-access-zh4s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:02:04 crc kubenswrapper[4939]: I0318 17:02:04.906206 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh4s8\" (UniqueName: \"kubernetes.io/projected/2b85d9ae-d037-4350-891f-679d3c14976c-kube-api-access-zh4s8\") on node \"crc\" DevicePath \"\"" Mar 18 17:02:05 crc kubenswrapper[4939]: I0318 17:02:05.400129 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564222-bg4gf" event={"ID":"2b85d9ae-d037-4350-891f-679d3c14976c","Type":"ContainerDied","Data":"e432d27e5fc61fd0af7f4cd9887367e823edd575b434ce5306fec0f2f995c483"} Mar 18 17:02:05 crc kubenswrapper[4939]: I0318 17:02:05.400476 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e432d27e5fc61fd0af7f4cd9887367e823edd575b434ce5306fec0f2f995c483" Mar 18 17:02:05 crc kubenswrapper[4939]: I0318 17:02:05.400206 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-bg4gf" Mar 18 17:02:05 crc kubenswrapper[4939]: I0318 17:02:05.730063 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-5pfct"] Mar 18 17:02:05 crc kubenswrapper[4939]: I0318 17:02:05.737906 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-5pfct"] Mar 18 17:02:06 crc kubenswrapper[4939]: I0318 17:02:06.143139 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a1a929-ec59-4086-aa65-2463bb2737f6" path="/var/lib/kubelet/pods/d9a1a929-ec59-4086-aa65-2463bb2737f6/volumes" Mar 18 17:02:13 crc kubenswrapper[4939]: E0318 17:02:13.790053 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1594dd1_4c71_4f2d_9a6e_658ed43d121f.slice/crio-531e107613928043060f0e4cbfc5b437a11173221c9c7e0bb3f995eb7bbe1d0f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:03:00 crc kubenswrapper[4939]: I0318 17:03:00.207133 4939 scope.go:117] "RemoveContainer" containerID="5d6464871921ab0029547e84e26c17e0bd6ade9dd7a11453edb46747ba7a8e76" Mar 18 17:03:00 crc kubenswrapper[4939]: I0318 17:03:00.246768 4939 scope.go:117] "RemoveContainer" containerID="233f294246630890dae0d42b33a0b5295fa58cf9e1ced9dc22bdb392c649c8fe" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.691775 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9q8r8"] Mar 18 17:03:12 crc kubenswrapper[4939]: E0318 17:03:12.705245 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b85d9ae-d037-4350-891f-679d3c14976c" containerName="oc" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.705533 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b85d9ae-d037-4350-891f-679d3c14976c" containerName="oc" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.705787 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b85d9ae-d037-4350-891f-679d3c14976c" containerName="oc" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.707185 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q8r8"] Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.707417 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.836948 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-utilities\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.837027 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-catalog-content\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.837074 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8skkk\" (UniqueName: \"kubernetes.io/projected/02df0286-af33-43e5-ab78-b99f88ecdf28-kube-api-access-8skkk\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.938721 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-utilities\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.939096 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-catalog-content\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.939244 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8skkk\" (UniqueName: \"kubernetes.io/projected/02df0286-af33-43e5-ab78-b99f88ecdf28-kube-api-access-8skkk\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.939422 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-utilities\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.939422 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-catalog-content\") pod \"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:12 crc kubenswrapper[4939]: I0318 17:03:12.974496 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8skkk\" (UniqueName: \"kubernetes.io/projected/02df0286-af33-43e5-ab78-b99f88ecdf28-kube-api-access-8skkk\") pod 
\"redhat-marketplace-9q8r8\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:13 crc kubenswrapper[4939]: I0318 17:03:13.042349 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:13 crc kubenswrapper[4939]: I0318 17:03:13.484855 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q8r8"] Mar 18 17:03:13 crc kubenswrapper[4939]: I0318 17:03:13.975620 4939 generic.go:334] "Generic (PLEG): container finished" podID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerID="ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535" exitCode=0 Mar 18 17:03:13 crc kubenswrapper[4939]: I0318 17:03:13.975690 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q8r8" event={"ID":"02df0286-af33-43e5-ab78-b99f88ecdf28","Type":"ContainerDied","Data":"ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535"} Mar 18 17:03:13 crc kubenswrapper[4939]: I0318 17:03:13.975881 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q8r8" event={"ID":"02df0286-af33-43e5-ab78-b99f88ecdf28","Type":"ContainerStarted","Data":"50f3f15887b8875064443a5a6afafec01a7a0b74c1883c913784f2847c205e3a"} Mar 18 17:03:14 crc kubenswrapper[4939]: I0318 17:03:14.985461 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q8r8" event={"ID":"02df0286-af33-43e5-ab78-b99f88ecdf28","Type":"ContainerStarted","Data":"83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f"} Mar 18 17:03:15 crc kubenswrapper[4939]: I0318 17:03:15.995324 4939 generic.go:334] "Generic (PLEG): container finished" podID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerID="83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f" exitCode=0 Mar 18 17:03:15 crc kubenswrapper[4939]: I0318 17:03:15.995385 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q8r8" event={"ID":"02df0286-af33-43e5-ab78-b99f88ecdf28","Type":"ContainerDied","Data":"83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f"} Mar 18 17:03:17 crc kubenswrapper[4939]: I0318 17:03:17.022096 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q8r8" event={"ID":"02df0286-af33-43e5-ab78-b99f88ecdf28","Type":"ContainerStarted","Data":"026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e"} Mar 18 17:03:17 crc kubenswrapper[4939]: I0318 17:03:17.046462 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9q8r8" podStartSLOduration=2.5788989940000002 podStartE2EDuration="5.046443484s" podCreationTimestamp="2026-03-18 17:03:12 +0000 UTC" firstStartedPulling="2026-03-18 17:03:13.977691721 +0000 UTC m=+5158.576879342" lastFinishedPulling="2026-03-18 17:03:16.445236201 +0000 UTC m=+5161.044423832" observedRunningTime="2026-03-18 17:03:17.0396252 +0000 UTC m=+5161.638812851" watchObservedRunningTime="2026-03-18 17:03:17.046443484 +0000 UTC m=+5161.645631105" Mar 18 17:03:23 crc kubenswrapper[4939]: I0318 17:03:23.043292 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:23 crc kubenswrapper[4939]: I0318 17:03:23.043889 4939 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:23 crc kubenswrapper[4939]: I0318 17:03:23.098176 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:23 crc kubenswrapper[4939]: I0318 17:03:23.163563 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:23 crc kubenswrapper[4939]: I0318 17:03:23.343224 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q8r8"] Mar 18 17:03:23 crc kubenswrapper[4939]: I0318 17:03:23.687781 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:03:23 crc kubenswrapper[4939]: I0318 17:03:23.688122 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:03:25 crc kubenswrapper[4939]: I0318 17:03:25.091705 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9q8r8" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerName="registry-server" containerID="cri-o://026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e" gracePeriod=2 Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.022753 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.102393 4939 generic.go:334] "Generic (PLEG): container finished" podID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerID="026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e" exitCode=0 Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.102544 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q8r8" event={"ID":"02df0286-af33-43e5-ab78-b99f88ecdf28","Type":"ContainerDied","Data":"026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e"} Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.102602 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q8r8" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.102605 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q8r8" event={"ID":"02df0286-af33-43e5-ab78-b99f88ecdf28","Type":"ContainerDied","Data":"50f3f15887b8875064443a5a6afafec01a7a0b74c1883c913784f2847c205e3a"} Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.102714 4939 scope.go:117] "RemoveContainer" containerID="026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.125573 4939 scope.go:117] "RemoveContainer" containerID="83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.146488 4939 scope.go:117] "RemoveContainer" containerID="ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.166002 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8skkk\" (UniqueName: \"kubernetes.io/projected/02df0286-af33-43e5-ab78-b99f88ecdf28-kube-api-access-8skkk\") pod \"02df0286-af33-43e5-ab78-b99f88ecdf28\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.166250 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-utilities\") pod \"02df0286-af33-43e5-ab78-b99f88ecdf28\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.166466 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-catalog-content\") pod \"02df0286-af33-43e5-ab78-b99f88ecdf28\" (UID: \"02df0286-af33-43e5-ab78-b99f88ecdf28\") " Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.167397 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-utilities" (OuterVolumeSpecName: "utilities") pod "02df0286-af33-43e5-ab78-b99f88ecdf28" (UID: "02df0286-af33-43e5-ab78-b99f88ecdf28"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.174419 4939 scope.go:117] "RemoveContainer" containerID="026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e" Mar 18 17:03:26 crc kubenswrapper[4939]: E0318 17:03:26.175329 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e\": container with ID starting with 026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e not found: ID does not exist" containerID="026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.175382 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e"} err="failed to get container status \"026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e\": rpc error: code = NotFound desc = could not find container \"026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e\": container with ID starting with 026191f8737dff748b6646c30a9441018d66e4ee8fdaadb413d6984efde5a66e not found: ID does not exist" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.175409 4939 scope.go:117] "RemoveContainer" containerID="83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.175729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02df0286-af33-43e5-ab78-b99f88ecdf28-kube-api-access-8skkk" (OuterVolumeSpecName: "kube-api-access-8skkk") pod "02df0286-af33-43e5-ab78-b99f88ecdf28" (UID: "02df0286-af33-43e5-ab78-b99f88ecdf28"). InnerVolumeSpecName "kube-api-access-8skkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:03:26 crc kubenswrapper[4939]: E0318 17:03:26.175833 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f\": container with ID starting with 83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f not found: ID does not exist" containerID="83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.175888 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f"} err="failed to get container status \"83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f\": rpc error: code = NotFound desc = could not find container \"83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f\": container with ID starting with 83f584c0c8276bdd728a85658ff66c50f604475d13af26f9d3af9cc9b958547f not found: ID does not exist" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.175927 4939 scope.go:117] "RemoveContainer" containerID="ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535" Mar 18 17:03:26 crc kubenswrapper[4939]: E0318 17:03:26.176302 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535\": container with ID starting with ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535 not found: ID does not exist" containerID="ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.176331 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535"} err="failed to get container status \"ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535\": rpc error: code = NotFound desc = could not find container \"ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535\": container with ID starting with ef43064ad5c0f64fbb08eeb60d27348febfe1b311e6d3197ba6d0fb806a11535 not found: ID does not exist" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.201949 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02df0286-af33-43e5-ab78-b99f88ecdf28" (UID: "02df0286-af33-43e5-ab78-b99f88ecdf28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.268827 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.268872 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8skkk\" (UniqueName: \"kubernetes.io/projected/02df0286-af33-43e5-ab78-b99f88ecdf28-kube-api-access-8skkk\") on node \"crc\" DevicePath \"\"" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.268885 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02df0286-af33-43e5-ab78-b99f88ecdf28-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.436066 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q8r8"] Mar 18 17:03:26 crc kubenswrapper[4939]: I0318 17:03:26.442159 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q8r8"] Mar 18 17:03:28 crc kubenswrapper[4939]: I0318 17:03:28.144693 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" path="/var/lib/kubelet/pods/02df0286-af33-43e5-ab78-b99f88ecdf28/volumes" Mar 18 17:03:53 crc kubenswrapper[4939]: I0318 17:03:53.687424 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:03:53 crc kubenswrapper[4939]: I0318 17:03:53.688101 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.152124 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564224-l54gq"] Mar 18 17:04:00 crc kubenswrapper[4939]: E0318 17:04:00.153104 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerName="extract-content" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.153120 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerName="extract-content" Mar 18 17:04:00 crc kubenswrapper[4939]: E0318 17:04:00.153146 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerName="extract-utilities" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.153152 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerName="extract-utilities" Mar 18 17:04:00 crc kubenswrapper[4939]: E0318 17:04:00.153164 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerName="registry-server" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.153171 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" 
containerName="registry-server" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.153369 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="02df0286-af33-43e5-ab78-b99f88ecdf28" containerName="registry-server" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.153996 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-l54gq" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.156081 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.156551 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.157463 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.161300 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-l54gq"] Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.213039 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvb8\" (UniqueName: \"kubernetes.io/projected/40b244dd-f9c0-477d-9e0e-04d9ce5231f5-kube-api-access-swvb8\") pod \"auto-csr-approver-29564224-l54gq\" (UID: \"40b244dd-f9c0-477d-9e0e-04d9ce5231f5\") " pod="openshift-infra/auto-csr-approver-29564224-l54gq" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.314284 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swvb8\" (UniqueName: \"kubernetes.io/projected/40b244dd-f9c0-477d-9e0e-04d9ce5231f5-kube-api-access-swvb8\") pod \"auto-csr-approver-29564224-l54gq\" (UID: \"40b244dd-f9c0-477d-9e0e-04d9ce5231f5\") " pod="openshift-infra/auto-csr-approver-29564224-l54gq" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.336053 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swvb8\" (UniqueName: \"kubernetes.io/projected/40b244dd-f9c0-477d-9e0e-04d9ce5231f5-kube-api-access-swvb8\") pod \"auto-csr-approver-29564224-l54gq\" (UID: \"40b244dd-f9c0-477d-9e0e-04d9ce5231f5\") " pod="openshift-infra/auto-csr-approver-29564224-l54gq" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.475339 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-l54gq" Mar 18 17:04:00 crc kubenswrapper[4939]: I0318 17:04:00.913085 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-l54gq"] Mar 18 17:04:01 crc kubenswrapper[4939]: I0318 17:04:01.376816 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564224-l54gq" event={"ID":"40b244dd-f9c0-477d-9e0e-04d9ce5231f5","Type":"ContainerStarted","Data":"6904e2f62e399ab9e6907977ec8c35d8656204e8543bb62264a8c90de8eb4521"} Mar 18 17:04:02 crc kubenswrapper[4939]: I0318 17:04:02.386098 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564224-l54gq" event={"ID":"40b244dd-f9c0-477d-9e0e-04d9ce5231f5","Type":"ContainerStarted","Data":"d4a058bea7da8f9cba51cbbc5120786772d44ff98761baa7ce5ff1c31fc83830"} Mar 18 17:04:02 crc kubenswrapper[4939]: I0318 17:04:02.404996 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564224-l54gq" podStartSLOduration=1.2650239349999999 podStartE2EDuration="2.404977577s" podCreationTimestamp="2026-03-18 17:04:00 +0000 UTC" firstStartedPulling="2026-03-18 17:04:00.910570535 +0000 UTC m=+5205.509758156" lastFinishedPulling="2026-03-18 17:04:02.050524177 +0000 UTC m=+5206.649711798" observedRunningTime="2026-03-18 17:04:02.399678376 +0000 UTC m=+5206.998865997" watchObservedRunningTime="2026-03-18 17:04:02.404977577 +0000 UTC m=+5207.004165198" Mar 18 17:04:03 crc kubenswrapper[4939]: I0318 17:04:03.397981 4939 generic.go:334] "Generic (PLEG): container finished" podID="40b244dd-f9c0-477d-9e0e-04d9ce5231f5" containerID="d4a058bea7da8f9cba51cbbc5120786772d44ff98761baa7ce5ff1c31fc83830" exitCode=0 Mar 18 17:04:03 crc kubenswrapper[4939]: I0318 17:04:03.398140 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564224-l54gq" event={"ID":"40b244dd-f9c0-477d-9e0e-04d9ce5231f5","Type":"ContainerDied","Data":"d4a058bea7da8f9cba51cbbc5120786772d44ff98761baa7ce5ff1c31fc83830"} Mar 18 17:04:04 crc kubenswrapper[4939]: I0318 17:04:04.686587 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-l54gq" Mar 18 17:04:04 crc kubenswrapper[4939]: I0318 17:04:04.784360 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swvb8\" (UniqueName: \"kubernetes.io/projected/40b244dd-f9c0-477d-9e0e-04d9ce5231f5-kube-api-access-swvb8\") pod \"40b244dd-f9c0-477d-9e0e-04d9ce5231f5\" (UID: \"40b244dd-f9c0-477d-9e0e-04d9ce5231f5\") " Mar 18 17:04:04 crc kubenswrapper[4939]: I0318 17:04:04.792959 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b244dd-f9c0-477d-9e0e-04d9ce5231f5-kube-api-access-swvb8" (OuterVolumeSpecName: "kube-api-access-swvb8") pod "40b244dd-f9c0-477d-9e0e-04d9ce5231f5" (UID: "40b244dd-f9c0-477d-9e0e-04d9ce5231f5"). InnerVolumeSpecName "kube-api-access-swvb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:04:04 crc kubenswrapper[4939]: I0318 17:04:04.887124 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swvb8\" (UniqueName: \"kubernetes.io/projected/40b244dd-f9c0-477d-9e0e-04d9ce5231f5-kube-api-access-swvb8\") on node \"crc\" DevicePath \"\"" Mar 18 17:04:05 crc kubenswrapper[4939]: I0318 17:04:05.416149 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564224-l54gq" event={"ID":"40b244dd-f9c0-477d-9e0e-04d9ce5231f5","Type":"ContainerDied","Data":"6904e2f62e399ab9e6907977ec8c35d8656204e8543bb62264a8c90de8eb4521"} Mar 18 17:04:05 crc kubenswrapper[4939]: I0318 17:04:05.416460 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6904e2f62e399ab9e6907977ec8c35d8656204e8543bb62264a8c90de8eb4521" Mar 18 17:04:05 crc kubenswrapper[4939]: I0318 17:04:05.416262 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-l54gq" Mar 18 17:04:05 crc kubenswrapper[4939]: I0318 17:04:05.469929 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-xsnw5"] Mar 18 17:04:05 crc kubenswrapper[4939]: I0318 17:04:05.474133 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-xsnw5"] Mar 18 17:04:06 crc kubenswrapper[4939]: I0318 17:04:06.143361 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13c38bf-7e9d-4280-964b-0ca16f37c246" path="/var/lib/kubelet/pods/e13c38bf-7e9d-4280-964b-0ca16f37c246/volumes" Mar 18 17:04:23 crc kubenswrapper[4939]: I0318 17:04:23.688066 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:04:23 crc kubenswrapper[4939]: I0318 17:04:23.688782 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:04:23 crc kubenswrapper[4939]: I0318 17:04:23.688858 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:04:23 crc kubenswrapper[4939]: I0318 17:04:23.689852 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:04:23 crc kubenswrapper[4939]: I0318 17:04:23.689980 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" gracePeriod=600 Mar 18 17:04:23 crc kubenswrapper[4939]: E0318 17:04:23.851831 4939 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:04:24 crc kubenswrapper[4939]: I0318 17:04:24.569054 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" exitCode=0 Mar 18 17:04:24 crc kubenswrapper[4939]: I0318 17:04:24.569138 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"} Mar 18 17:04:24 crc kubenswrapper[4939]: I0318 17:04:24.569480 4939 scope.go:117] "RemoveContainer" containerID="f9378b7f2afbee1d788edac685c866cfb5d75c87de68ab0c8371f674ebdf00d5" Mar 18 17:04:24 crc kubenswrapper[4939]: I0318 17:04:24.570067 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:04:24 crc kubenswrapper[4939]: E0318 17:04:24.570385 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:04:39 crc kubenswrapper[4939]: I0318 17:04:39.133585 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:04:39 crc kubenswrapper[4939]: E0318 17:04:39.134235 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:04:50 crc kubenswrapper[4939]: I0318 17:04:50.133013 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:04:50 crc kubenswrapper[4939]: E0318 17:04:50.133774 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.223977 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 17:04:52 crc kubenswrapper[4939]: E0318 17:04:52.224320 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b244dd-f9c0-477d-9e0e-04d9ce5231f5" containerName="oc" Mar 18 17:04:52 crc 
kubenswrapper[4939]: I0318 17:04:52.224332 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b244dd-f9c0-477d-9e0e-04d9ce5231f5" containerName="oc" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.224465 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b244dd-f9c0-477d-9e0e-04d9ce5231f5" containerName="oc" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.226534 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.231045 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lhwrp" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.234696 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.390460 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wswcx\" (UniqueName: \"kubernetes.io/projected/3ac0c304-d5b1-4255-8ff6-6902662b9e37-kube-api-access-wswcx\") pod \"mariadb-copy-data\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.390537 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\") pod \"mariadb-copy-data\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.492187 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wswcx\" (UniqueName: \"kubernetes.io/projected/3ac0c304-d5b1-4255-8ff6-6902662b9e37-kube-api-access-wswcx\") pod \"mariadb-copy-data\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.492238 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\") pod \"mariadb-copy-data\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.495748 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.495787 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\") pod \"mariadb-copy-data\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cfc19043668c8d460c50a7aa1157d2b9aef14aa490155b65b0d02cdc9b58c8f7/globalmount\"" pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.513483 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wswcx\" (UniqueName: \"kubernetes.io/projected/3ac0c304-d5b1-4255-8ff6-6902662b9e37-kube-api-access-wswcx\") pod \"mariadb-copy-data\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.534229 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\") pod \"mariadb-copy-data\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " pod="openstack/mariadb-copy-data" Mar 18 17:04:52 crc kubenswrapper[4939]: I0318 17:04:52.556211 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 17:04:53 crc kubenswrapper[4939]: I0318 17:04:53.081741 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 17:04:53 crc kubenswrapper[4939]: W0318 17:04:53.084140 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac0c304_d5b1_4255_8ff6_6902662b9e37.slice/crio-99a6c8197d43a324ce61f025880961ae18bad2d3b43a9ec61ca5b8f5bee64ebf WatchSource:0}: Error finding container 99a6c8197d43a324ce61f025880961ae18bad2d3b43a9ec61ca5b8f5bee64ebf: Status 404 returned error can't find the container with id 99a6c8197d43a324ce61f025880961ae18bad2d3b43a9ec61ca5b8f5bee64ebf Mar 18 17:04:53 crc kubenswrapper[4939]: I0318 17:04:53.809776 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"3ac0c304-d5b1-4255-8ff6-6902662b9e37","Type":"ContainerStarted","Data":"a68da5733d5ce5776bf61c00c581efe98130866281825ae58c5d7aa958dd897f"} Mar 18 17:04:53 crc kubenswrapper[4939]: I0318 17:04:53.810862 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"3ac0c304-d5b1-4255-8ff6-6902662b9e37","Type":"ContainerStarted","Data":"99a6c8197d43a324ce61f025880961ae18bad2d3b43a9ec61ca5b8f5bee64ebf"} Mar 18 17:04:56 crc kubenswrapper[4939]: I0318 17:04:56.424129 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=5.42410342 podStartE2EDuration="5.42410342s" podCreationTimestamp="2026-03-18 17:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:04:53.82516692 +0000 UTC m=+5258.424354541" watchObservedRunningTime="2026-03-18 17:04:56.42410342 +0000 UTC m=+5261.023291041" Mar 18 17:04:56 crc kubenswrapper[4939]: I0318 17:04:56.437258 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 17:04:56 crc 
kubenswrapper[4939]: I0318 17:04:56.439395 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:04:56 crc kubenswrapper[4939]: I0318 17:04:56.449177 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:04:56 crc kubenswrapper[4939]: I0318 17:04:56.561052 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb5d4\" (UniqueName: \"kubernetes.io/projected/9a19eb40-f854-4b59-8e09-486965e2980f-kube-api-access-jb5d4\") pod \"mariadb-client\" (UID: \"9a19eb40-f854-4b59-8e09-486965e2980f\") " pod="openstack/mariadb-client" Mar 18 17:04:56 crc kubenswrapper[4939]: I0318 17:04:56.662073 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb5d4\" (UniqueName: \"kubernetes.io/projected/9a19eb40-f854-4b59-8e09-486965e2980f-kube-api-access-jb5d4\") pod \"mariadb-client\" (UID: \"9a19eb40-f854-4b59-8e09-486965e2980f\") " pod="openstack/mariadb-client" Mar 18 17:04:56 crc kubenswrapper[4939]: I0318 17:04:56.679560 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb5d4\" (UniqueName: \"kubernetes.io/projected/9a19eb40-f854-4b59-8e09-486965e2980f-kube-api-access-jb5d4\") pod \"mariadb-client\" (UID: \"9a19eb40-f854-4b59-8e09-486965e2980f\") " pod="openstack/mariadb-client" Mar 18 17:04:56 crc kubenswrapper[4939]: I0318 17:04:56.762302 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:04:57 crc kubenswrapper[4939]: I0318 17:04:57.178373 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:04:57 crc kubenswrapper[4939]: W0318 17:04:57.182706 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a19eb40_f854_4b59_8e09_486965e2980f.slice/crio-26d8e856b3de3289a20bc5c149b71924a62bcd23852499cac2651d5e49d2c8fe WatchSource:0}: Error finding container 26d8e856b3de3289a20bc5c149b71924a62bcd23852499cac2651d5e49d2c8fe: Status 404 returned error can't find the container with id 26d8e856b3de3289a20bc5c149b71924a62bcd23852499cac2651d5e49d2c8fe Mar 18 17:04:57 crc kubenswrapper[4939]: I0318 17:04:57.851778 4939 generic.go:334] "Generic (PLEG): container finished" podID="9a19eb40-f854-4b59-8e09-486965e2980f" containerID="283b523446ee5bfe06bc81b94086b4e5cf135e2a7a17d84c2afcad944d512453" exitCode=0 Mar 18 17:04:57 crc kubenswrapper[4939]: I0318 17:04:57.851883 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9a19eb40-f854-4b59-8e09-486965e2980f","Type":"ContainerDied","Data":"283b523446ee5bfe06bc81b94086b4e5cf135e2a7a17d84c2afcad944d512453"} Mar 18 17:04:57 crc kubenswrapper[4939]: I0318 17:04:57.851982 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9a19eb40-f854-4b59-8e09-486965e2980f","Type":"ContainerStarted","Data":"26d8e856b3de3289a20bc5c149b71924a62bcd23852499cac2651d5e49d2c8fe"} Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.158542 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.184014 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_9a19eb40-f854-4b59-8e09-486965e2980f/mariadb-client/0.log" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.210178 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.216538 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.308436 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb5d4\" (UniqueName: \"kubernetes.io/projected/9a19eb40-f854-4b59-8e09-486965e2980f-kube-api-access-jb5d4\") pod \"9a19eb40-f854-4b59-8e09-486965e2980f\" (UID: \"9a19eb40-f854-4b59-8e09-486965e2980f\") " Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.318659 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a19eb40-f854-4b59-8e09-486965e2980f-kube-api-access-jb5d4" (OuterVolumeSpecName: "kube-api-access-jb5d4") pod "9a19eb40-f854-4b59-8e09-486965e2980f" (UID: "9a19eb40-f854-4b59-8e09-486965e2980f"). InnerVolumeSpecName "kube-api-access-jb5d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.350943 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 17:04:59 crc kubenswrapper[4939]: E0318 17:04:59.351486 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a19eb40-f854-4b59-8e09-486965e2980f" containerName="mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.351617 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19eb40-f854-4b59-8e09-486965e2980f" containerName="mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.352757 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a19eb40-f854-4b59-8e09-486965e2980f" containerName="mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.353746 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.361943 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.410435 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb5d4\" (UniqueName: \"kubernetes.io/projected/9a19eb40-f854-4b59-8e09-486965e2980f-kube-api-access-jb5d4\") on node \"crc\" DevicePath \"\"" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.512756 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxdd\" (UniqueName: \"kubernetes.io/projected/1c1ad9ac-0437-4393-837e-3ce5d3e97f1d-kube-api-access-msxdd\") pod \"mariadb-client\" (UID: \"1c1ad9ac-0437-4393-837e-3ce5d3e97f1d\") " pod="openstack/mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.614055 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxdd\" (UniqueName: \"kubernetes.io/projected/1c1ad9ac-0437-4393-837e-3ce5d3e97f1d-kube-api-access-msxdd\") pod \"mariadb-client\" (UID: \"1c1ad9ac-0437-4393-837e-3ce5d3e97f1d\") " pod="openstack/mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.634230 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxdd\" (UniqueName: \"kubernetes.io/projected/1c1ad9ac-0437-4393-837e-3ce5d3e97f1d-kube-api-access-msxdd\") pod \"mariadb-client\" (UID: \"1c1ad9ac-0437-4393-837e-3ce5d3e97f1d\") " pod="openstack/mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.685781 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.868594 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26d8e856b3de3289a20bc5c149b71924a62bcd23852499cac2651d5e49d2c8fe" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.868655 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:04:59 crc kubenswrapper[4939]: I0318 17:04:59.886657 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="9a19eb40-f854-4b59-8e09-486965e2980f" podUID="1c1ad9ac-0437-4393-837e-3ce5d3e97f1d" Mar 18 17:05:00 crc kubenswrapper[4939]: I0318 17:05:00.118751 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:05:00 crc kubenswrapper[4939]: W0318 17:05:00.121665 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c1ad9ac_0437_4393_837e_3ce5d3e97f1d.slice/crio-3b4dba9b5dba916cbe8aaaec8fcf509181d10569374cef0a7b2db7c39a28a9f0 WatchSource:0}: Error finding container 3b4dba9b5dba916cbe8aaaec8fcf509181d10569374cef0a7b2db7c39a28a9f0: Status 404 returned error can't find the container with id 3b4dba9b5dba916cbe8aaaec8fcf509181d10569374cef0a7b2db7c39a28a9f0 Mar 18 17:05:00 crc kubenswrapper[4939]: I0318 17:05:00.144451 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a19eb40-f854-4b59-8e09-486965e2980f" path="/var/lib/kubelet/pods/9a19eb40-f854-4b59-8e09-486965e2980f/volumes" Mar 18 17:05:00 crc kubenswrapper[4939]: I0318 17:05:00.388710 4939 scope.go:117] "RemoveContainer" containerID="5c1c9d470cc2cca96fc76b77ed69681366aefd62b2ab3f48372cee0aa9be214f" Mar 18 17:05:00 crc kubenswrapper[4939]: I0318 17:05:00.969452 4939 generic.go:334] "Generic (PLEG): container finished" podID="1c1ad9ac-0437-4393-837e-3ce5d3e97f1d" containerID="59c94bfa1ffe712f4d5ea350da6ddad9bcf5b6086d2564361780aed58f93bac0" exitCode=0 Mar 18 17:05:00 crc kubenswrapper[4939]: I0318 17:05:00.969542 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1c1ad9ac-0437-4393-837e-3ce5d3e97f1d","Type":"ContainerDied","Data":"59c94bfa1ffe712f4d5ea350da6ddad9bcf5b6086d2564361780aed58f93bac0"} Mar 18 17:05:00 crc kubenswrapper[4939]: I0318 17:05:00.969596 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1c1ad9ac-0437-4393-837e-3ce5d3e97f1d","Type":"ContainerStarted","Data":"3b4dba9b5dba916cbe8aaaec8fcf509181d10569374cef0a7b2db7c39a28a9f0"} Mar 18 17:05:02 crc kubenswrapper[4939]: I0318 17:05:02.313059 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:05:02 crc kubenswrapper[4939]: I0318 17:05:02.332138 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_1c1ad9ac-0437-4393-837e-3ce5d3e97f1d/mariadb-client/0.log" Mar 18 17:05:02 crc kubenswrapper[4939]: I0318 17:05:02.364637 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:05:02 crc kubenswrapper[4939]: I0318 17:05:02.368600 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 17:05:02 crc kubenswrapper[4939]: I0318 17:05:02.389684 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxdd\" (UniqueName: \"kubernetes.io/projected/1c1ad9ac-0437-4393-837e-3ce5d3e97f1d-kube-api-access-msxdd\") pod \"1c1ad9ac-0437-4393-837e-3ce5d3e97f1d\" (UID: \"1c1ad9ac-0437-4393-837e-3ce5d3e97f1d\") " Mar 18 17:05:02 crc kubenswrapper[4939]: I0318 17:05:02.396076 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1ad9ac-0437-4393-837e-3ce5d3e97f1d-kube-api-access-msxdd" (OuterVolumeSpecName: "kube-api-access-msxdd") pod "1c1ad9ac-0437-4393-837e-3ce5d3e97f1d" (UID: "1c1ad9ac-0437-4393-837e-3ce5d3e97f1d"). InnerVolumeSpecName "kube-api-access-msxdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:05:02 crc kubenswrapper[4939]: I0318 17:05:02.491274 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxdd\" (UniqueName: \"kubernetes.io/projected/1c1ad9ac-0437-4393-837e-3ce5d3e97f1d-kube-api-access-msxdd\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:03 crc kubenswrapper[4939]: I0318 17:05:03.043493 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b4dba9b5dba916cbe8aaaec8fcf509181d10569374cef0a7b2db7c39a28a9f0" Mar 18 17:05:03 crc kubenswrapper[4939]: I0318 17:05:03.043646 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 17:05:03 crc kubenswrapper[4939]: I0318 17:05:03.133627 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:05:03 crc kubenswrapper[4939]: E0318 17:05:03.134080 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:05:04 crc kubenswrapper[4939]: I0318 17:05:04.144309 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1ad9ac-0437-4393-837e-3ce5d3e97f1d" path="/var/lib/kubelet/pods/1c1ad9ac-0437-4393-837e-3ce5d3e97f1d/volumes" Mar 18 17:05:17 crc kubenswrapper[4939]: I0318 17:05:17.133558 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:05:17 crc kubenswrapper[4939]: E0318 17:05:17.135175 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:05:31 crc kubenswrapper[4939]: I0318 17:05:31.133154 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:05:31 crc kubenswrapper[4939]: E0318 17:05:31.134005 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.932650 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 17:05:34 crc kubenswrapper[4939]: E0318 17:05:34.933360 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1ad9ac-0437-4393-837e-3ce5d3e97f1d" containerName="mariadb-client" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.933378 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1ad9ac-0437-4393-837e-3ce5d3e97f1d" containerName="mariadb-client" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.933591 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1ad9ac-0437-4393-837e-3ce5d3e97f1d" containerName="mariadb-client" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.934586 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.937468 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rwkp9" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.939984 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.943628 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.969215 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.982297 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.983578 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:34 crc kubenswrapper[4939]: I0318 17:05:34.999570 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.000882 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.015570 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.039579 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098277 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6dps\" (UniqueName: \"kubernetes.io/projected/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-kube-api-access-z6dps\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098328 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098361 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc4642e-c9bd-42a3-81d5-010aee0538b9-config\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098379 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2118473b-6c74-4093-bba9-cc8ea59c632d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098410 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3dc4642e-c9bd-42a3-81d5-010aee0538b9-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: 
\"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098427 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-422b1067-aa32-448e-a35d-4628c0821e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b1067-aa32-448e-a35d-4628c0821e1d\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098443 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098461 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098477 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc4642e-c9bd-42a3-81d5-010aee0538b9-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098492 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2118473b-6c74-4093-bba9-cc8ea59c632d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098535 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hmsx\" (UniqueName: \"kubernetes.io/projected/2118473b-6c74-4093-bba9-cc8ea59c632d-kube-api-access-8hmsx\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098553 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dc4642e-c9bd-42a3-81d5-010aee0538b9-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098577 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098607 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\") pod \"ovsdbserver-nb-2\" (UID: 
\"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098630 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098645 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2118473b-6c74-4093-bba9-cc8ea59c632d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098665 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2118473b-6c74-4093-bba9-cc8ea59c632d-config\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.098682 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxgj\" (UniqueName: \"kubernetes.io/projected/3dc4642e-c9bd-42a3-81d5-010aee0538b9-kube-api-access-rvxgj\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.158028 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.159404 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.163449 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-r6np7" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.163663 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.163933 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.188340 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.191192 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199517 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2118473b-6c74-4093-bba9-cc8ea59c632d-config\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199564 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvxgj\" (UniqueName: \"kubernetes.io/projected/3dc4642e-c9bd-42a3-81d5-010aee0538b9-kube-api-access-rvxgj\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199590 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6dps\" (UniqueName: \"kubernetes.io/projected/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-kube-api-access-z6dps\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199619 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199694 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc4642e-c9bd-42a3-81d5-010aee0538b9-config\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199720 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2118473b-6c74-4093-bba9-cc8ea59c632d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199774 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3dc4642e-c9bd-42a3-81d5-010aee0538b9-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199796 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-422b1067-aa32-448e-a35d-4628c0821e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b1067-aa32-448e-a35d-4628c0821e1d\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199816 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199840 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199857 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc4642e-c9bd-42a3-81d5-010aee0538b9-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199875 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2118473b-6c74-4093-bba9-cc8ea59c632d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199915 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hmsx\" (UniqueName: \"kubernetes.io/projected/2118473b-6c74-4093-bba9-cc8ea59c632d-kube-api-access-8hmsx\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199941 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dc4642e-c9bd-42a3-81d5-010aee0538b9-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.199982 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.200017 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.200051 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.200076 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2118473b-6c74-4093-bba9-cc8ea59c632d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.200312 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 
17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.200617 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2118473b-6c74-4093-bba9-cc8ea59c632d-config\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.200624 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3dc4642e-c9bd-42a3-81d5-010aee0538b9-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.200706 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2118473b-6c74-4093-bba9-cc8ea59c632d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.201576 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc4642e-c9bd-42a3-81d5-010aee0538b9-config\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.201837 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2118473b-6c74-4093-bba9-cc8ea59c632d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.204936 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.205146 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-422b1067-aa32-448e-a35d-4628c0821e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b1067-aa32-448e-a35d-4628c0821e1d\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/29dceeb35ddb48949056ecb276c3e90541968f96309fb56bdee98ba1db539acc/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.207688 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.208596 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.208644 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.209363 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.209386 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/19481e393b9024b5aa02703b41b2ed6b2cb2a54fb4357fb4a5a7091bc903d7c7/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.210247 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dc4642e-c9bd-42a3-81d5-010aee0538b9-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.215488 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc4642e-c9bd-42a3-81d5-010aee0538b9-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.215711 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2118473b-6c74-4093-bba9-cc8ea59c632d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.216200 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.216229 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c4f0f47e1b3aee9e0543fbc4436ed72681bb0329d354a98453bd063762ed2acd/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.219472 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.220700 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.221588 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6dps\" (UniqueName: \"kubernetes.io/projected/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-kube-api-access-z6dps\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.224001 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc767c7b-3dfc-49d6-bfd0-310c15ec7369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.230410 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hmsx\" (UniqueName: \"kubernetes.io/projected/2118473b-6c74-4093-bba9-cc8ea59c632d-kube-api-access-8hmsx\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.231372 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.239037 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvxgj\" (UniqueName: \"kubernetes.io/projected/3dc4642e-c9bd-42a3-81d5-010aee0538b9-kube-api-access-rvxgj\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.240014 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.273824 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-422b1067-aa32-448e-a35d-4628c0821e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b1067-aa32-448e-a35d-4628c0821e1d\") pod \"ovsdbserver-nb-0\" (UID: \"dc767c7b-3dfc-49d6-bfd0-310c15ec7369\") " pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.285452 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-976bf8df-9900-48d6-96d5-5a07d32463eb\") pod \"ovsdbserver-nb-1\" (UID: \"2118473b-6c74-4093-bba9-cc8ea59c632d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.288613 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-358720c2-6ddb-4f93-8ebc-761f79382d79\") pod \"ovsdbserver-nb-2\" (UID: \"3dc4642e-c9bd-42a3-81d5-010aee0538b9\") " pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.300935 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh8sh\" (UniqueName: \"kubernetes.io/projected/377cdcbe-59e9-4a36-9528-a72cf359e07b-kube-api-access-wh8sh\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.300992 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e6fca10-8968-4453-9789-4d81b241977e-config\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301029 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377cdcbe-59e9-4a36-9528-a72cf359e07b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301057 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmkh\" (UniqueName: \"kubernetes.io/projected/4e6fca10-8968-4453-9789-4d81b241977e-kube-api-access-zhmkh\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301075 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301094 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301221 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e6fca10-8968-4453-9789-4d81b241977e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301250 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58cp\" (UniqueName: \"kubernetes.io/projected/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-kube-api-access-r58cp\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301288 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377cdcbe-59e9-4a36-9528-a72cf359e07b-config\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301439 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-98741b01-357b-47ff-a652-9ffa7a521f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98741b01-357b-47ff-a652-9ffa7a521f65\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301479 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/377cdcbe-59e9-4a36-9528-a72cf359e07b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301529 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e6fca10-8968-4453-9789-4d81b241977e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301599 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301623 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-config\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301646 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301682 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301703 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377cdcbe-59e9-4a36-9528-a72cf359e07b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.301729 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fca10-8968-4453-9789-4d81b241977e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.309818 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.327158 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.403811 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.403845 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-config\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.403862 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.403894 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.403915 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377cdcbe-59e9-4a36-9528-a72cf359e07b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.403935 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fca10-8968-4453-9789-4d81b241977e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.403977 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh8sh\" (UniqueName: \"kubernetes.io/projected/377cdcbe-59e9-4a36-9528-a72cf359e07b-kube-api-access-wh8sh\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.405171 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-config\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.405248 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e6fca10-8968-4453-9789-4d81b241977e-config\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.405300 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/377cdcbe-59e9-4a36-9528-a72cf359e07b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.405330 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhmkh\" (UniqueName: \"kubernetes.io/projected/4e6fca10-8968-4453-9789-4d81b241977e-kube-api-access-zhmkh\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.405411 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.406266 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e6fca10-8968-4453-9789-4d81b241977e-config\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.406423 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377cdcbe-59e9-4a36-9528-a72cf359e07b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.406633 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.406657 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.406675 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e6fca10-8968-4453-9789-4d81b241977e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.406715 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58cp\" (UniqueName: \"kubernetes.io/projected/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-kube-api-access-r58cp\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.406734 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377cdcbe-59e9-4a36-9528-a72cf359e07b-config\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.407792 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e6fca10-8968-4453-9789-4d81b241977e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.408124 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.409930 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.410590 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-98741b01-357b-47ff-a652-9ffa7a521f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98741b01-357b-47ff-a652-9ffa7a521f65\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.410657 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/377cdcbe-59e9-4a36-9528-a72cf359e07b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.410712 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e6fca10-8968-4453-9789-4d81b241977e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.411928 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.411965 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/377cdcbe-59e9-4a36-9528-a72cf359e07b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.411974 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/22a0f89c2a855e0babaa84ae62ce6f7772c557664e0fc583a212bd3e1ba027e1/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.412415 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6fca10-8968-4453-9789-4d81b241977e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.412523 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e6fca10-8968-4453-9789-4d81b241977e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.412759 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.412788 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-98741b01-357b-47ff-a652-9ffa7a521f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98741b01-357b-47ff-a652-9ffa7a521f65\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/626153c0550b64a79d39c2fcf0eba0007fa9abcb185d7c8715c9ce65a47bcaa5/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.413407 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
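
The csi_attacher.go:380 messages repeated above explain why every MountVolume.MountDevice in this section reports success within microseconds: the kubevirt.io.hostpath-provisioner node plugin does not advertise the CSI STAGE_UNSTAGE_VOLUME capability, so the kubelet skips the NodeStageVolume call entirely and MountDevice reduces to recording the per-volume path under /var/lib/kubelet/plugins/kubernetes.io/csi/<driver>/<hash>/globalmount. The bind into each pod's filesystem still happens in the per-pod MountVolume.SetUp steps that follow. A minimal sketch of that branch in plain Python (function and parameter names here are illustrative, not kubelet identifiers):

    def mount_device(node_capabilities, volume_id, staging_path):
        """Model of the staging decision behind the csi_attacher.go:380 lines."""
        if "STAGE_UNSTAGE_VOLUME" not in node_capabilities:
            # The hostpath provisioner takes this branch in the log above:
            # no NodeStageVolume call is made, MountDevice is pure bookkeeping.
            return "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..."
        # A staging-capable driver would mount the device once at staging_path
        # and let every pod that uses the volume bind-mount from there.
        return f"NodeStageVolume({volume_id}) -> {staging_path}"

Drivers that do stage pay the NodeStageVolume cost once per node rather than once per pod; the capability being absent is consistent with a hostpath volume having nothing to stage.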
Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.413456 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fe2b384c690828e023c36ad6a844632e169a69a9c0916b800c7002c115145f58/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.421717 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh8sh\" (UniqueName: \"kubernetes.io/projected/377cdcbe-59e9-4a36-9528-a72cf359e07b-kube-api-access-wh8sh\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.423967 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377cdcbe-59e9-4a36-9528-a72cf359e07b-config\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.424054 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377cdcbe-59e9-4a36-9528-a72cf359e07b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.429260 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmkh\" (UniqueName: \"kubernetes.io/projected/4e6fca10-8968-4453-9789-4d81b241977e-kube-api-access-zhmkh\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.438662 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58cp\" (UniqueName: \"kubernetes.io/projected/7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef-kube-api-access-r58cp\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.449620 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5966c999-e6e5-4fb7-a875-ee5eb12fba1d\") pod \"ovsdbserver-sb-2\" (UID: \"4e6fca10-8968-4453-9789-4d81b241977e\") " pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.457989 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-98741b01-357b-47ff-a652-9ffa7a521f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98741b01-357b-47ff-a652-9ffa7a521f65\") pod \"ovsdbserver-sb-1\" (UID: \"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef\") " pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.483889 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae44c139-cbaa-4b0b-884b-9e1e6ccea525\") pod \"ovsdbserver-sb-0\" (UID: 
\"377cdcbe-59e9-4a36-9528-a72cf359e07b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.520577 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.551263 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.626211 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.785156 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.847970 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 17:05:35 crc kubenswrapper[4939]: I0318 17:05:35.964581 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.064716 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.196495 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.304874 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.317070 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3dc4642e-c9bd-42a3-81d5-010aee0538b9","Type":"ContainerStarted","Data":"b5eab4cce1193cfb2033bcd0b774424359be18bd5e36565daf1591da88cac925"} Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.317117 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3dc4642e-c9bd-42a3-81d5-010aee0538b9","Type":"ContainerStarted","Data":"c1c5d3f861d339c98bc1ba509efdc66873c241f2c5abe70f8d8f9fc9722e0188"} Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.317128 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3dc4642e-c9bd-42a3-81d5-010aee0538b9","Type":"ContainerStarted","Data":"2c395d0ceab2ada76e687e20e7e11c925388beff2cdcabd0af6469ad4f8df529"} Mar 18 17:05:36 crc kubenswrapper[4939]: W0318 17:05:36.319808 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e6fca10_8968_4453_9789_4d81b241977e.slice/crio-b4888bd19612c17d907f0058d8a95853b610419995764c5ad4c0273c3ecd00ab WatchSource:0}: Error finding container b4888bd19612c17d907f0058d8a95853b610419995764c5ad4c0273c3ecd00ab: Status 404 returned error can't find the container with id b4888bd19612c17d907f0058d8a95853b610419995764c5ad4c0273c3ecd00ab Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.321219 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef","Type":"ContainerStarted","Data":"2e366230a9779d52027ec9dacaf2204967a82ee8816662e29335f56eca2ae80b"} Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.321284 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" 
event={"ID":"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef","Type":"ContainerStarted","Data":"94208ad380f1e3b2f2d755e6f998c2faba4295df6f8244bcf074b13b294edc9d"} Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.324710 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2118473b-6c74-4093-bba9-cc8ea59c632d","Type":"ContainerStarted","Data":"0cb987ab00cfb6317df81cdbb2123ae1db8d11737407426c5edad4a7c4cd8054"} Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.324761 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2118473b-6c74-4093-bba9-cc8ea59c632d","Type":"ContainerStarted","Data":"fd3b2963760fa5662820ae9d47da44bb543c4345fa5c3f5e6e646575b0e31064"} Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.326581 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc767c7b-3dfc-49d6-bfd0-310c15ec7369","Type":"ContainerStarted","Data":"a3f014183bbdca00c935b091db7bc46668e21c810029b58bffd3397002bcda17"} Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.342626 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.342600039 podStartE2EDuration="3.342600039s" podCreationTimestamp="2026-03-18 17:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:36.334979293 +0000 UTC m=+5300.934166914" watchObservedRunningTime="2026-03-18 17:05:36.342600039 +0000 UTC m=+5300.941787660" Mar 18 17:05:36 crc kubenswrapper[4939]: I0318 17:05:36.373156 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 17:05:36 crc kubenswrapper[4939]: W0318 17:05:36.391450 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod377cdcbe_59e9_4a36_9528_a72cf359e07b.slice/crio-b099ffd870604518bad4a38a22ba3a7cd5a7f2a591693e6398df83e3c899c8c0 WatchSource:0}: Error finding container b099ffd870604518bad4a38a22ba3a7cd5a7f2a591693e6398df83e3c899c8c0: Status 404 returned error can't find the container with id b099ffd870604518bad4a38a22ba3a7cd5a7f2a591693e6398df83e3c899c8c0 Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.336166 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"4e6fca10-8968-4453-9789-4d81b241977e","Type":"ContainerStarted","Data":"0f755f055231e6e25f8cc1068c9ac3c8293f6507743b771cb4c0a96b2e317a0f"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.336734 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"4e6fca10-8968-4453-9789-4d81b241977e","Type":"ContainerStarted","Data":"88eaa2e2880bcd23438a1060b2ab009bf37c16cf1d70eebb0bef37af23a4835c"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.336753 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"4e6fca10-8968-4453-9789-4d81b241977e","Type":"ContainerStarted","Data":"b4888bd19612c17d907f0058d8a95853b610419995764c5ad4c0273c3ecd00ab"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.338756 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"377cdcbe-59e9-4a36-9528-a72cf359e07b","Type":"ContainerStarted","Data":"0f6b61499c19be3482ecc680390043486e7faf04c00e25c932c0ddc94b66fbf7"} Mar 18 17:05:37 crc 
kubenswrapper[4939]: I0318 17:05:37.338805 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"377cdcbe-59e9-4a36-9528-a72cf359e07b","Type":"ContainerStarted","Data":"462f639bbf56b335ba7d6664dd28b25f19621db5ccf44d3158f97848493bc957"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.338819 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"377cdcbe-59e9-4a36-9528-a72cf359e07b","Type":"ContainerStarted","Data":"b099ffd870604518bad4a38a22ba3a7cd5a7f2a591693e6398df83e3c899c8c0"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.341763 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef","Type":"ContainerStarted","Data":"988906dc3d28ae7b746e7cde159f55c43b641d398fd5a9ae8aba612bf78d698b"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.345033 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2118473b-6c74-4093-bba9-cc8ea59c632d","Type":"ContainerStarted","Data":"fc1866dccb8dc54f5919340f28b578b179b54504429e0188e208902150ef9559"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.347538 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc767c7b-3dfc-49d6-bfd0-310c15ec7369","Type":"ContainerStarted","Data":"252d93648ef7aaa8fa1156e4b4a013e81c84651eef8bf3a9b8c1b645b2e6d5e6"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.347561 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc767c7b-3dfc-49d6-bfd0-310c15ec7369","Type":"ContainerStarted","Data":"a504cf4fa25269afec5876c4747d0dfa9322d68a5749e39707090bf75bb08ba3"} Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.357312 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.357287437 podStartE2EDuration="3.357287437s" podCreationTimestamp="2026-03-18 17:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:37.351527623 +0000 UTC m=+5301.950715244" watchObservedRunningTime="2026-03-18 17:05:37.357287437 +0000 UTC m=+5301.956475048" Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.376425 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.376403899 podStartE2EDuration="4.376403899s" podCreationTimestamp="2026-03-18 17:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:37.372482798 +0000 UTC m=+5301.971670429" watchObservedRunningTime="2026-03-18 17:05:37.376403899 +0000 UTC m=+5301.975591521" Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.393250 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.393221977 podStartE2EDuration="3.393221977s" podCreationTimestamp="2026-03-18 17:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:37.390489139 +0000 UTC m=+5301.989676760" watchObservedRunningTime="2026-03-18 17:05:37.393221977 +0000 UTC m=+5301.992409638" Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.412224 4939 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.412209146 podStartE2EDuration="4.412209146s" podCreationTimestamp="2026-03-18 17:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:37.411130185 +0000 UTC m=+5302.010317816" watchObservedRunningTime="2026-03-18 17:05:37.412209146 +0000 UTC m=+5302.011396767" Mar 18 17:05:37 crc kubenswrapper[4939]: I0318 17:05:37.437880 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.437855244 podStartE2EDuration="3.437855244s" podCreationTimestamp="2026-03-18 17:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:37.433591882 +0000 UTC m=+5302.032779503" watchObservedRunningTime="2026-03-18 17:05:37.437855244 +0000 UTC m=+5302.037042865" Mar 18 17:05:38 crc kubenswrapper[4939]: I0318 17:05:38.310435 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:38 crc kubenswrapper[4939]: I0318 17:05:38.328318 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:38 crc kubenswrapper[4939]: I0318 17:05:38.520662 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:38 crc kubenswrapper[4939]: I0318 17:05:38.551869 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:38 crc kubenswrapper[4939]: I0318 17:05:38.630712 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:38 crc kubenswrapper[4939]: I0318 17:05:38.785716 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:40 crc kubenswrapper[4939]: I0318 17:05:40.310327 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:40 crc kubenswrapper[4939]: I0318 17:05:40.327344 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:40 crc kubenswrapper[4939]: I0318 17:05:40.521125 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:40 crc kubenswrapper[4939]: I0318 17:05:40.552479 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:40 crc kubenswrapper[4939]: I0318 17:05:40.630587 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:40 crc kubenswrapper[4939]: I0318 17:05:40.785457 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.345462 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.369135 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.418445 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-nb-2" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.433218 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.583655 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.620446 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.645102 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.692585 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.697560 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-zmc6z"] Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.698986 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.700841 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.710774 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.715190 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-zmc6z"] Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.751439 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.829574 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.830174 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-dns-svc\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.831018 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-ovsdbserver-nb\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.831060 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl5bp\" (UniqueName: \"kubernetes.io/projected/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-kube-api-access-rl5bp\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.831138 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-config\") pod 
\"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.888650 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.933462 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-ovsdbserver-nb\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.933530 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl5bp\" (UniqueName: \"kubernetes.io/projected/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-kube-api-access-rl5bp\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.933601 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-config\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.933728 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-dns-svc\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.935050 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-config\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.935197 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-dns-svc\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.935814 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-ovsdbserver-nb\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:41 crc kubenswrapper[4939]: I0318 17:05:41.966245 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl5bp\" (UniqueName: \"kubernetes.io/projected/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-kube-api-access-rl5bp\") pod \"dnsmasq-dns-749675b4c7-zmc6z\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.035014 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.041443 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-zmc6z"] Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.084127 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-f96p8"] Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.085717 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.099575 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.100650 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-f96p8"] Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.239417 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-dns-svc\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.239822 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.239860 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-config\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.239882 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.239955 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxcks\" (UniqueName: \"kubernetes.io/projected/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-kube-api-access-lxcks\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.341577 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.341622 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-config\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.341642 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.341693 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxcks\" (UniqueName: \"kubernetes.io/projected/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-kube-api-access-lxcks\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.341751 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-dns-svc\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.342799 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.342907 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-dns-svc\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.342919 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.343400 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-config\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.361580 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxcks\" (UniqueName: \"kubernetes.io/projected/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-kube-api-access-lxcks\") pod \"dnsmasq-dns-74f6fc965c-f96p8\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") " pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.454339 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.544630 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-zmc6z"] Mar 18 17:05:42 crc kubenswrapper[4939]: W0318 17:05:42.548378 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0d98e08_51fd_4d2a_9d2a_e07da2bd4824.slice/crio-ce3121ed8718d62f42205ff1a57922102261bd2b762749d5edd760ba48257b9b WatchSource:0}: Error finding container ce3121ed8718d62f42205ff1a57922102261bd2b762749d5edd760ba48257b9b: Status 404 returned error can't find the container with id ce3121ed8718d62f42205ff1a57922102261bd2b762749d5edd760ba48257b9b Mar 18 17:05:42 crc kubenswrapper[4939]: I0318 17:05:42.894218 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-f96p8"] Mar 18 17:05:42 crc kubenswrapper[4939]: W0318 17:05:42.897526 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod819d0254_8d11_40e3_b54f_7a4f6b3c88b2.slice/crio-5284ec3aecdf32ba126f92d9fabca20988e03fd45cbd362ebf91eff53b666cd4 WatchSource:0}: Error finding container 5284ec3aecdf32ba126f92d9fabca20988e03fd45cbd362ebf91eff53b666cd4: Status 404 returned error can't find the container with id 5284ec3aecdf32ba126f92d9fabca20988e03fd45cbd362ebf91eff53b666cd4 Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.412531 4939 generic.go:334] "Generic (PLEG): container finished" podID="b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" containerID="c3a074f8d4032830b4a9b9478703c56111e20d878d97d0d98f3167f6165d0df0" exitCode=0 Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.412645 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" event={"ID":"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824","Type":"ContainerDied","Data":"c3a074f8d4032830b4a9b9478703c56111e20d878d97d0d98f3167f6165d0df0"} Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.412708 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" event={"ID":"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824","Type":"ContainerStarted","Data":"ce3121ed8718d62f42205ff1a57922102261bd2b762749d5edd760ba48257b9b"} Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.416002 4939 generic.go:334] "Generic (PLEG): container finished" podID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerID="7d9143295243a3b3b8ef1369a9fbb9e5c0e209be3a65e4d982bbbbf53c10e644" exitCode=0 Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.416044 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" event={"ID":"819d0254-8d11-40e3-b54f-7a4f6b3c88b2","Type":"ContainerDied","Data":"7d9143295243a3b3b8ef1369a9fbb9e5c0e209be3a65e4d982bbbbf53c10e644"} Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.416072 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" event={"ID":"819d0254-8d11-40e3-b54f-7a4f6b3c88b2","Type":"ContainerStarted","Data":"5284ec3aecdf32ba126f92d9fabca20988e03fd45cbd362ebf91eff53b666cd4"} Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.728823 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.763678 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-config\") pod \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.763745 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl5bp\" (UniqueName: \"kubernetes.io/projected/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-kube-api-access-rl5bp\") pod \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.763791 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-ovsdbserver-nb\") pod \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.763843 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-dns-svc\") pod \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\" (UID: \"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824\") " Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.767800 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-kube-api-access-rl5bp" (OuterVolumeSpecName: "kube-api-access-rl5bp") pod "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" (UID: "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824"). InnerVolumeSpecName "kube-api-access-rl5bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.785302 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" (UID: "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.786156 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" (UID: "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.794608 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-config" (OuterVolumeSpecName: "config") pod "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" (UID: "b0d98e08-51fd-4d2a-9d2a-e07da2bd4824"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.865560 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.865595 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl5bp\" (UniqueName: \"kubernetes.io/projected/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-kube-api-access-rl5bp\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.865615 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:43 crc kubenswrapper[4939]: I0318 17:05:43.865630 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.134486 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:05:44 crc kubenswrapper[4939]: E0318 17:05:44.135033 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.433787 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" event={"ID":"b0d98e08-51fd-4d2a-9d2a-e07da2bd4824","Type":"ContainerDied","Data":"ce3121ed8718d62f42205ff1a57922102261bd2b762749d5edd760ba48257b9b"} Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.433814 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-zmc6z" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.433861 4939 scope.go:117] "RemoveContainer" containerID="c3a074f8d4032830b4a9b9478703c56111e20d878d97d0d98f3167f6165d0df0" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.436532 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" event={"ID":"819d0254-8d11-40e3-b54f-7a4f6b3c88b2","Type":"ContainerStarted","Data":"15427c09b9481ce0a7cfb24176bb1863c7186e6fa3205aba6701446722978d93"} Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.436881 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.512163 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" podStartSLOduration=2.512138736 podStartE2EDuration="2.512138736s" podCreationTimestamp="2026-03-18 17:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:44.460989044 +0000 UTC m=+5309.060176665" watchObservedRunningTime="2026-03-18 17:05:44.512138736 +0000 UTC m=+5309.111326357" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.526903 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 18 17:05:44 crc kubenswrapper[4939]: E0318 17:05:44.527719 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" containerName="init" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.527738 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" containerName="init" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.528893 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" containerName="init" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.530917 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.533301 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.545042 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.574122 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-zmc6z"] Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.581762 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-zmc6z"] Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.582739 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.582802 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/7152a687-3421-4f86-9844-74fd521bff9c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.582992 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfwq\" (UniqueName: \"kubernetes.io/projected/7152a687-3421-4f86-9844-74fd521bff9c-kube-api-access-vhfwq\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.684924 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/7152a687-3421-4f86-9844-74fd521bff9c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.685143 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfwq\" (UniqueName: \"kubernetes.io/projected/7152a687-3421-4f86-9844-74fd521bff9c-kube-api-access-vhfwq\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.685186 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.687987 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.688031 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/53fbc9325c511757ffa4eca84332a82b70c7b88822f3b80ecb2e7fd7a3b0b3d7/globalmount\"" pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.699288 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/7152a687-3421-4f86-9844-74fd521bff9c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.700974 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfwq\" (UniqueName: \"kubernetes.io/projected/7152a687-3421-4f86-9844-74fd521bff9c-kube-api-access-vhfwq\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.718817 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\") pod \"ovn-copy-data\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " pod="openstack/ovn-copy-data" Mar 18 17:05:44 crc kubenswrapper[4939]: I0318 17:05:44.853459 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 17:05:45 crc kubenswrapper[4939]: I0318 17:05:45.402082 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 17:05:45 crc kubenswrapper[4939]: W0318 17:05:45.405779 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7152a687_3421_4f86_9844_74fd521bff9c.slice/crio-5f730db387bab7eba004ac5a2716eda6a3a858793c1fc8c697a2b8b146d0e805 WatchSource:0}: Error finding container 5f730db387bab7eba004ac5a2716eda6a3a858793c1fc8c697a2b8b146d0e805: Status 404 returned error can't find the container with id 5f730db387bab7eba004ac5a2716eda6a3a858793c1fc8c697a2b8b146d0e805 Mar 18 17:05:45 crc kubenswrapper[4939]: I0318 17:05:45.446339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"7152a687-3421-4f86-9844-74fd521bff9c","Type":"ContainerStarted","Data":"5f730db387bab7eba004ac5a2716eda6a3a858793c1fc8c697a2b8b146d0e805"} Mar 18 17:05:46 crc kubenswrapper[4939]: I0318 17:05:46.144876 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d98e08-51fd-4d2a-9d2a-e07da2bd4824" path="/var/lib/kubelet/pods/b0d98e08-51fd-4d2a-9d2a-e07da2bd4824/volumes" Mar 18 17:05:46 crc kubenswrapper[4939]: I0318 17:05:46.455903 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"7152a687-3421-4f86-9844-74fd521bff9c","Type":"ContainerStarted","Data":"05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69"} Mar 18 17:05:46 crc kubenswrapper[4939]: I0318 17:05:46.486014 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.485982664 podStartE2EDuration="3.485982664s" podCreationTimestamp="2026-03-18 17:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:46.476389952 +0000 UTC m=+5311.075577583" watchObservedRunningTime="2026-03-18 17:05:46.485982664 +0000 UTC m=+5311.085170325" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.263759 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.286829 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.286960 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.288896 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9sq5d" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.289245 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.292854 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.395966 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60736420-d745-45a9-a045-700fcbfbbeff-scripts\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.396015 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60736420-d745-45a9-a045-700fcbfbbeff-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.396087 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbj8\" (UniqueName: \"kubernetes.io/projected/60736420-d745-45a9-a045-700fcbfbbeff-kube-api-access-ckbj8\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.396106 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/60736420-d745-45a9-a045-700fcbfbbeff-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.396125 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60736420-d745-45a9-a045-700fcbfbbeff-config\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.498117 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60736420-d745-45a9-a045-700fcbfbbeff-scripts\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.498183 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60736420-d745-45a9-a045-700fcbfbbeff-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.498293 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbj8\" (UniqueName: \"kubernetes.io/projected/60736420-d745-45a9-a045-700fcbfbbeff-kube-api-access-ckbj8\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 
17:05:51.498316 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/60736420-d745-45a9-a045-700fcbfbbeff-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.498335 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60736420-d745-45a9-a045-700fcbfbbeff-config\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.499389 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60736420-d745-45a9-a045-700fcbfbbeff-config\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.500045 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/60736420-d745-45a9-a045-700fcbfbbeff-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.500572 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60736420-d745-45a9-a045-700fcbfbbeff-scripts\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.506581 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60736420-d745-45a9-a045-700fcbfbbeff-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.561517 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbj8\" (UniqueName: \"kubernetes.io/projected/60736420-d745-45a9-a045-700fcbfbbeff-kube-api-access-ckbj8\") pod \"ovn-northd-0\" (UID: \"60736420-d745-45a9-a045-700fcbfbbeff\") " pod="openstack/ovn-northd-0" Mar 18 17:05:51 crc kubenswrapper[4939]: I0318 17:05:51.614817 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.093589 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 17:05:52 crc kubenswrapper[4939]: W0318 17:05:52.099414 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60736420_d745_45a9_a045_700fcbfbbeff.slice/crio-d333dd3669a10bf2b3e1635adcf08c71e523dffe1e81f950f333b53cd4c6a60e WatchSource:0}: Error finding container d333dd3669a10bf2b3e1635adcf08c71e523dffe1e81f950f333b53cd4c6a60e: Status 404 returned error can't find the container with id d333dd3669a10bf2b3e1635adcf08c71e523dffe1e81f950f333b53cd4c6a60e Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.456720 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.509203 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"60736420-d745-45a9-a045-700fcbfbbeff","Type":"ContainerStarted","Data":"81db884e69dfcfc67f9854fd335435dd946f60e0e751768c588d6b342850fc27"} Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.509258 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"60736420-d745-45a9-a045-700fcbfbbeff","Type":"ContainerStarted","Data":"ae2d29dc5c1d7fb6c0d3d22f0bc1ab8c6b101751ae546c9d2082eab8da19fd6c"} Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.509271 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"60736420-d745-45a9-a045-700fcbfbbeff","Type":"ContainerStarted","Data":"d333dd3669a10bf2b3e1635adcf08c71e523dffe1e81f950f333b53cd4c6a60e"} Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.509413 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.519217 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-t4ggw"] Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.520034 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" podUID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerName="dnsmasq-dns" containerID="cri-o://fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d" gracePeriod=10 Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.544639 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.544611142 podStartE2EDuration="1.544611142s" podCreationTimestamp="2026-03-18 17:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:52.542000158 +0000 UTC m=+5317.141187779" watchObservedRunningTime="2026-03-18 17:05:52.544611142 +0000 UTC m=+5317.143798763" Mar 18 17:05:52 crc kubenswrapper[4939]: I0318 17:05:52.983379 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.126071 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-dns-svc\") pod \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.126212 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpxxz\" (UniqueName: \"kubernetes.io/projected/b8031ced-8ac8-4318-87b1-2f22da0b05f1-kube-api-access-wpxxz\") pod \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.126280 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-config\") pod \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\" (UID: \"b8031ced-8ac8-4318-87b1-2f22da0b05f1\") " Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.132356 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8031ced-8ac8-4318-87b1-2f22da0b05f1-kube-api-access-wpxxz" (OuterVolumeSpecName: "kube-api-access-wpxxz") pod "b8031ced-8ac8-4318-87b1-2f22da0b05f1" (UID: "b8031ced-8ac8-4318-87b1-2f22da0b05f1"). InnerVolumeSpecName "kube-api-access-wpxxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.165633 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-config" (OuterVolumeSpecName: "config") pod "b8031ced-8ac8-4318-87b1-2f22da0b05f1" (UID: "b8031ced-8ac8-4318-87b1-2f22da0b05f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.165699 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8031ced-8ac8-4318-87b1-2f22da0b05f1" (UID: "b8031ced-8ac8-4318-87b1-2f22da0b05f1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.228721 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.228758 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpxxz\" (UniqueName: \"kubernetes.io/projected/b8031ced-8ac8-4318-87b1-2f22da0b05f1-kube-api-access-wpxxz\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.228772 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8031ced-8ac8-4318-87b1-2f22da0b05f1-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.518051 4939 generic.go:334] "Generic (PLEG): container finished" podID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerID="fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d" exitCode=0 Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.518343 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" event={"ID":"b8031ced-8ac8-4318-87b1-2f22da0b05f1","Type":"ContainerDied","Data":"fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d"} Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.520000 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" event={"ID":"b8031ced-8ac8-4318-87b1-2f22da0b05f1","Type":"ContainerDied","Data":"782ca91aa2991a34c0dba0c2cff85dee1db31ae22263875f67a89c502b77990b"} Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.520091 4939 scope.go:117] "RemoveContainer" containerID="fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.518465 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-t4ggw" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.547376 4939 scope.go:117] "RemoveContainer" containerID="da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.561519 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-t4ggw"] Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.569002 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-t4ggw"] Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.602736 4939 scope.go:117] "RemoveContainer" containerID="fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d" Mar 18 17:05:53 crc kubenswrapper[4939]: E0318 17:05:53.603118 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d\": container with ID starting with fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d not found: ID does not exist" containerID="fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.603165 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d"} err="failed to get container status \"fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d\": rpc error: code = NotFound desc = could not find container \"fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d\": container with ID starting with fe42eb9f9f30fc574c4851c6af365d6c3a5b2246e816c5bde661beed0ec70a3d not found: ID does not exist" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.603197 4939 scope.go:117] "RemoveContainer" containerID="da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff" Mar 18 17:05:53 crc kubenswrapper[4939]: E0318 17:05:53.603550 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff\": container with ID starting with da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff not found: ID does not exist" containerID="da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff" Mar 18 17:05:53 crc kubenswrapper[4939]: I0318 17:05:53.603586 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff"} err="failed to get container status \"da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff\": rpc error: code = NotFound desc = could not find container \"da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff\": container with ID starting with da00db7c60ff889413190d3dda3bdcb364da1da462f11ba0197268cd54da45ff not found: ID does not exist" Mar 18 17:05:54 crc kubenswrapper[4939]: I0318 17:05:54.142039 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" path="/var/lib/kubelet/pods/b8031ced-8ac8-4318-87b1-2f22da0b05f1/volumes" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.285346 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-d4wlf"] Mar 18 17:05:56 crc kubenswrapper[4939]: E0318 17:05:56.286171 4939 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerName="init" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.286192 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerName="init" Mar 18 17:05:56 crc kubenswrapper[4939]: E0318 17:05:56.286211 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerName="dnsmasq-dns" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.286221 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerName="dnsmasq-dns" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.286391 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8031ced-8ac8-4318-87b1-2f22da0b05f1" containerName="dnsmasq-dns" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.287059 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.298136 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r5h6\" (UniqueName: \"kubernetes.io/projected/9bb72637-02c0-44b4-919a-8691529c611a-kube-api-access-4r5h6\") pod \"keystone-db-create-d4wlf\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.298274 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb72637-02c0-44b4-919a-8691529c611a-operator-scripts\") pod \"keystone-db-create-d4wlf\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.300722 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-d4wlf"] Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.379604 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0a8b-account-create-update-rqz88"] Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.381193 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.384167 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.399853 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b01e44d-3e47-4d5a-9372-cda997282916-operator-scripts\") pod \"keystone-0a8b-account-create-update-rqz88\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.399918 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlt7n\" (UniqueName: \"kubernetes.io/projected/4b01e44d-3e47-4d5a-9372-cda997282916-kube-api-access-wlt7n\") pod \"keystone-0a8b-account-create-update-rqz88\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.399968 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb72637-02c0-44b4-919a-8691529c611a-operator-scripts\") pod \"keystone-db-create-d4wlf\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.400075 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r5h6\" (UniqueName: \"kubernetes.io/projected/9bb72637-02c0-44b4-919a-8691529c611a-kube-api-access-4r5h6\") pod \"keystone-db-create-d4wlf\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.401405 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb72637-02c0-44b4-919a-8691529c611a-operator-scripts\") pod \"keystone-db-create-d4wlf\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.422040 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a8b-account-create-update-rqz88"] Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.436920 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r5h6\" (UniqueName: \"kubernetes.io/projected/9bb72637-02c0-44b4-919a-8691529c611a-kube-api-access-4r5h6\") pod \"keystone-db-create-d4wlf\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.502640 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b01e44d-3e47-4d5a-9372-cda997282916-operator-scripts\") pod \"keystone-0a8b-account-create-update-rqz88\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.503322 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlt7n\" (UniqueName: \"kubernetes.io/projected/4b01e44d-3e47-4d5a-9372-cda997282916-kube-api-access-wlt7n\") pod 
\"keystone-0a8b-account-create-update-rqz88\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.503360 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b01e44d-3e47-4d5a-9372-cda997282916-operator-scripts\") pod \"keystone-0a8b-account-create-update-rqz88\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.520015 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlt7n\" (UniqueName: \"kubernetes.io/projected/4b01e44d-3e47-4d5a-9372-cda997282916-kube-api-access-wlt7n\") pod \"keystone-0a8b-account-create-update-rqz88\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.657616 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:56 crc kubenswrapper[4939]: I0318 17:05:56.712803 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.119017 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-d4wlf"] Mar 18 17:05:57 crc kubenswrapper[4939]: W0318 17:05:57.129801 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb72637_02c0_44b4_919a_8691529c611a.slice/crio-9bb979e301ae07975c5378feffbc905092fcfc409ac2d672a928b55dd30f3b82 WatchSource:0}: Error finding container 9bb979e301ae07975c5378feffbc905092fcfc409ac2d672a928b55dd30f3b82: Status 404 returned error can't find the container with id 9bb979e301ae07975c5378feffbc905092fcfc409ac2d672a928b55dd30f3b82 Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.227910 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a8b-account-create-update-rqz88"] Mar 18 17:05:57 crc kubenswrapper[4939]: W0318 17:05:57.233577 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b01e44d_3e47_4d5a_9372_cda997282916.slice/crio-639ba682c6fcc582fea67a885bd64cdcbbd1aaf586f47581a9d5858da090a64e WatchSource:0}: Error finding container 639ba682c6fcc582fea67a885bd64cdcbbd1aaf586f47581a9d5858da090a64e: Status 404 returned error can't find the container with id 639ba682c6fcc582fea67a885bd64cdcbbd1aaf586f47581a9d5858da090a64e Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.555268 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a8b-account-create-update-rqz88" event={"ID":"4b01e44d-3e47-4d5a-9372-cda997282916","Type":"ContainerStarted","Data":"d4ebffa938d8b676cc2e352530bc3f2bd24bd79a295ac265382cc747f41d6033"} Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.555355 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a8b-account-create-update-rqz88" event={"ID":"4b01e44d-3e47-4d5a-9372-cda997282916","Type":"ContainerStarted","Data":"639ba682c6fcc582fea67a885bd64cdcbbd1aaf586f47581a9d5858da090a64e"} Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.557864 4939 generic.go:334] "Generic (PLEG): container 
finished" podID="9bb72637-02c0-44b4-919a-8691529c611a" containerID="5e57776cb9aec52c6dfd9f5bfb920ed1a3fc5796247636b09c2901087810fa59" exitCode=0 Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.557902 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-d4wlf" event={"ID":"9bb72637-02c0-44b4-919a-8691529c611a","Type":"ContainerDied","Data":"5e57776cb9aec52c6dfd9f5bfb920ed1a3fc5796247636b09c2901087810fa59"} Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.557984 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-d4wlf" event={"ID":"9bb72637-02c0-44b4-919a-8691529c611a","Type":"ContainerStarted","Data":"9bb979e301ae07975c5378feffbc905092fcfc409ac2d672a928b55dd30f3b82"} Mar 18 17:05:57 crc kubenswrapper[4939]: I0318 17:05:57.571472 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0a8b-account-create-update-rqz88" podStartSLOduration=1.571455706 podStartE2EDuration="1.571455706s" podCreationTimestamp="2026-03-18 17:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:57.569576043 +0000 UTC m=+5322.168763664" watchObservedRunningTime="2026-03-18 17:05:57.571455706 +0000 UTC m=+5322.170643327" Mar 18 17:05:58 crc kubenswrapper[4939]: I0318 17:05:58.566047 4939 generic.go:334] "Generic (PLEG): container finished" podID="4b01e44d-3e47-4d5a-9372-cda997282916" containerID="d4ebffa938d8b676cc2e352530bc3f2bd24bd79a295ac265382cc747f41d6033" exitCode=0 Mar 18 17:05:58 crc kubenswrapper[4939]: I0318 17:05:58.566126 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a8b-account-create-update-rqz88" event={"ID":"4b01e44d-3e47-4d5a-9372-cda997282916","Type":"ContainerDied","Data":"d4ebffa938d8b676cc2e352530bc3f2bd24bd79a295ac265382cc747f41d6033"} Mar 18 17:05:58 crc kubenswrapper[4939]: I0318 17:05:58.922989 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:58 crc kubenswrapper[4939]: I0318 17:05:58.942339 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb72637-02c0-44b4-919a-8691529c611a-operator-scripts\") pod \"9bb72637-02c0-44b4-919a-8691529c611a\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " Mar 18 17:05:58 crc kubenswrapper[4939]: I0318 17:05:58.942426 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r5h6\" (UniqueName: \"kubernetes.io/projected/9bb72637-02c0-44b4-919a-8691529c611a-kube-api-access-4r5h6\") pod \"9bb72637-02c0-44b4-919a-8691529c611a\" (UID: \"9bb72637-02c0-44b4-919a-8691529c611a\") " Mar 18 17:05:58 crc kubenswrapper[4939]: I0318 17:05:58.952128 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb72637-02c0-44b4-919a-8691529c611a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9bb72637-02c0-44b4-919a-8691529c611a" (UID: "9bb72637-02c0-44b4-919a-8691529c611a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:05:58 crc kubenswrapper[4939]: I0318 17:05:58.952829 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb72637-02c0-44b4-919a-8691529c611a-kube-api-access-4r5h6" (OuterVolumeSpecName: "kube-api-access-4r5h6") pod "9bb72637-02c0-44b4-919a-8691529c611a" (UID: "9bb72637-02c0-44b4-919a-8691529c611a"). InnerVolumeSpecName "kube-api-access-4r5h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.044063 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb72637-02c0-44b4-919a-8691529c611a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.044107 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r5h6\" (UniqueName: \"kubernetes.io/projected/9bb72637-02c0-44b4-919a-8691529c611a-kube-api-access-4r5h6\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.133379 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:05:59 crc kubenswrapper[4939]: E0318 17:05:59.133698 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.625168 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-d4wlf" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.627608 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-d4wlf" event={"ID":"9bb72637-02c0-44b4-919a-8691529c611a","Type":"ContainerDied","Data":"9bb979e301ae07975c5378feffbc905092fcfc409ac2d672a928b55dd30f3b82"} Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.627693 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bb979e301ae07975c5378feffbc905092fcfc409ac2d672a928b55dd30f3b82" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.930407 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.962962 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlt7n\" (UniqueName: \"kubernetes.io/projected/4b01e44d-3e47-4d5a-9372-cda997282916-kube-api-access-wlt7n\") pod \"4b01e44d-3e47-4d5a-9372-cda997282916\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.963012 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b01e44d-3e47-4d5a-9372-cda997282916-operator-scripts\") pod \"4b01e44d-3e47-4d5a-9372-cda997282916\" (UID: \"4b01e44d-3e47-4d5a-9372-cda997282916\") " Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.963872 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b01e44d-3e47-4d5a-9372-cda997282916-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b01e44d-3e47-4d5a-9372-cda997282916" (UID: "4b01e44d-3e47-4d5a-9372-cda997282916"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:05:59 crc kubenswrapper[4939]: I0318 17:05:59.967526 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b01e44d-3e47-4d5a-9372-cda997282916-kube-api-access-wlt7n" (OuterVolumeSpecName: "kube-api-access-wlt7n") pod "4b01e44d-3e47-4d5a-9372-cda997282916" (UID: "4b01e44d-3e47-4d5a-9372-cda997282916"). InnerVolumeSpecName "kube-api-access-wlt7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.065789 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlt7n\" (UniqueName: \"kubernetes.io/projected/4b01e44d-3e47-4d5a-9372-cda997282916-kube-api-access-wlt7n\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.065830 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b01e44d-3e47-4d5a-9372-cda997282916-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.134210 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564226-pfx2h"] Mar 18 17:06:00 crc kubenswrapper[4939]: E0318 17:06:00.134612 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb72637-02c0-44b4-919a-8691529c611a" containerName="mariadb-database-create" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.134630 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb72637-02c0-44b4-919a-8691529c611a" containerName="mariadb-database-create" Mar 18 17:06:00 crc kubenswrapper[4939]: E0318 17:06:00.134654 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b01e44d-3e47-4d5a-9372-cda997282916" containerName="mariadb-account-create-update" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.134663 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b01e44d-3e47-4d5a-9372-cda997282916" containerName="mariadb-account-create-update" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.134874 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb72637-02c0-44b4-919a-8691529c611a" containerName="mariadb-database-create" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 
17:06:00.134900 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b01e44d-3e47-4d5a-9372-cda997282916" containerName="mariadb-account-create-update" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.135829 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.138022 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.139018 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.141852 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.155240 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-pfx2h"] Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.171370 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67fbb\" (UniqueName: \"kubernetes.io/projected/bec9676a-ae7b-49f7-9edf-d1d8097ca445-kube-api-access-67fbb\") pod \"auto-csr-approver-29564226-pfx2h\" (UID: \"bec9676a-ae7b-49f7-9edf-d1d8097ca445\") " pod="openshift-infra/auto-csr-approver-29564226-pfx2h" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.274169 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67fbb\" (UniqueName: \"kubernetes.io/projected/bec9676a-ae7b-49f7-9edf-d1d8097ca445-kube-api-access-67fbb\") pod \"auto-csr-approver-29564226-pfx2h\" (UID: \"bec9676a-ae7b-49f7-9edf-d1d8097ca445\") " pod="openshift-infra/auto-csr-approver-29564226-pfx2h" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.293193 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67fbb\" (UniqueName: \"kubernetes.io/projected/bec9676a-ae7b-49f7-9edf-d1d8097ca445-kube-api-access-67fbb\") pod \"auto-csr-approver-29564226-pfx2h\" (UID: \"bec9676a-ae7b-49f7-9edf-d1d8097ca445\") " pod="openshift-infra/auto-csr-approver-29564226-pfx2h" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.455532 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.619753 4939 scope.go:117] "RemoveContainer" containerID="a49ef056d058ce9b0b6dff21d06f4cc75f7214945b51137dae003a0c8dd8d56e" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.636064 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a8b-account-create-update-rqz88" event={"ID":"4b01e44d-3e47-4d5a-9372-cda997282916","Type":"ContainerDied","Data":"639ba682c6fcc582fea67a885bd64cdcbbd1aaf586f47581a9d5858da090a64e"} Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.636136 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639ba682c6fcc582fea67a885bd64cdcbbd1aaf586f47581a9d5858da090a64e" Mar 18 17:06:00 crc kubenswrapper[4939]: I0318 17:06:00.636241 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a8b-account-create-update-rqz88" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.075468 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-pfx2h"] Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.082808 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.645551 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" event={"ID":"bec9676a-ae7b-49f7-9edf-d1d8097ca445","Type":"ContainerStarted","Data":"a9f689ee3e8e4408f5be85fc1208ce595b15ddd5dc007a1937485915f20441bf"} Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.768444 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-g97xg"] Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.769867 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.771820 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.772793 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.772992 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.777960 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mhljb" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.783235 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g97xg"] Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.928225 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-config-data\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.928310 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-combined-ca-bundle\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:01 crc kubenswrapper[4939]: I0318 17:06:01.928381 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7pn8\" (UniqueName: \"kubernetes.io/projected/29d05450-1c85-4e2a-980e-ae6abbc2fd42-kube-api-access-q7pn8\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.030206 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-combined-ca-bundle\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.030312 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7pn8\" (UniqueName: \"kubernetes.io/projected/29d05450-1c85-4e2a-980e-ae6abbc2fd42-kube-api-access-q7pn8\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.030423 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-config-data\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.036322 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-config-data\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.036434 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-combined-ca-bundle\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.046811 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7pn8\" (UniqueName: \"kubernetes.io/projected/29d05450-1c85-4e2a-980e-ae6abbc2fd42-kube-api-access-q7pn8\") pod \"keystone-db-sync-g97xg\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.135796 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.583948 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g97xg"] Mar 18 17:06:02 crc kubenswrapper[4939]: W0318 17:06:02.596161 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d05450_1c85_4e2a_980e_ae6abbc2fd42.slice/crio-9bfc46b5e03b20e7a4ddf7be35b21bd695957c5af1486dae4fdb2ebab8c4b4d5 WatchSource:0}: Error finding container 9bfc46b5e03b20e7a4ddf7be35b21bd695957c5af1486dae4fdb2ebab8c4b4d5: Status 404 returned error can't find the container with id 9bfc46b5e03b20e7a4ddf7be35b21bd695957c5af1486dae4fdb2ebab8c4b4d5 Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.655398 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" event={"ID":"bec9676a-ae7b-49f7-9edf-d1d8097ca445","Type":"ContainerStarted","Data":"fb119c65e6691f9abdfd0382948b6f871e3393574cf4c4365f4e1be111d4bf99"} Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.658796 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g97xg" event={"ID":"29d05450-1c85-4e2a-980e-ae6abbc2fd42","Type":"ContainerStarted","Data":"9bfc46b5e03b20e7a4ddf7be35b21bd695957c5af1486dae4fdb2ebab8c4b4d5"} Mar 18 17:06:02 crc kubenswrapper[4939]: I0318 17:06:02.691879 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" podStartSLOduration=1.432582787 podStartE2EDuration="2.691856166s" podCreationTimestamp="2026-03-18 17:06:00 +0000 UTC" firstStartedPulling="2026-03-18 17:06:01.082411679 +0000 UTC m=+5325.681599300" lastFinishedPulling="2026-03-18 17:06:02.341685058 +0000 UTC m=+5326.940872679" observedRunningTime="2026-03-18 17:06:02.67576615 +0000 UTC m=+5327.274953771" watchObservedRunningTime="2026-03-18 17:06:02.691856166 +0000 UTC m=+5327.291043787" Mar 18 17:06:03 crc kubenswrapper[4939]: I0318 17:06:03.667563 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g97xg" event={"ID":"29d05450-1c85-4e2a-980e-ae6abbc2fd42","Type":"ContainerStarted","Data":"f6c3c0e5583adf4acfcef5b73e9ec4520dc7fc613f44f0f90425361a8bafe69a"} Mar 18 17:06:03 crc kubenswrapper[4939]: I0318 17:06:03.669355 4939 generic.go:334] "Generic (PLEG): container finished" podID="bec9676a-ae7b-49f7-9edf-d1d8097ca445" containerID="fb119c65e6691f9abdfd0382948b6f871e3393574cf4c4365f4e1be111d4bf99" exitCode=0 Mar 18 17:06:03 crc kubenswrapper[4939]: I0318 17:06:03.669385 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" event={"ID":"bec9676a-ae7b-49f7-9edf-d1d8097ca445","Type":"ContainerDied","Data":"fb119c65e6691f9abdfd0382948b6f871e3393574cf4c4365f4e1be111d4bf99"} Mar 18 17:06:03 crc kubenswrapper[4939]: I0318 17:06:03.689947 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-g97xg" podStartSLOduration=2.689927561 podStartE2EDuration="2.689927561s" podCreationTimestamp="2026-03-18 17:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:03.681111601 +0000 UTC m=+5328.280299232" watchObservedRunningTime="2026-03-18 17:06:03.689927561 +0000 UTC m=+5328.289115182" Mar 18 17:06:04 crc kubenswrapper[4939]: I0318 17:06:04.680445 4939 
generic.go:334] "Generic (PLEG): container finished" podID="29d05450-1c85-4e2a-980e-ae6abbc2fd42" containerID="f6c3c0e5583adf4acfcef5b73e9ec4520dc7fc613f44f0f90425361a8bafe69a" exitCode=0 Mar 18 17:06:04 crc kubenswrapper[4939]: I0318 17:06:04.680526 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g97xg" event={"ID":"29d05450-1c85-4e2a-980e-ae6abbc2fd42","Type":"ContainerDied","Data":"f6c3c0e5583adf4acfcef5b73e9ec4520dc7fc613f44f0f90425361a8bafe69a"} Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.033678 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.090004 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67fbb\" (UniqueName: \"kubernetes.io/projected/bec9676a-ae7b-49f7-9edf-d1d8097ca445-kube-api-access-67fbb\") pod \"bec9676a-ae7b-49f7-9edf-d1d8097ca445\" (UID: \"bec9676a-ae7b-49f7-9edf-d1d8097ca445\") " Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.096860 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec9676a-ae7b-49f7-9edf-d1d8097ca445-kube-api-access-67fbb" (OuterVolumeSpecName: "kube-api-access-67fbb") pod "bec9676a-ae7b-49f7-9edf-d1d8097ca445" (UID: "bec9676a-ae7b-49f7-9edf-d1d8097ca445"). InnerVolumeSpecName "kube-api-access-67fbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.192834 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67fbb\" (UniqueName: \"kubernetes.io/projected/bec9676a-ae7b-49f7-9edf-d1d8097ca445-kube-api-access-67fbb\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.690687 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" event={"ID":"bec9676a-ae7b-49f7-9edf-d1d8097ca445","Type":"ContainerDied","Data":"a9f689ee3e8e4408f5be85fc1208ce595b15ddd5dc007a1937485915f20441bf"} Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.690748 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f689ee3e8e4408f5be85fc1208ce595b15ddd5dc007a1937485915f20441bf" Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.690700 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-pfx2h" Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.757394 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-l5rl6"] Mar 18 17:06:05 crc kubenswrapper[4939]: I0318 17:06:05.764123 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-l5rl6"] Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.019200 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.110275 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7pn8\" (UniqueName: \"kubernetes.io/projected/29d05450-1c85-4e2a-980e-ae6abbc2fd42-kube-api-access-q7pn8\") pod \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.110323 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-config-data\") pod \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.110406 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-combined-ca-bundle\") pod \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\" (UID: \"29d05450-1c85-4e2a-980e-ae6abbc2fd42\") " Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.117095 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d05450-1c85-4e2a-980e-ae6abbc2fd42-kube-api-access-q7pn8" (OuterVolumeSpecName: "kube-api-access-q7pn8") pod "29d05450-1c85-4e2a-980e-ae6abbc2fd42" (UID: "29d05450-1c85-4e2a-980e-ae6abbc2fd42"). InnerVolumeSpecName "kube-api-access-q7pn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.135619 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29d05450-1c85-4e2a-980e-ae6abbc2fd42" (UID: "29d05450-1c85-4e2a-980e-ae6abbc2fd42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.148023 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d6b9b7-dfb2-43a2-b271-521a40d3f8fd" path="/var/lib/kubelet/pods/92d6b9b7-dfb2-43a2-b271-521a40d3f8fd/volumes" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.157422 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-config-data" (OuterVolumeSpecName: "config-data") pod "29d05450-1c85-4e2a-980e-ae6abbc2fd42" (UID: "29d05450-1c85-4e2a-980e-ae6abbc2fd42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.211845 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.211889 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7pn8\" (UniqueName: \"kubernetes.io/projected/29d05450-1c85-4e2a-980e-ae6abbc2fd42-kube-api-access-q7pn8\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.211900 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d05450-1c85-4e2a-980e-ae6abbc2fd42-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.699093 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g97xg" event={"ID":"29d05450-1c85-4e2a-980e-ae6abbc2fd42","Type":"ContainerDied","Data":"9bfc46b5e03b20e7a4ddf7be35b21bd695957c5af1486dae4fdb2ebab8c4b4d5"} Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.699144 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g97xg" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.699141 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bfc46b5e03b20e7a4ddf7be35b21bd695957c5af1486dae4fdb2ebab8c4b4d5" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.886772 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tvprb"] Mar 18 17:06:06 crc kubenswrapper[4939]: E0318 17:06:06.887205 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec9676a-ae7b-49f7-9edf-d1d8097ca445" containerName="oc" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.887231 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec9676a-ae7b-49f7-9edf-d1d8097ca445" containerName="oc" Mar 18 17:06:06 crc kubenswrapper[4939]: E0318 17:06:06.887259 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d05450-1c85-4e2a-980e-ae6abbc2fd42" containerName="keystone-db-sync" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.887268 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d05450-1c85-4e2a-980e-ae6abbc2fd42" containerName="keystone-db-sync" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.887465 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d05450-1c85-4e2a-980e-ae6abbc2fd42" containerName="keystone-db-sync" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.887493 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec9676a-ae7b-49f7-9edf-d1d8097ca445" containerName="oc" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.888102 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.898113 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-7m2l7"] Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.898135 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.898713 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mhljb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.898844 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.908204 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.908943 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.911245 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925088 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-combined-ca-bundle\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925179 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-fernet-keys\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925228 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v77kc\" (UniqueName: \"kubernetes.io/projected/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-kube-api-access-v77kc\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925264 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-config-data\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925309 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-config\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925361 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-scripts\") pod \"keystone-bootstrap-tvprb\" (UID: 
\"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925424 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-sb\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925466 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-credential-keys\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925524 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mzrn\" (UniqueName: \"kubernetes.io/projected/bbe83749-4886-4e24-a2ce-32d7985a522f-kube-api-access-9mzrn\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925561 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-dns-svc\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.925589 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-nb\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.958556 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tvprb"] Mar 18 17:06:06 crc kubenswrapper[4939]: I0318 17:06:06.972564 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-7m2l7"] Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031099 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-sb\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031182 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-credential-keys\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031247 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mzrn\" (UniqueName: \"kubernetes.io/projected/bbe83749-4886-4e24-a2ce-32d7985a522f-kube-api-access-9mzrn\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " 
pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031294 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-dns-svc\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031318 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-nb\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031396 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-combined-ca-bundle\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031469 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-fernet-keys\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031538 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v77kc\" (UniqueName: \"kubernetes.io/projected/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-kube-api-access-v77kc\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031581 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-config-data\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031650 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-config\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.031770 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-scripts\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.032333 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-sb\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.032470 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-dns-svc\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.032475 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-nb\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.033022 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-config\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.035273 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-credential-keys\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.038455 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-combined-ca-bundle\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.042424 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-config-data\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.043093 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-scripts\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.053397 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-fernet-keys\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.057693 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mzrn\" (UniqueName: \"kubernetes.io/projected/bbe83749-4886-4e24-a2ce-32d7985a522f-kube-api-access-9mzrn\") pod \"keystone-bootstrap-tvprb\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.060085 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v77kc\" (UniqueName: \"kubernetes.io/projected/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-kube-api-access-v77kc\") pod \"dnsmasq-dns-7c78d97f89-7m2l7\" (UID: 
\"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") " pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.215158 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.245337 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.715064 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tvprb"] Mar 18 17:06:07 crc kubenswrapper[4939]: W0318 17:06:07.715226 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbe83749_4886_4e24_a2ce_32d7985a522f.slice/crio-0b60ecf3804e1881cd3882b29d50725cedef21adf1f77d9d79a56481d83b58cb WatchSource:0}: Error finding container 0b60ecf3804e1881cd3882b29d50725cedef21adf1f77d9d79a56481d83b58cb: Status 404 returned error can't find the container with id 0b60ecf3804e1881cd3882b29d50725cedef21adf1f77d9d79a56481d83b58cb Mar 18 17:06:07 crc kubenswrapper[4939]: I0318 17:06:07.829638 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-7m2l7"] Mar 18 17:06:07 crc kubenswrapper[4939]: W0318 17:06:07.833102 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c9c59f0_a37f_4f2c_bcde_c5197897d8a5.slice/crio-57bd78511837b97261bff27d3b721248932ccd8e43f24e84926f974d82aaec16 WatchSource:0}: Error finding container 57bd78511837b97261bff27d3b721248932ccd8e43f24e84926f974d82aaec16: Status 404 returned error can't find the container with id 57bd78511837b97261bff27d3b721248932ccd8e43f24e84926f974d82aaec16 Mar 18 17:06:08 crc kubenswrapper[4939]: I0318 17:06:08.715953 4939 generic.go:334] "Generic (PLEG): container finished" podID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerID="9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af" exitCode=0 Mar 18 17:06:08 crc kubenswrapper[4939]: I0318 17:06:08.716068 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" event={"ID":"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5","Type":"ContainerDied","Data":"9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af"} Mar 18 17:06:08 crc kubenswrapper[4939]: I0318 17:06:08.716578 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" event={"ID":"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5","Type":"ContainerStarted","Data":"57bd78511837b97261bff27d3b721248932ccd8e43f24e84926f974d82aaec16"} Mar 18 17:06:08 crc kubenswrapper[4939]: I0318 17:06:08.718636 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvprb" event={"ID":"bbe83749-4886-4e24-a2ce-32d7985a522f","Type":"ContainerStarted","Data":"0e5d1e643048bffb2d44c4de9d7b8849146978a3e1d3724c105188668e7c84af"} Mar 18 17:06:08 crc kubenswrapper[4939]: I0318 17:06:08.718659 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvprb" event={"ID":"bbe83749-4886-4e24-a2ce-32d7985a522f","Type":"ContainerStarted","Data":"0b60ecf3804e1881cd3882b29d50725cedef21adf1f77d9d79a56481d83b58cb"} Mar 18 17:06:08 crc kubenswrapper[4939]: I0318 17:06:08.798216 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tvprb" 
podStartSLOduration=2.7981991280000003 podStartE2EDuration="2.798199128s" podCreationTimestamp="2026-03-18 17:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:08.771947023 +0000 UTC m=+5333.371134644" watchObservedRunningTime="2026-03-18 17:06:08.798199128 +0000 UTC m=+5333.397386749" Mar 18 17:06:09 crc kubenswrapper[4939]: I0318 17:06:09.727623 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" event={"ID":"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5","Type":"ContainerStarted","Data":"86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1"} Mar 18 17:06:09 crc kubenswrapper[4939]: I0318 17:06:09.755030 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" podStartSLOduration=3.755011493 podStartE2EDuration="3.755011493s" podCreationTimestamp="2026-03-18 17:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:09.753316275 +0000 UTC m=+5334.352503906" watchObservedRunningTime="2026-03-18 17:06:09.755011493 +0000 UTC m=+5334.354199114" Mar 18 17:06:10 crc kubenswrapper[4939]: I0318 17:06:10.735498 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" Mar 18 17:06:11 crc kubenswrapper[4939]: I0318 17:06:11.134084 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:06:11 crc kubenswrapper[4939]: E0318 17:06:11.134438 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:06:11 crc kubenswrapper[4939]: I0318 17:06:11.669831 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 17:06:11 crc kubenswrapper[4939]: I0318 17:06:11.744851 4939 generic.go:334] "Generic (PLEG): container finished" podID="bbe83749-4886-4e24-a2ce-32d7985a522f" containerID="0e5d1e643048bffb2d44c4de9d7b8849146978a3e1d3724c105188668e7c84af" exitCode=0 Mar 18 17:06:11 crc kubenswrapper[4939]: I0318 17:06:11.744938 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvprb" event={"ID":"bbe83749-4886-4e24-a2ce-32d7985a522f","Type":"ContainerDied","Data":"0e5d1e643048bffb2d44c4de9d7b8849146978a3e1d3724c105188668e7c84af"} Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.005230 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4pjqz"] Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.007265 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.021933 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4pjqz"] Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.065498 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-utilities\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.065622 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbt89\" (UniqueName: \"kubernetes.io/projected/d46a771a-1bd1-4216-ae29-3733fce0f2c8-kube-api-access-gbt89\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.065653 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-catalog-content\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.113559 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.166551 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-credential-keys\") pod \"bbe83749-4886-4e24-a2ce-32d7985a522f\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.166668 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-config-data\") pod \"bbe83749-4886-4e24-a2ce-32d7985a522f\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.166709 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-combined-ca-bundle\") pod \"bbe83749-4886-4e24-a2ce-32d7985a522f\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.166749 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-fernet-keys\") pod \"bbe83749-4886-4e24-a2ce-32d7985a522f\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.166792 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-scripts\") pod \"bbe83749-4886-4e24-a2ce-32d7985a522f\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.166841 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-9mzrn\" (UniqueName: \"kubernetes.io/projected/bbe83749-4886-4e24-a2ce-32d7985a522f-kube-api-access-9mzrn\") pod \"bbe83749-4886-4e24-a2ce-32d7985a522f\" (UID: \"bbe83749-4886-4e24-a2ce-32d7985a522f\") " Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.167044 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-utilities\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.167136 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbt89\" (UniqueName: \"kubernetes.io/projected/d46a771a-1bd1-4216-ae29-3733fce0f2c8-kube-api-access-gbt89\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.167176 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-catalog-content\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.167747 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-catalog-content\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.168231 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-utilities\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.183836 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-scripts" (OuterVolumeSpecName: "scripts") pod "bbe83749-4886-4e24-a2ce-32d7985a522f" (UID: "bbe83749-4886-4e24-a2ce-32d7985a522f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.183864 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe83749-4886-4e24-a2ce-32d7985a522f-kube-api-access-9mzrn" (OuterVolumeSpecName: "kube-api-access-9mzrn") pod "bbe83749-4886-4e24-a2ce-32d7985a522f" (UID: "bbe83749-4886-4e24-a2ce-32d7985a522f"). InnerVolumeSpecName "kube-api-access-9mzrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.183897 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bbe83749-4886-4e24-a2ce-32d7985a522f" (UID: "bbe83749-4886-4e24-a2ce-32d7985a522f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.183940 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bbe83749-4886-4e24-a2ce-32d7985a522f" (UID: "bbe83749-4886-4e24-a2ce-32d7985a522f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.195563 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbt89\" (UniqueName: \"kubernetes.io/projected/d46a771a-1bd1-4216-ae29-3733fce0f2c8-kube-api-access-gbt89\") pod \"certified-operators-4pjqz\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") " pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.195609 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbe83749-4886-4e24-a2ce-32d7985a522f" (UID: "bbe83749-4886-4e24-a2ce-32d7985a522f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.195770 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-config-data" (OuterVolumeSpecName: "config-data") pod "bbe83749-4886-4e24-a2ce-32d7985a522f" (UID: "bbe83749-4886-4e24-a2ce-32d7985a522f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.269127 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.269168 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mzrn\" (UniqueName: \"kubernetes.io/projected/bbe83749-4886-4e24-a2ce-32d7985a522f-kube-api-access-9mzrn\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.269181 4939 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.269191 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.269201 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.269209 4939 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbe83749-4886-4e24-a2ce-32d7985a522f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.329373 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4pjqz" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.767143 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvprb" event={"ID":"bbe83749-4886-4e24-a2ce-32d7985a522f","Type":"ContainerDied","Data":"0b60ecf3804e1881cd3882b29d50725cedef21adf1f77d9d79a56481d83b58cb"} Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.767747 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b60ecf3804e1881cd3882b29d50725cedef21adf1f77d9d79a56481d83b58cb" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.767223 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tvprb" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.795681 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4pjqz"] Mar 18 17:06:13 crc kubenswrapper[4939]: W0318 17:06:13.802038 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46a771a_1bd1_4216_ae29_3733fce0f2c8.slice/crio-2409299d0f5a0c3dd9e81818d169d5940c594ec877313460c3d37c59cafcb836 WatchSource:0}: Error finding container 2409299d0f5a0c3dd9e81818d169d5940c594ec877313460c3d37c59cafcb836: Status 404 returned error can't find the container with id 2409299d0f5a0c3dd9e81818d169d5940c594ec877313460c3d37c59cafcb836 Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.852009 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tvprb"] Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.862814 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tvprb"] Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.942572 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xwd4g"] Mar 18 17:06:13 crc kubenswrapper[4939]: E0318 17:06:13.943284 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe83749-4886-4e24-a2ce-32d7985a522f" containerName="keystone-bootstrap" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.943391 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe83749-4886-4e24-a2ce-32d7985a522f" containerName="keystone-bootstrap" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.943679 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe83749-4886-4e24-a2ce-32d7985a522f" containerName="keystone-bootstrap" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.944628 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.949860 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.950093 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.950102 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.950593 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.950704 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mhljb" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.963057 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xwd4g"] Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.984942 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-config-data\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.985038 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm55f\" (UniqueName: \"kubernetes.io/projected/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-kube-api-access-mm55f\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.985087 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-scripts\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.985128 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-combined-ca-bundle\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.985164 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-credential-keys\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:13 crc kubenswrapper[4939]: I0318 17:06:13.985203 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-fernet-keys\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.086629 4939 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-fernet-keys\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.086710 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-config-data\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.086759 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm55f\" (UniqueName: \"kubernetes.io/projected/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-kube-api-access-mm55f\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.086789 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-scripts\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.086824 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-combined-ca-bundle\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.086849 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-credential-keys\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.092432 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-combined-ca-bundle\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.092627 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-config-data\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.092689 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-scripts\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.093007 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-credential-keys\") pod \"keystone-bootstrap-xwd4g\" (UID: 
\"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.093137 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-fernet-keys\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.103562 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm55f\" (UniqueName: \"kubernetes.io/projected/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-kube-api-access-mm55f\") pod \"keystone-bootstrap-xwd4g\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.142170 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe83749-4886-4e24-a2ce-32d7985a522f" path="/var/lib/kubelet/pods/bbe83749-4886-4e24-a2ce-32d7985a522f/volumes" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.267663 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.692430 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xwd4g"] Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.778085 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwd4g" event={"ID":"fec22beb-3747-46a2-8eec-a2cf8ac77a0f","Type":"ContainerStarted","Data":"8f13f8bfb9cd02a2ffcfbd6b7ee4169d411cc33e8cd9108d98f0af47c183efd9"} Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.780621 4939 generic.go:334] "Generic (PLEG): container finished" podID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerID="2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed" exitCode=0 Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.780701 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4pjqz" event={"ID":"d46a771a-1bd1-4216-ae29-3733fce0f2c8","Type":"ContainerDied","Data":"2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed"} Mar 18 17:06:14 crc kubenswrapper[4939]: I0318 17:06:14.780731 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4pjqz" event={"ID":"d46a771a-1bd1-4216-ae29-3733fce0f2c8","Type":"ContainerStarted","Data":"2409299d0f5a0c3dd9e81818d169d5940c594ec877313460c3d37c59cafcb836"} Mar 18 17:06:15 crc kubenswrapper[4939]: I0318 17:06:15.791751 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4pjqz" event={"ID":"d46a771a-1bd1-4216-ae29-3733fce0f2c8","Type":"ContainerStarted","Data":"063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3"} Mar 18 17:06:15 crc kubenswrapper[4939]: I0318 17:06:15.793799 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwd4g" event={"ID":"fec22beb-3747-46a2-8eec-a2cf8ac77a0f","Type":"ContainerStarted","Data":"38f1e62a3d057a060dd9306d1e98c0ccb4ac938d075be71053596b0e2bfa4d93"} Mar 18 17:06:15 crc kubenswrapper[4939]: I0318 17:06:15.833873 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xwd4g" podStartSLOduration=2.8338538140000002 podStartE2EDuration="2.833853814s" 
podCreationTimestamp="2026-03-18 17:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:15.825358333 +0000 UTC m=+5340.424545954" watchObservedRunningTime="2026-03-18 17:06:15.833853814 +0000 UTC m=+5340.433041435"
Mar 18 17:06:16 crc kubenswrapper[4939]: I0318 17:06:16.804711 4939 generic.go:334] "Generic (PLEG): container finished" podID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerID="063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3" exitCode=0
Mar 18 17:06:16 crc kubenswrapper[4939]: I0318 17:06:16.804811 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4pjqz" event={"ID":"d46a771a-1bd1-4216-ae29-3733fce0f2c8","Type":"ContainerDied","Data":"063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3"}
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.247713 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7"
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.329919 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-f96p8"]
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.330402 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerName="dnsmasq-dns" containerID="cri-o://15427c09b9481ce0a7cfb24176bb1863c7186e6fa3205aba6701446722978d93" gracePeriod=10
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.455098 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.47:5353: connect: connection refused"
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.817477 4939 generic.go:334] "Generic (PLEG): container finished" podID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerID="15427c09b9481ce0a7cfb24176bb1863c7186e6fa3205aba6701446722978d93" exitCode=0
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.817912 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" event={"ID":"819d0254-8d11-40e3-b54f-7a4f6b3c88b2","Type":"ContainerDied","Data":"15427c09b9481ce0a7cfb24176bb1863c7186e6fa3205aba6701446722978d93"}
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.818067 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" event={"ID":"819d0254-8d11-40e3-b54f-7a4f6b3c88b2","Type":"ContainerDied","Data":"5284ec3aecdf32ba126f92d9fabca20988e03fd45cbd362ebf91eff53b666cd4"}
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.818082 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5284ec3aecdf32ba126f92d9fabca20988e03fd45cbd362ebf91eff53b666cd4"
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.820110 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4pjqz" event={"ID":"d46a771a-1bd1-4216-ae29-3733fce0f2c8","Type":"ContainerStarted","Data":"fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b"}
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.839925 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8"
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.845835 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4pjqz" podStartSLOduration=3.362782053 podStartE2EDuration="5.845816464s" podCreationTimestamp="2026-03-18 17:06:12 +0000 UTC" firstStartedPulling="2026-03-18 17:06:14.785380977 +0000 UTC m=+5339.384568598" lastFinishedPulling="2026-03-18 17:06:17.268415388 +0000 UTC m=+5341.867603009" observedRunningTime="2026-03-18 17:06:17.844382213 +0000 UTC m=+5342.443569854" watchObservedRunningTime="2026-03-18 17:06:17.845816464 +0000 UTC m=+5342.445004085"
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.867484 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-dns-svc\") pod \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") "
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.867758 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-sb\") pod \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") "
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.867858 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-config\") pod \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") "
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.867909 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-nb\") pod \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") "
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.867988 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxcks\" (UniqueName: \"kubernetes.io/projected/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-kube-api-access-lxcks\") pod \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\" (UID: \"819d0254-8d11-40e3-b54f-7a4f6b3c88b2\") "
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.878466 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-kube-api-access-lxcks" (OuterVolumeSpecName: "kube-api-access-lxcks") pod "819d0254-8d11-40e3-b54f-7a4f6b3c88b2" (UID: "819d0254-8d11-40e3-b54f-7a4f6b3c88b2"). InnerVolumeSpecName "kube-api-access-lxcks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.936991 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "819d0254-8d11-40e3-b54f-7a4f6b3c88b2" (UID: "819d0254-8d11-40e3-b54f-7a4f6b3c88b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.940081 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "819d0254-8d11-40e3-b54f-7a4f6b3c88b2" (UID: "819d0254-8d11-40e3-b54f-7a4f6b3c88b2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.944204 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "819d0254-8d11-40e3-b54f-7a4f6b3c88b2" (UID: "819d0254-8d11-40e3-b54f-7a4f6b3c88b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.953990 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-config" (OuterVolumeSpecName: "config") pod "819d0254-8d11-40e3-b54f-7a4f6b3c88b2" (UID: "819d0254-8d11-40e3-b54f-7a4f6b3c88b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.969680 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.969721 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-config\") on node \"crc\" DevicePath \"\""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.969730 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.969742 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxcks\" (UniqueName: \"kubernetes.io/projected/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-kube-api-access-lxcks\") on node \"crc\" DevicePath \"\""
Mar 18 17:06:17 crc kubenswrapper[4939]: I0318 17:06:17.969754 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819d0254-8d11-40e3-b54f-7a4f6b3c88b2-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 17:06:18 crc kubenswrapper[4939]: I0318 17:06:18.830496 4939 generic.go:334] "Generic (PLEG): container finished" podID="fec22beb-3747-46a2-8eec-a2cf8ac77a0f" containerID="38f1e62a3d057a060dd9306d1e98c0ccb4ac938d075be71053596b0e2bfa4d93" exitCode=0
Mar 18 17:06:18 crc kubenswrapper[4939]: I0318 17:06:18.831143 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8"
Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-f96p8" Mar 18 17:06:18 crc kubenswrapper[4939]: I0318 17:06:18.830592 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwd4g" event={"ID":"fec22beb-3747-46a2-8eec-a2cf8ac77a0f","Type":"ContainerDied","Data":"38f1e62a3d057a060dd9306d1e98c0ccb4ac938d075be71053596b0e2bfa4d93"} Mar 18 17:06:18 crc kubenswrapper[4939]: I0318 17:06:18.871861 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-f96p8"] Mar 18 17:06:18 crc kubenswrapper[4939]: I0318 17:06:18.880745 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-f96p8"] Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.143148 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" path="/var/lib/kubelet/pods/819d0254-8d11-40e3-b54f-7a4f6b3c88b2/volumes" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.246332 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.311428 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-combined-ca-bundle\") pod \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.311554 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-credential-keys\") pod \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.311590 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-fernet-keys\") pod \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.311627 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-config-data\") pod \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.311659 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm55f\" (UniqueName: \"kubernetes.io/projected/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-kube-api-access-mm55f\") pod \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.311779 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-scripts\") pod \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\" (UID: \"fec22beb-3747-46a2-8eec-a2cf8ac77a0f\") " Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.318202 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fec22beb-3747-46a2-8eec-a2cf8ac77a0f" (UID: 
"fec22beb-3747-46a2-8eec-a2cf8ac77a0f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.318674 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-scripts" (OuterVolumeSpecName: "scripts") pod "fec22beb-3747-46a2-8eec-a2cf8ac77a0f" (UID: "fec22beb-3747-46a2-8eec-a2cf8ac77a0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.318799 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fec22beb-3747-46a2-8eec-a2cf8ac77a0f" (UID: "fec22beb-3747-46a2-8eec-a2cf8ac77a0f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.339603 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-kube-api-access-mm55f" (OuterVolumeSpecName: "kube-api-access-mm55f") pod "fec22beb-3747-46a2-8eec-a2cf8ac77a0f" (UID: "fec22beb-3747-46a2-8eec-a2cf8ac77a0f"). InnerVolumeSpecName "kube-api-access-mm55f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.342104 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-config-data" (OuterVolumeSpecName: "config-data") pod "fec22beb-3747-46a2-8eec-a2cf8ac77a0f" (UID: "fec22beb-3747-46a2-8eec-a2cf8ac77a0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.363881 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fec22beb-3747-46a2-8eec-a2cf8ac77a0f" (UID: "fec22beb-3747-46a2-8eec-a2cf8ac77a0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.413787 4939 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.413830 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.413844 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm55f\" (UniqueName: \"kubernetes.io/projected/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-kube-api-access-mm55f\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.413855 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.413863 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.413872 4939 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fec22beb-3747-46a2-8eec-a2cf8ac77a0f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.845998 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xwd4g" event={"ID":"fec22beb-3747-46a2-8eec-a2cf8ac77a0f","Type":"ContainerDied","Data":"8f13f8bfb9cd02a2ffcfbd6b7ee4169d411cc33e8cd9108d98f0af47c183efd9"} Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.846039 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f13f8bfb9cd02a2ffcfbd6b7ee4169d411cc33e8cd9108d98f0af47c183efd9" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.846073 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xwd4g" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.933962 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-859d47c774-sp5pm"] Mar 18 17:06:20 crc kubenswrapper[4939]: E0318 17:06:20.934337 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec22beb-3747-46a2-8eec-a2cf8ac77a0f" containerName="keystone-bootstrap" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.934359 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec22beb-3747-46a2-8eec-a2cf8ac77a0f" containerName="keystone-bootstrap" Mar 18 17:06:20 crc kubenswrapper[4939]: E0318 17:06:20.934385 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerName="dnsmasq-dns" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.934394 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerName="dnsmasq-dns" Mar 18 17:06:20 crc kubenswrapper[4939]: E0318 17:06:20.934409 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerName="init" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.934417 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerName="init" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.934615 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="819d0254-8d11-40e3-b54f-7a4f6b3c88b2" containerName="dnsmasq-dns" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.934636 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec22beb-3747-46a2-8eec-a2cf8ac77a0f" containerName="keystone-bootstrap" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.935251 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.937186 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.937757 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.938067 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.938470 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mhljb" Mar 18 17:06:20 crc kubenswrapper[4939]: I0318 17:06:20.956900 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-859d47c774-sp5pm"] Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.023478 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-combined-ca-bundle\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.023600 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-fernet-keys\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.023698 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-scripts\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.023782 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-credential-keys\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.023852 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-config-data\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.023974 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gmg\" (UniqueName: \"kubernetes.io/projected/cac68db9-5855-4b55-81aa-124ad97bc6a5-kube-api-access-g2gmg\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.125679 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-config-data\") pod 
\"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.125801 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gmg\" (UniqueName: \"kubernetes.io/projected/cac68db9-5855-4b55-81aa-124ad97bc6a5-kube-api-access-g2gmg\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.125855 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-combined-ca-bundle\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.125879 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-fernet-keys\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.125929 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-scripts\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.125979 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-credential-keys\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.131209 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-combined-ca-bundle\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.134262 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-credential-keys\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.136570 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-config-data\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.136847 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac68db9-5855-4b55-81aa-124ad97bc6a5-scripts\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 
Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.142494 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gmg\" (UniqueName: \"kubernetes.io/projected/cac68db9-5855-4b55-81aa-124ad97bc6a5-kube-api-access-g2gmg\") pod \"keystone-859d47c774-sp5pm\" (UID: \"cac68db9-5855-4b55-81aa-124ad97bc6a5\") " pod="openstack/keystone-859d47c774-sp5pm"
Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.255146 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-859d47c774-sp5pm"
Mar 18 17:06:21 crc kubenswrapper[4939]: W0318 17:06:21.683513 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac68db9_5855_4b55_81aa_124ad97bc6a5.slice/crio-2a205340ace93a1193e8f31a5b50d929018bc55d41ba1112bd495dd281bded4d WatchSource:0}: Error finding container 2a205340ace93a1193e8f31a5b50d929018bc55d41ba1112bd495dd281bded4d: Status 404 returned error can't find the container with id 2a205340ace93a1193e8f31a5b50d929018bc55d41ba1112bd495dd281bded4d
Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.689742 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-859d47c774-sp5pm"]
Mar 18 17:06:21 crc kubenswrapper[4939]: I0318 17:06:21.855474 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-859d47c774-sp5pm" event={"ID":"cac68db9-5855-4b55-81aa-124ad97bc6a5","Type":"ContainerStarted","Data":"2a205340ace93a1193e8f31a5b50d929018bc55d41ba1112bd495dd281bded4d"}
Mar 18 17:06:22 crc kubenswrapper[4939]: I0318 17:06:22.134104 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"
Mar 18 17:06:22 crc kubenswrapper[4939]: E0318 17:06:22.134459 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:06:22 crc kubenswrapper[4939]: I0318 17:06:22.865517 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-859d47c774-sp5pm" event={"ID":"cac68db9-5855-4b55-81aa-124ad97bc6a5","Type":"ContainerStarted","Data":"7deed56fc4200fa322f4ce1738d73a982d1218e9583bc198a388c8abbb44305b"}
Mar 18 17:06:22 crc kubenswrapper[4939]: I0318 17:06:22.865777 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-859d47c774-sp5pm"
Mar 18 17:06:22 crc kubenswrapper[4939]: I0318 17:06:22.890245 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-859d47c774-sp5pm" podStartSLOduration=2.890211497 podStartE2EDuration="2.890211497s" podCreationTimestamp="2026-03-18 17:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:22.88433335 +0000 UTC m=+5347.483520991" watchObservedRunningTime="2026-03-18 17:06:22.890211497 +0000 UTC m=+5347.489399128"
Mar 18 17:06:23 crc kubenswrapper[4939]: I0318 17:06:23.329694 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4pjqz"
Mar 18 17:06:23 crc kubenswrapper[4939]: I0318 17:06:23.330052 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4pjqz"
Mar 18 17:06:23 crc kubenswrapper[4939]: I0318 17:06:23.376964 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4pjqz"
Mar 18 17:06:23 crc kubenswrapper[4939]: I0318 17:06:23.909243 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4pjqz"
Mar 18 17:06:23 crc kubenswrapper[4939]: I0318 17:06:23.952867 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4pjqz"]
Mar 18 17:06:25 crc kubenswrapper[4939]: I0318 17:06:25.889576 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4pjqz" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="registry-server" containerID="cri-o://fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b" gracePeriod=2
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.343218 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4pjqz"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.417632 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbt89\" (UniqueName: \"kubernetes.io/projected/d46a771a-1bd1-4216-ae29-3733fce0f2c8-kube-api-access-gbt89\") pod \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") "
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.417772 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-utilities\") pod \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") "
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.418090 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-catalog-content\") pod \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\" (UID: \"d46a771a-1bd1-4216-ae29-3733fce0f2c8\") "
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.420548 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-utilities" (OuterVolumeSpecName: "utilities") pod "d46a771a-1bd1-4216-ae29-3733fce0f2c8" (UID: "d46a771a-1bd1-4216-ae29-3733fce0f2c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.426179 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46a771a-1bd1-4216-ae29-3733fce0f2c8-kube-api-access-gbt89" (OuterVolumeSpecName: "kube-api-access-gbt89") pod "d46a771a-1bd1-4216-ae29-3733fce0f2c8" (UID: "d46a771a-1bd1-4216-ae29-3733fce0f2c8"). InnerVolumeSpecName "kube-api-access-gbt89". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.520496 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.520543 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbt89\" (UniqueName: \"kubernetes.io/projected/d46a771a-1bd1-4216-ae29-3733fce0f2c8-kube-api-access-gbt89\") on node \"crc\" DevicePath \"\""
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.900991 4939 generic.go:334] "Generic (PLEG): container finished" podID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerID="fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b" exitCode=0
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.901098 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4pjqz" event={"ID":"d46a771a-1bd1-4216-ae29-3733fce0f2c8","Type":"ContainerDied","Data":"fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b"}
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.901139 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4pjqz"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.901976 4939 scope.go:117] "RemoveContainer" containerID="fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.902137 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4pjqz" event={"ID":"d46a771a-1bd1-4216-ae29-3733fce0f2c8","Type":"ContainerDied","Data":"2409299d0f5a0c3dd9e81818d169d5940c594ec877313460c3d37c59cafcb836"}
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.925780 4939 scope.go:117] "RemoveContainer" containerID="063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.942311 4939 scope.go:117] "RemoveContainer" containerID="2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.979306 4939 scope.go:117] "RemoveContainer" containerID="fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b"
Mar 18 17:06:26 crc kubenswrapper[4939]: E0318 17:06:26.980249 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b\": container with ID starting with fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b not found: ID does not exist" containerID="fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.980317 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b"} err="failed to get container status \"fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b\": rpc error: code = NotFound desc = could not find container \"fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b\": container with ID starting with fb7eaea6a7090e5033e0b3d47574b30f91435b5bdabc6f00343e9a290cb8a67b not found: ID does not exist"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.980356 4939 scope.go:117] "RemoveContainer" containerID="063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3"
Mar 18 17:06:26 crc kubenswrapper[4939]: E0318 17:06:26.981092 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3\": container with ID starting with 063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3 not found: ID does not exist" containerID="063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.981189 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3"} err="failed to get container status \"063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3\": rpc error: code = NotFound desc = could not find container \"063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3\": container with ID starting with 063005bf80d838e529729f796e85106b113aa87a7793b11ccee53fd1840684a3 not found: ID does not exist"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.981250 4939 scope.go:117] "RemoveContainer" containerID="2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed"
Mar 18 17:06:26 crc kubenswrapper[4939]: E0318 17:06:26.981763 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed\": container with ID starting with 2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed not found: ID does not exist" containerID="2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.981797 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed"} err="failed to get container status \"2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed\": rpc error: code = NotFound desc = could not find container \"2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed\": container with ID starting with 2bb60f0a7ab7acd7bb0d8cbfc2fc044c8e7274833fa65eddb69a26329a7bbeed not found: ID does not exist"
Mar 18 17:06:26 crc kubenswrapper[4939]: I0318 17:06:26.988672 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d46a771a-1bd1-4216-ae29-3733fce0f2c8" (UID: "d46a771a-1bd1-4216-ae29-3733fce0f2c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:06:27 crc kubenswrapper[4939]: I0318 17:06:27.028619 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46a771a-1bd1-4216-ae29-3733fce0f2c8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:27 crc kubenswrapper[4939]: I0318 17:06:27.238262 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4pjqz"] Mar 18 17:06:27 crc kubenswrapper[4939]: I0318 17:06:27.246588 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4pjqz"] Mar 18 17:06:28 crc kubenswrapper[4939]: I0318 17:06:28.144139 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" path="/var/lib/kubelet/pods/d46a771a-1bd1-4216-ae29-3733fce0f2c8/volumes" Mar 18 17:06:33 crc kubenswrapper[4939]: I0318 17:06:33.133327 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:06:33 crc kubenswrapper[4939]: E0318 17:06:33.134090 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:06:47 crc kubenswrapper[4939]: I0318 17:06:47.133167 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:06:47 crc kubenswrapper[4939]: E0318 17:06:47.133860 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:06:52 crc kubenswrapper[4939]: I0318 17:06:52.780592 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-859d47c774-sp5pm" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.242563 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 17:06:56 crc kubenswrapper[4939]: E0318 17:06:56.243365 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="extract-content" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.243380 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="extract-content" Mar 18 17:06:56 crc kubenswrapper[4939]: E0318 17:06:56.243399 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="registry-server" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.243408 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="registry-server" Mar 18 17:06:56 crc kubenswrapper[4939]: E0318 17:06:56.243427 4939 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="extract-utilities" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.243435 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="extract-utilities" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.243705 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46a771a-1bd1-4216-ae29-3733fce0f2c8" containerName="registry-server" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.245125 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.247476 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.247539 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.249081 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fbrrm" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.260158 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.422662 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.422728 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr4dt\" (UniqueName: \"kubernetes.io/projected/53b9939b-2d68-4c16-9e00-bed4c66cd226-kube-api-access-fr4dt\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.422959 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config-secret\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.525161 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.525531 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr4dt\" (UniqueName: \"kubernetes.io/projected/53b9939b-2d68-4c16-9e00-bed4c66cd226-kube-api-access-fr4dt\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.525699 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.526224 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.532534 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config-secret\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.541555 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr4dt\" (UniqueName: \"kubernetes.io/projected/53b9939b-2d68-4c16-9e00-bed4c66cd226-kube-api-access-fr4dt\") pod \"openstackclient\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") " pod="openstack/openstackclient" Mar 18 17:06:56 crc kubenswrapper[4939]: I0318 17:06:56.569788 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 17:06:57 crc kubenswrapper[4939]: I0318 17:06:57.009322 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 17:06:57 crc kubenswrapper[4939]: I0318 17:06:57.157013 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"53b9939b-2d68-4c16-9e00-bed4c66cd226","Type":"ContainerStarted","Data":"b15fd2586aef7e1a981ff5a6554785fb7e2446c7b0b30ac70c5cdd15ab18447c"} Mar 18 17:06:58 crc kubenswrapper[4939]: I0318 17:06:58.166740 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"53b9939b-2d68-4c16-9e00-bed4c66cd226","Type":"ContainerStarted","Data":"75dafdb295f06a9a5f3e40734c13a7b852f093af6935ef326d534be4eb5dac68"} Mar 18 17:06:58 crc kubenswrapper[4939]: I0318 17:06:58.194428 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.194401032 podStartE2EDuration="2.194401032s" podCreationTimestamp="2026-03-18 17:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:58.185316454 +0000 UTC m=+5382.784504095" watchObservedRunningTime="2026-03-18 17:06:58.194401032 +0000 UTC m=+5382.793588673" Mar 18 17:06:59 crc kubenswrapper[4939]: I0318 17:06:59.133747 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:06:59 crc kubenswrapper[4939]: E0318 17:06:59.134359 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:07:00 crc kubenswrapper[4939]: I0318 17:07:00.719663 4939 scope.go:117] "RemoveContainer" containerID="cbfe860e6801aab6e2503bd8f035f75bd2b1dbd53d5fa2dada71103ad8a25066" Mar 18 17:07:14 crc kubenswrapper[4939]: I0318 
Mar 18 17:07:14 crc kubenswrapper[4939]: E0318 17:07:14.135314 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:07:25 crc kubenswrapper[4939]: I0318 17:07:25.133580 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"
Mar 18 17:07:25 crc kubenswrapper[4939]: E0318 17:07:25.135411 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:07:36 crc kubenswrapper[4939]: I0318 17:07:36.140746 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"
Mar 18 17:07:36 crc kubenswrapper[4939]: E0318 17:07:36.144034 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:07:42 crc kubenswrapper[4939]: I0318 17:07:42.996422 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xlqtz"]
Mar 18 17:07:42 crc kubenswrapper[4939]: I0318 17:07:42.999090 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.005369 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlqtz"]
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.143024 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zkjs\" (UniqueName: \"kubernetes.io/projected/f6d25610-a84a-40f1-bfc2-6820c293a688-kube-api-access-9zkjs\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.143376 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-utilities\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.143548 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-catalog-content\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.245593 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zkjs\" (UniqueName: \"kubernetes.io/projected/f6d25610-a84a-40f1-bfc2-6820c293a688-kube-api-access-9zkjs\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.245947 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-utilities\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.246116 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-catalog-content\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.246573 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-utilities\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.246607 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-catalog-content\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.266708 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zkjs\" (UniqueName: \"kubernetes.io/projected/f6d25610-a84a-40f1-bfc2-6820c293a688-kube-api-access-9zkjs\") pod \"community-operators-xlqtz\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.324767 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlqtz"
Mar 18 17:07:43 crc kubenswrapper[4939]: I0318 17:07:43.870931 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlqtz"]
Mar 18 17:07:44 crc kubenswrapper[4939]: I0318 17:07:44.569228 4939 generic.go:334] "Generic (PLEG): container finished" podID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerID="33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c" exitCode=0
Mar 18 17:07:44 crc kubenswrapper[4939]: I0318 17:07:44.569306 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlqtz" event={"ID":"f6d25610-a84a-40f1-bfc2-6820c293a688","Type":"ContainerDied","Data":"33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c"}
Mar 18 17:07:44 crc kubenswrapper[4939]: I0318 17:07:44.572474 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlqtz" event={"ID":"f6d25610-a84a-40f1-bfc2-6820c293a688","Type":"ContainerStarted","Data":"4cbf7f93bc19bcb63e16704cc4f35be8321d4d2ecb2385a56609f95a1844368b"}
Mar 18 17:07:45 crc kubenswrapper[4939]: I0318 17:07:45.584451 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlqtz" event={"ID":"f6d25610-a84a-40f1-bfc2-6820c293a688","Type":"ContainerStarted","Data":"610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d"}
Mar 18 17:07:46 crc kubenswrapper[4939]: I0318 17:07:46.596838 4939 generic.go:334] "Generic (PLEG): container finished" podID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerID="610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d" exitCode=0
Mar 18 17:07:46 crc kubenswrapper[4939]: I0318 17:07:46.596930 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlqtz" event={"ID":"f6d25610-a84a-40f1-bfc2-6820c293a688","Type":"ContainerDied","Data":"610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d"}
Mar 18 17:07:47 crc kubenswrapper[4939]: I0318 17:07:47.612430 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlqtz" event={"ID":"f6d25610-a84a-40f1-bfc2-6820c293a688","Type":"ContainerStarted","Data":"7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd"}
Mar 18 17:07:47 crc kubenswrapper[4939]: I0318 17:07:47.642813 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xlqtz" podStartSLOduration=3.230788733 podStartE2EDuration="5.642791637s" podCreationTimestamp="2026-03-18 17:07:42 +0000 UTC" firstStartedPulling="2026-03-18 17:07:44.571105461 +0000 UTC m=+5429.170293082" lastFinishedPulling="2026-03-18 17:07:46.983108365 +0000 UTC m=+5431.582295986" observedRunningTime="2026-03-18 17:07:47.631155046 +0000 UTC m=+5432.230342667" watchObservedRunningTime="2026-03-18 17:07:47.642791637 +0000 UTC m=+5432.241979258"
Mar 18 17:07:49 crc kubenswrapper[4939]: I0318 17:07:49.134028 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"
containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:07:49 crc kubenswrapper[4939]: E0318 17:07:49.134272 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:07:53 crc kubenswrapper[4939]: I0318 17:07:53.325396 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xlqtz" Mar 18 17:07:53 crc kubenswrapper[4939]: I0318 17:07:53.326116 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xlqtz" Mar 18 17:07:53 crc kubenswrapper[4939]: I0318 17:07:53.400299 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xlqtz" Mar 18 17:07:53 crc kubenswrapper[4939]: I0318 17:07:53.701159 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xlqtz" Mar 18 17:07:53 crc kubenswrapper[4939]: I0318 17:07:53.752549 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlqtz"] Mar 18 17:07:55 crc kubenswrapper[4939]: I0318 17:07:55.691119 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xlqtz" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="registry-server" containerID="cri-o://7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd" gracePeriod=2 Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.137405 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlqtz" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.284261 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-utilities\") pod \"f6d25610-a84a-40f1-bfc2-6820c293a688\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.284620 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zkjs\" (UniqueName: \"kubernetes.io/projected/f6d25610-a84a-40f1-bfc2-6820c293a688-kube-api-access-9zkjs\") pod \"f6d25610-a84a-40f1-bfc2-6820c293a688\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.284650 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-catalog-content\") pod \"f6d25610-a84a-40f1-bfc2-6820c293a688\" (UID: \"f6d25610-a84a-40f1-bfc2-6820c293a688\") " Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.287389 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-utilities" (OuterVolumeSpecName: "utilities") pod "f6d25610-a84a-40f1-bfc2-6820c293a688" (UID: "f6d25610-a84a-40f1-bfc2-6820c293a688"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.294844 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d25610-a84a-40f1-bfc2-6820c293a688-kube-api-access-9zkjs" (OuterVolumeSpecName: "kube-api-access-9zkjs") pod "f6d25610-a84a-40f1-bfc2-6820c293a688" (UID: "f6d25610-a84a-40f1-bfc2-6820c293a688"). InnerVolumeSpecName "kube-api-access-9zkjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.386670 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.386710 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zkjs\" (UniqueName: \"kubernetes.io/projected/f6d25610-a84a-40f1-bfc2-6820c293a688-kube-api-access-9zkjs\") on node \"crc\" DevicePath \"\"" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.472883 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6d25610-a84a-40f1-bfc2-6820c293a688" (UID: "f6d25610-a84a-40f1-bfc2-6820c293a688"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.488216 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d25610-a84a-40f1-bfc2-6820c293a688-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.701963 4939 generic.go:334] "Generic (PLEG): container finished" podID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerID="7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd" exitCode=0 Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.702026 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlqtz" event={"ID":"f6d25610-a84a-40f1-bfc2-6820c293a688","Type":"ContainerDied","Data":"7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd"} Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.702084 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlqtz" event={"ID":"f6d25610-a84a-40f1-bfc2-6820c293a688","Type":"ContainerDied","Data":"4cbf7f93bc19bcb63e16704cc4f35be8321d4d2ecb2385a56609f95a1844368b"} Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.702104 4939 scope.go:117] "RemoveContainer" containerID="7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.702039 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xlqtz" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.722030 4939 scope.go:117] "RemoveContainer" containerID="610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.733495 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlqtz"] Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.740074 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xlqtz"] Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.750663 4939 scope.go:117] "RemoveContainer" containerID="33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.780929 4939 scope.go:117] "RemoveContainer" containerID="7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd" Mar 18 17:07:56 crc kubenswrapper[4939]: E0318 17:07:56.781421 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd\": container with ID starting with 7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd not found: ID does not exist" containerID="7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.781478 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd"} err="failed to get container status \"7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd\": rpc error: code = NotFound desc = could not find container \"7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd\": container with ID starting with 7d9de49ebe8654e3e192e73c9f78d7de34e67a5785220e736cc1b101f1911dcd not found: ID does not exist" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.781657 4939 scope.go:117] "RemoveContainer" containerID="610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d" Mar 18 17:07:56 crc kubenswrapper[4939]: E0318 17:07:56.782230 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d\": container with ID starting with 610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d not found: ID does not exist" containerID="610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.782278 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d"} err="failed to get container status \"610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d\": rpc error: code = NotFound desc = could not find container \"610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d\": container with ID starting with 610f8400f0a2493095719b8f95f20d9a8df63148247e4e75d449ab218c2b129d not found: ID does not exist" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.782308 4939 scope.go:117] "RemoveContainer" containerID="33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c" Mar 18 17:07:56 crc kubenswrapper[4939]: E0318 17:07:56.783354 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c\": container with ID starting with 33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c not found: ID does not exist" containerID="33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c" Mar 18 17:07:56 crc kubenswrapper[4939]: I0318 17:07:56.783396 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c"} err="failed to get container status \"33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c\": rpc error: code = NotFound desc = could not find container \"33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c\": container with ID starting with 33ea4490f8a8883e7a7ea4b1d2e2f467f08ca952935e4c84a62db5d185756f7c not found: ID does not exist" Mar 18 17:07:58 crc kubenswrapper[4939]: I0318 17:07:58.143046 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" path="/var/lib/kubelet/pods/f6d25610-a84a-40f1-bfc2-6820c293a688/volumes" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.133972 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:08:00 crc kubenswrapper[4939]: E0318 17:08:00.134461 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.152002 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564228-mf8v6"] Mar 18 17:08:00 crc kubenswrapper[4939]: E0318 17:08:00.152346 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="extract-content" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.152362 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="extract-content" Mar 18 17:08:00 crc kubenswrapper[4939]: E0318 17:08:00.152375 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="extract-utilities" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.152382 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="extract-utilities" Mar 18 17:08:00 crc kubenswrapper[4939]: E0318 17:08:00.152400 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="registry-server" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.152405 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="registry-server" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.152589 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d25610-a84a-40f1-bfc2-6820c293a688" containerName="registry-server" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.153106 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-mf8v6" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.162072 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.172918 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.173196 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-mf8v6"] Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.173896 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.250229 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8vw2\" (UniqueName: \"kubernetes.io/projected/7f275e3e-f0bb-429f-beb3-193350cd1916-kube-api-access-h8vw2\") pod \"auto-csr-approver-29564228-mf8v6\" (UID: \"7f275e3e-f0bb-429f-beb3-193350cd1916\") " pod="openshift-infra/auto-csr-approver-29564228-mf8v6" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.351442 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8vw2\" (UniqueName: \"kubernetes.io/projected/7f275e3e-f0bb-429f-beb3-193350cd1916-kube-api-access-h8vw2\") pod \"auto-csr-approver-29564228-mf8v6\" (UID: \"7f275e3e-f0bb-429f-beb3-193350cd1916\") " pod="openshift-infra/auto-csr-approver-29564228-mf8v6" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.370602 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8vw2\" (UniqueName: \"kubernetes.io/projected/7f275e3e-f0bb-429f-beb3-193350cd1916-kube-api-access-h8vw2\") pod \"auto-csr-approver-29564228-mf8v6\" (UID: \"7f275e3e-f0bb-429f-beb3-193350cd1916\") " pod="openshift-infra/auto-csr-approver-29564228-mf8v6" Mar 18 17:08:00 crc kubenswrapper[4939]: I0318 17:08:00.527517 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-mf8v6" Mar 18 17:08:01 crc kubenswrapper[4939]: I0318 17:08:01.011146 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-mf8v6"] Mar 18 17:08:01 crc kubenswrapper[4939]: W0318 17:08:01.022628 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f275e3e_f0bb_429f_beb3_193350cd1916.slice/crio-9224c0ad0563266cd1fb2bb4db91f293cb70eccc22c6b8eda1b84051df76f653 WatchSource:0}: Error finding container 9224c0ad0563266cd1fb2bb4db91f293cb70eccc22c6b8eda1b84051df76f653: Status 404 returned error can't find the container with id 9224c0ad0563266cd1fb2bb4db91f293cb70eccc22c6b8eda1b84051df76f653 Mar 18 17:08:01 crc kubenswrapper[4939]: I0318 17:08:01.747021 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564228-mf8v6" event={"ID":"7f275e3e-f0bb-429f-beb3-193350cd1916","Type":"ContainerStarted","Data":"9224c0ad0563266cd1fb2bb4db91f293cb70eccc22c6b8eda1b84051df76f653"} Mar 18 17:08:02 crc kubenswrapper[4939]: I0318 17:08:02.754846 4939 generic.go:334] "Generic (PLEG): container finished" podID="7f275e3e-f0bb-429f-beb3-193350cd1916" containerID="261ccdb1cdd00a4044764132044a0d1f6ce6af5db010351c9ec5f8b14a5b527a" exitCode=0 Mar 18 17:08:02 crc kubenswrapper[4939]: I0318 17:08:02.754977 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564228-mf8v6" event={"ID":"7f275e3e-f0bb-429f-beb3-193350cd1916","Type":"ContainerDied","Data":"261ccdb1cdd00a4044764132044a0d1f6ce6af5db010351c9ec5f8b14a5b527a"} Mar 18 17:08:04 crc kubenswrapper[4939]: I0318 17:08:04.109425 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-mf8v6" Mar 18 17:08:04 crc kubenswrapper[4939]: I0318 17:08:04.243285 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8vw2\" (UniqueName: \"kubernetes.io/projected/7f275e3e-f0bb-429f-beb3-193350cd1916-kube-api-access-h8vw2\") pod \"7f275e3e-f0bb-429f-beb3-193350cd1916\" (UID: \"7f275e3e-f0bb-429f-beb3-193350cd1916\") " Mar 18 17:08:04 crc kubenswrapper[4939]: I0318 17:08:04.250427 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f275e3e-f0bb-429f-beb3-193350cd1916-kube-api-access-h8vw2" (OuterVolumeSpecName: "kube-api-access-h8vw2") pod "7f275e3e-f0bb-429f-beb3-193350cd1916" (UID: "7f275e3e-f0bb-429f-beb3-193350cd1916"). InnerVolumeSpecName "kube-api-access-h8vw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:08:04 crc kubenswrapper[4939]: I0318 17:08:04.345921 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8vw2\" (UniqueName: \"kubernetes.io/projected/7f275e3e-f0bb-429f-beb3-193350cd1916-kube-api-access-h8vw2\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:04 crc kubenswrapper[4939]: I0318 17:08:04.772909 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564228-mf8v6" event={"ID":"7f275e3e-f0bb-429f-beb3-193350cd1916","Type":"ContainerDied","Data":"9224c0ad0563266cd1fb2bb4db91f293cb70eccc22c6b8eda1b84051df76f653"} Mar 18 17:08:04 crc kubenswrapper[4939]: I0318 17:08:04.772954 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-mf8v6" Mar 18 17:08:04 crc kubenswrapper[4939]: I0318 17:08:04.772963 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9224c0ad0563266cd1fb2bb4db91f293cb70eccc22c6b8eda1b84051df76f653" Mar 18 17:08:05 crc kubenswrapper[4939]: I0318 17:08:05.187153 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-bg4gf"] Mar 18 17:08:05 crc kubenswrapper[4939]: I0318 17:08:05.200126 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-bg4gf"] Mar 18 17:08:06 crc kubenswrapper[4939]: I0318 17:08:06.146957 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b85d9ae-d037-4350-891f-679d3c14976c" path="/var/lib/kubelet/pods/2b85d9ae-d037-4350-891f-679d3c14976c/volumes" Mar 18 17:08:13 crc kubenswrapper[4939]: I0318 17:08:13.133774 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:08:13 crc kubenswrapper[4939]: E0318 17:08:13.134479 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:08:27 crc kubenswrapper[4939]: I0318 17:08:27.133725 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:08:27 crc kubenswrapper[4939]: E0318 17:08:27.134514 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:08:39 crc kubenswrapper[4939]: I0318 17:08:39.968963 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-257dw"] Mar 18 17:08:39 crc kubenswrapper[4939]: E0318 17:08:39.969842 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f275e3e-f0bb-429f-beb3-193350cd1916" containerName="oc" Mar 18 17:08:39 crc kubenswrapper[4939]: I0318 17:08:39.969859 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f275e3e-f0bb-429f-beb3-193350cd1916" containerName="oc" Mar 18 17:08:39 crc kubenswrapper[4939]: I0318 17:08:39.970090 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f275e3e-f0bb-429f-beb3-193350cd1916" containerName="oc" Mar 18 17:08:39 crc kubenswrapper[4939]: I0318 17:08:39.970795 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-257dw" Mar 18 17:08:39 crc kubenswrapper[4939]: I0318 17:08:39.982371 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-257dw"] Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.072456 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ae14a4f-992c-45b1-aff8-d60663244ecb-operator-scripts\") pod \"barbican-db-create-257dw\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " pod="openstack/barbican-db-create-257dw" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.073226 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6kmb\" (UniqueName: \"kubernetes.io/projected/4ae14a4f-992c-45b1-aff8-d60663244ecb-kube-api-access-n6kmb\") pod \"barbican-db-create-257dw\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " pod="openstack/barbican-db-create-257dw" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.075688 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c3ac-account-create-update-jctwd"] Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.078481 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.080479 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.095024 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c3ac-account-create-update-jctwd"] Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.133927 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:08:40 crc kubenswrapper[4939]: E0318 17:08:40.134207 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.174747 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ae14a4f-992c-45b1-aff8-d60663244ecb-operator-scripts\") pod \"barbican-db-create-257dw\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " pod="openstack/barbican-db-create-257dw" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.174918 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6kmb\" (UniqueName: \"kubernetes.io/projected/4ae14a4f-992c-45b1-aff8-d60663244ecb-kube-api-access-n6kmb\") pod \"barbican-db-create-257dw\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " pod="openstack/barbican-db-create-257dw" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.174972 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-operator-scripts\") pod \"barbican-c3ac-account-create-update-jctwd\" (UID: 
\"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.175022 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgqf9\" (UniqueName: \"kubernetes.io/projected/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-kube-api-access-xgqf9\") pod \"barbican-c3ac-account-create-update-jctwd\" (UID: \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.176173 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ae14a4f-992c-45b1-aff8-d60663244ecb-operator-scripts\") pod \"barbican-db-create-257dw\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " pod="openstack/barbican-db-create-257dw" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.194425 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6kmb\" (UniqueName: \"kubernetes.io/projected/4ae14a4f-992c-45b1-aff8-d60663244ecb-kube-api-access-n6kmb\") pod \"barbican-db-create-257dw\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " pod="openstack/barbican-db-create-257dw" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.276180 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-operator-scripts\") pod \"barbican-c3ac-account-create-update-jctwd\" (UID: \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.276442 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgqf9\" (UniqueName: \"kubernetes.io/projected/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-kube-api-access-xgqf9\") pod \"barbican-c3ac-account-create-update-jctwd\" (UID: \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.277348 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-operator-scripts\") pod \"barbican-c3ac-account-create-update-jctwd\" (UID: \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.294344 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-257dw" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.312715 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgqf9\" (UniqueName: \"kubernetes.io/projected/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-kube-api-access-xgqf9\") pod \"barbican-c3ac-account-create-update-jctwd\" (UID: \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.400734 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.746965 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-257dw"] Mar 18 17:08:40 crc kubenswrapper[4939]: I0318 17:08:40.855817 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c3ac-account-create-update-jctwd"] Mar 18 17:08:40 crc kubenswrapper[4939]: W0318 17:08:40.856226 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64bcac5_11b3_4fe1_b9ae_fea7819cc2b4.slice/crio-8d24ea4a586986c352928f5cbde7423b789332384770b2c98e4a56dd58d68a29 WatchSource:0}: Error finding container 8d24ea4a586986c352928f5cbde7423b789332384770b2c98e4a56dd58d68a29: Status 404 returned error can't find the container with id 8d24ea4a586986c352928f5cbde7423b789332384770b2c98e4a56dd58d68a29 Mar 18 17:08:41 crc kubenswrapper[4939]: I0318 17:08:41.070976 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c3ac-account-create-update-jctwd" event={"ID":"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4","Type":"ContainerStarted","Data":"8f7cd5378cd539f6ccc49a27282339c7344ba8f8254632a2ea2a085e95e0f982"} Mar 18 17:08:41 crc kubenswrapper[4939]: I0318 17:08:41.071365 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c3ac-account-create-update-jctwd" event={"ID":"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4","Type":"ContainerStarted","Data":"8d24ea4a586986c352928f5cbde7423b789332384770b2c98e4a56dd58d68a29"} Mar 18 17:08:41 crc kubenswrapper[4939]: I0318 17:08:41.072754 4939 generic.go:334] "Generic (PLEG): container finished" podID="4ae14a4f-992c-45b1-aff8-d60663244ecb" containerID="3ead99fefcc65a8caa4e8e74118e63dbb7334590a98616cf69ceccc671e2cb74" exitCode=0 Mar 18 17:08:41 crc kubenswrapper[4939]: I0318 17:08:41.072796 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-257dw" event={"ID":"4ae14a4f-992c-45b1-aff8-d60663244ecb","Type":"ContainerDied","Data":"3ead99fefcc65a8caa4e8e74118e63dbb7334590a98616cf69ceccc671e2cb74"} Mar 18 17:08:41 crc kubenswrapper[4939]: I0318 17:08:41.072820 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-257dw" event={"ID":"4ae14a4f-992c-45b1-aff8-d60663244ecb","Type":"ContainerStarted","Data":"8e18522ddeaa8e7172f7cf6e130645fd4ae92235c55a2d23d1293e84a54a0f72"} Mar 18 17:08:41 crc kubenswrapper[4939]: I0318 17:08:41.094139 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-c3ac-account-create-update-jctwd" podStartSLOduration=1.0941132709999999 podStartE2EDuration="1.094113271s" podCreationTimestamp="2026-03-18 17:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:41.088579034 +0000 UTC m=+5485.687766685" watchObservedRunningTime="2026-03-18 17:08:41.094113271 +0000 UTC m=+5485.693300892" Mar 18 17:08:41 crc kubenswrapper[4939]: E0318 17:08:41.217829 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ae14a4f_992c_45b1_aff8_d60663244ecb.slice/crio-3ead99fefcc65a8caa4e8e74118e63dbb7334590a98616cf69ceccc671e2cb74.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ae14a4f_992c_45b1_aff8_d60663244ecb.slice/crio-conmon-3ead99fefcc65a8caa4e8e74118e63dbb7334590a98616cf69ceccc671e2cb74.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.081950 4939 generic.go:334] "Generic (PLEG): container finished" podID="a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4" containerID="8f7cd5378cd539f6ccc49a27282339c7344ba8f8254632a2ea2a085e95e0f982" exitCode=0 Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.082069 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c3ac-account-create-update-jctwd" event={"ID":"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4","Type":"ContainerDied","Data":"8f7cd5378cd539f6ccc49a27282339c7344ba8f8254632a2ea2a085e95e0f982"} Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.376122 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-257dw" Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.516142 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ae14a4f-992c-45b1-aff8-d60663244ecb-operator-scripts\") pod \"4ae14a4f-992c-45b1-aff8-d60663244ecb\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.516337 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6kmb\" (UniqueName: \"kubernetes.io/projected/4ae14a4f-992c-45b1-aff8-d60663244ecb-kube-api-access-n6kmb\") pod \"4ae14a4f-992c-45b1-aff8-d60663244ecb\" (UID: \"4ae14a4f-992c-45b1-aff8-d60663244ecb\") " Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.517339 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae14a4f-992c-45b1-aff8-d60663244ecb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ae14a4f-992c-45b1-aff8-d60663244ecb" (UID: "4ae14a4f-992c-45b1-aff8-d60663244ecb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.524803 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae14a4f-992c-45b1-aff8-d60663244ecb-kube-api-access-n6kmb" (OuterVolumeSpecName: "kube-api-access-n6kmb") pod "4ae14a4f-992c-45b1-aff8-d60663244ecb" (UID: "4ae14a4f-992c-45b1-aff8-d60663244ecb"). InnerVolumeSpecName "kube-api-access-n6kmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.617882 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6kmb\" (UniqueName: \"kubernetes.io/projected/4ae14a4f-992c-45b1-aff8-d60663244ecb-kube-api-access-n6kmb\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:42 crc kubenswrapper[4939]: I0318 17:08:42.617929 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ae14a4f-992c-45b1-aff8-d60663244ecb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.094228 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-257dw" Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.094223 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-257dw" event={"ID":"4ae14a4f-992c-45b1-aff8-d60663244ecb","Type":"ContainerDied","Data":"8e18522ddeaa8e7172f7cf6e130645fd4ae92235c55a2d23d1293e84a54a0f72"} Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.096850 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e18522ddeaa8e7172f7cf6e130645fd4ae92235c55a2d23d1293e84a54a0f72" Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.376999 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.533650 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-operator-scripts\") pod \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\" (UID: \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.533808 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgqf9\" (UniqueName: \"kubernetes.io/projected/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-kube-api-access-xgqf9\") pod \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\" (UID: \"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4\") " Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.534790 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4" (UID: "a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.547814 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-kube-api-access-xgqf9" (OuterVolumeSpecName: "kube-api-access-xgqf9") pod "a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4" (UID: "a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4"). InnerVolumeSpecName "kube-api-access-xgqf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.636005 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgqf9\" (UniqueName: \"kubernetes.io/projected/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-kube-api-access-xgqf9\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:43 crc kubenswrapper[4939]: I0318 17:08:43.636050 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:44 crc kubenswrapper[4939]: I0318 17:08:44.104856 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c3ac-account-create-update-jctwd" event={"ID":"a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4","Type":"ContainerDied","Data":"8d24ea4a586986c352928f5cbde7423b789332384770b2c98e4a56dd58d68a29"} Mar 18 17:08:44 crc kubenswrapper[4939]: I0318 17:08:44.105195 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d24ea4a586986c352928f5cbde7423b789332384770b2c98e4a56dd58d68a29" Mar 18 17:08:44 crc kubenswrapper[4939]: I0318 17:08:44.105264 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c3ac-account-create-update-jctwd" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.322208 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5p47h"] Mar 18 17:08:45 crc kubenswrapper[4939]: E0318 17:08:45.322717 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4" containerName="mariadb-account-create-update" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.322736 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4" containerName="mariadb-account-create-update" Mar 18 17:08:45 crc kubenswrapper[4939]: E0318 17:08:45.322780 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae14a4f-992c-45b1-aff8-d60663244ecb" containerName="mariadb-database-create" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.322788 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae14a4f-992c-45b1-aff8-d60663244ecb" containerName="mariadb-database-create" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.323023 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4" containerName="mariadb-account-create-update" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.323058 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae14a4f-992c-45b1-aff8-d60663244ecb" containerName="mariadb-database-create" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.323849 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.329301 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.329705 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wj2z5" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.353649 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5p47h"] Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.462312 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkncp\" (UniqueName: \"kubernetes.io/projected/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-kube-api-access-qkncp\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.462391 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-combined-ca-bundle\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.462448 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-db-sync-config-data\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.564244 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkncp\" (UniqueName: \"kubernetes.io/projected/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-kube-api-access-qkncp\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.564331 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-combined-ca-bundle\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.564383 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-db-sync-config-data\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.570708 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-combined-ca-bundle\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.582276 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkncp\" (UniqueName: 
\"kubernetes.io/projected/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-kube-api-access-qkncp\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.596195 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-db-sync-config-data\") pod \"barbican-db-sync-5p47h\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.655742 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:45 crc kubenswrapper[4939]: W0318 17:08:45.904866 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda90dc96a_1ec7_430a_b6c6_520cf884cdd0.slice/crio-68d79e38a38d98dcefc6547955854e7622aacef0688d672c9ccc828dfa9480fa WatchSource:0}: Error finding container 68d79e38a38d98dcefc6547955854e7622aacef0688d672c9ccc828dfa9480fa: Status 404 returned error can't find the container with id 68d79e38a38d98dcefc6547955854e7622aacef0688d672c9ccc828dfa9480fa Mar 18 17:08:45 crc kubenswrapper[4939]: I0318 17:08:45.905915 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5p47h"] Mar 18 17:08:46 crc kubenswrapper[4939]: I0318 17:08:46.122283 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5p47h" event={"ID":"a90dc96a-1ec7-430a-b6c6-520cf884cdd0","Type":"ContainerStarted","Data":"7022bc45decbddcb0bbf5a6a605822340e2df53468e76a16886d61569553592c"} Mar 18 17:08:46 crc kubenswrapper[4939]: I0318 17:08:46.122619 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5p47h" event={"ID":"a90dc96a-1ec7-430a-b6c6-520cf884cdd0","Type":"ContainerStarted","Data":"68d79e38a38d98dcefc6547955854e7622aacef0688d672c9ccc828dfa9480fa"} Mar 18 17:08:46 crc kubenswrapper[4939]: I0318 17:08:46.142735 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5p47h" podStartSLOduration=1.14271009 podStartE2EDuration="1.14271009s" podCreationTimestamp="2026-03-18 17:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:46.136452592 +0000 UTC m=+5490.735640213" watchObservedRunningTime="2026-03-18 17:08:46.14271009 +0000 UTC m=+5490.741897711" Mar 18 17:08:48 crc kubenswrapper[4939]: I0318 17:08:48.141406 4939 generic.go:334] "Generic (PLEG): container finished" podID="a90dc96a-1ec7-430a-b6c6-520cf884cdd0" containerID="7022bc45decbddcb0bbf5a6a605822340e2df53468e76a16886d61569553592c" exitCode=0 Mar 18 17:08:48 crc kubenswrapper[4939]: I0318 17:08:48.144794 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5p47h" event={"ID":"a90dc96a-1ec7-430a-b6c6-520cf884cdd0","Type":"ContainerDied","Data":"7022bc45decbddcb0bbf5a6a605822340e2df53468e76a16886d61569553592c"} Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.436554 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.530191 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-db-sync-config-data\") pod \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.530348 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkncp\" (UniqueName: \"kubernetes.io/projected/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-kube-api-access-qkncp\") pod \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.530489 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-combined-ca-bundle\") pod \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\" (UID: \"a90dc96a-1ec7-430a-b6c6-520cf884cdd0\") " Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.536990 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a90dc96a-1ec7-430a-b6c6-520cf884cdd0" (UID: "a90dc96a-1ec7-430a-b6c6-520cf884cdd0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.537028 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-kube-api-access-qkncp" (OuterVolumeSpecName: "kube-api-access-qkncp") pod "a90dc96a-1ec7-430a-b6c6-520cf884cdd0" (UID: "a90dc96a-1ec7-430a-b6c6-520cf884cdd0"). InnerVolumeSpecName "kube-api-access-qkncp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.553554 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a90dc96a-1ec7-430a-b6c6-520cf884cdd0" (UID: "a90dc96a-1ec7-430a-b6c6-520cf884cdd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.632449 4939 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.632487 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkncp\" (UniqueName: \"kubernetes.io/projected/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-kube-api-access-qkncp\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:49 crc kubenswrapper[4939]: I0318 17:08:49.632498 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a90dc96a-1ec7-430a-b6c6-520cf884cdd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.162652 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5p47h" event={"ID":"a90dc96a-1ec7-430a-b6c6-520cf884cdd0","Type":"ContainerDied","Data":"68d79e38a38d98dcefc6547955854e7622aacef0688d672c9ccc828dfa9480fa"} Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.162885 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d79e38a38d98dcefc6547955854e7622aacef0688d672c9ccc828dfa9480fa" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.162935 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5p47h" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.403040 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5cd5655f5f-5w68p"] Mar 18 17:08:50 crc kubenswrapper[4939]: E0318 17:08:50.403546 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a90dc96a-1ec7-430a-b6c6-520cf884cdd0" containerName="barbican-db-sync" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.403568 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90dc96a-1ec7-430a-b6c6-520cf884cdd0" containerName="barbican-db-sync" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.404111 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90dc96a-1ec7-430a-b6c6-520cf884cdd0" containerName="barbican-db-sync" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.406672 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.419351 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.419727 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wj2z5" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.419941 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.422414 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-68c6d787b6-lttfq"] Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.423763 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.425440 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.441451 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5cd5655f5f-5w68p"] Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.460453 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68c6d787b6-lttfq"] Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.501187 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-59cqh"] Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.502697 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.531381 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-59cqh"] Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.552865 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e322cd3d-c091-4958-92fd-65512870096f-logs\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.552918 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-combined-ca-bundle\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.552949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-config-data-custom\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.552980 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjjqh\" (UniqueName: \"kubernetes.io/projected/e322cd3d-c091-4958-92fd-65512870096f-kube-api-access-zjjqh\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.553012 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-config-data\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.553028 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-combined-ca-bundle\") pod 
\"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.553075 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c95d52-2520-4a9f-b32b-9023caca3572-logs\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.553116 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpv95\" (UniqueName: \"kubernetes.io/projected/d5c95d52-2520-4a9f-b32b-9023caca3572-kube-api-access-mpv95\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.553133 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-config-data-custom\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.553149 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-config-data\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.583091 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7447b48946-z557b"] Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.584404 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.586899 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.613571 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7447b48946-z557b"] Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655650 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjjqh\" (UniqueName: \"kubernetes.io/projected/e322cd3d-c091-4958-92fd-65512870096f-kube-api-access-zjjqh\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655701 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-config-data-custom\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655739 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-combined-ca-bundle\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655762 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-config\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655798 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-config-data\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655817 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-combined-ca-bundle\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655835 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-sb\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655888 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfp6b\" (UniqueName: 
\"kubernetes.io/projected/06d0f80a-f850-4078-8522-5e750e5d58eb-kube-api-access-kfp6b\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655907 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-dns-svc\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655966 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-config-data\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.655999 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c95d52-2520-4a9f-b32b-9023caca3572-logs\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656016 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpv95\" (UniqueName: \"kubernetes.io/projected/d5c95d52-2520-4a9f-b32b-9023caca3572-kube-api-access-mpv95\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656050 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-config-data-custom\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656068 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-config-data\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656117 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxsw\" (UniqueName: \"kubernetes.io/projected/a3256064-cf90-48a9-a395-727d51ded17b-kube-api-access-fdxsw\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656149 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d0f80a-f850-4078-8522-5e750e5d58eb-logs\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656172 4939 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e322cd3d-c091-4958-92fd-65512870096f-logs\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656207 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-combined-ca-bundle\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656232 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-config-data-custom\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.656249 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-nb\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.660638 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e322cd3d-c091-4958-92fd-65512870096f-logs\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.660924 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5c95d52-2520-4a9f-b32b-9023caca3572-logs\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.663084 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-combined-ca-bundle\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.664244 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-config-data\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.666310 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-config-data\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.676588 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-config-data-custom\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.676673 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c95d52-2520-4a9f-b32b-9023caca3572-combined-ca-bundle\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.676625 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e322cd3d-c091-4958-92fd-65512870096f-config-data-custom\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.679443 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpv95\" (UniqueName: \"kubernetes.io/projected/d5c95d52-2520-4a9f-b32b-9023caca3572-kube-api-access-mpv95\") pod \"barbican-worker-5cd5655f5f-5w68p\" (UID: \"d5c95d52-2520-4a9f-b32b-9023caca3572\") " pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.679449 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjjqh\" (UniqueName: \"kubernetes.io/projected/e322cd3d-c091-4958-92fd-65512870096f-kube-api-access-zjjqh\") pod \"barbican-keystone-listener-68c6d787b6-lttfq\" (UID: \"e322cd3d-c091-4958-92fd-65512870096f\") " pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.742185 4939 util.go:30] "No sandbox for pod can be found. 
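
The entries above show the kubelet's volume reconciler driving each of the four new openstack pods (barbican-worker, barbican-keystone-listener, barbican-api, dnsmasq-dns) through the same per-volume sequence: operationExecutor.VerifyControllerAttachedVolume, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded. A quick way to audit a capture like this is to pair the "started" and "succeeded" records per (pod, volume) and report anything left unmatched. The sketch below is illustrative only: it assumes this journal excerpt has been saved to a file named kubelet.log (hypothetical name), and its regexes are derived from the line format visible above, not from any kubelet API.

    import re

    # Patterns built from the kubenswrapper line format in this capture;
    # volume names appear as \"name\" (escaped quotes) inside the klog message.
    STARTED = re.compile(r'MountVolume started for volume \\"(?P<vol>[^"\\]+)\\".*?pod="(?P<pod>[^"]+)"')
    SUCCEEDED = re.compile(r'MountVolume\.SetUp succeeded for volume \\"(?P<vol>[^"\\]+)\\".*?pod="(?P<pod>[^"]+)"')

    started, succeeded = set(), set()
    with open("kubelet.log") as fh:  # hypothetical file holding this excerpt
        for line in fh:
            if (m := STARTED.search(line)):
                started.add((m["pod"], m["vol"]))
            elif (m := SUCCEEDED.search(line)):
                succeeded.add((m["pod"], m["vol"]))

    for pod, vol in sorted(started - succeeded):
        print(f"mount never confirmed: pod={pod} volume={vol}")

On this excerpt the set difference should come out empty, since every mount here reaches "SetUp succeeded" within a few milliseconds.
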
Need to start a new one" pod="openstack/barbican-worker-5cd5655f5f-5w68p" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758328 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-nb\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758386 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-config-data-custom\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758407 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-combined-ca-bundle\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758429 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-config\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758476 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-sb\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758548 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfp6b\" (UniqueName: \"kubernetes.io/projected/06d0f80a-f850-4078-8522-5e750e5d58eb-kube-api-access-kfp6b\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758567 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-dns-svc\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758607 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-config-data\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758690 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdxsw\" (UniqueName: \"kubernetes.io/projected/a3256064-cf90-48a9-a395-727d51ded17b-kube-api-access-fdxsw\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " 
pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.758728 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d0f80a-f850-4078-8522-5e750e5d58eb-logs\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.759479 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d0f80a-f850-4078-8522-5e750e5d58eb-logs\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.760210 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-nb\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.761281 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-config\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.762013 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-sb\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.763345 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.763791 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-config-data-custom\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.765284 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-combined-ca-bundle\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.766530 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d0f80a-f850-4078-8522-5e750e5d58eb-config-data\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.768852 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-dns-svc\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.782691 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfp6b\" (UniqueName: \"kubernetes.io/projected/06d0f80a-f850-4078-8522-5e750e5d58eb-kube-api-access-kfp6b\") pod \"barbican-api-7447b48946-z557b\" (UID: \"06d0f80a-f850-4078-8522-5e750e5d58eb\") " pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.782730 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdxsw\" (UniqueName: \"kubernetes.io/projected/a3256064-cf90-48a9-a395-727d51ded17b-kube-api-access-fdxsw\") pod \"dnsmasq-dns-77f8c95469-59cqh\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.823143 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:08:50 crc kubenswrapper[4939]: I0318 17:08:50.919305 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:51 crc kubenswrapper[4939]: I0318 17:08:51.477599 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5cd5655f5f-5w68p"] Mar 18 17:08:51 crc kubenswrapper[4939]: I0318 17:08:51.581489 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68c6d787b6-lttfq"] Mar 18 17:08:51 crc kubenswrapper[4939]: I0318 17:08:51.590932 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-59cqh"] Mar 18 17:08:51 crc kubenswrapper[4939]: I0318 17:08:51.686693 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7447b48946-z557b"] Mar 18 17:08:51 crc kubenswrapper[4939]: W0318 17:08:51.710588 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06d0f80a_f850_4078_8522_5e750e5d58eb.slice/crio-f120dacfe9e312978d8708a9080876ced0fbffd0b5da72d6ee56696c835c5df1 WatchSource:0}: Error finding container f120dacfe9e312978d8708a9080876ced0fbffd0b5da72d6ee56696c835c5df1: Status 404 returned error can't find the container with id f120dacfe9e312978d8708a9080876ced0fbffd0b5da72d6ee56696c835c5df1 Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.190211 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" event={"ID":"e322cd3d-c091-4958-92fd-65512870096f","Type":"ContainerStarted","Data":"3e034652d32e7768bccb201efa6960617142abf486d3ca8f6d20cce6ec13afb6"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.190666 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" event={"ID":"e322cd3d-c091-4958-92fd-65512870096f","Type":"ContainerStarted","Data":"e252a636eafee24cbba94930fd9223467211aef82bff51f53a111394afa7c32f"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.190681 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" event={"ID":"e322cd3d-c091-4958-92fd-65512870096f","Type":"ContainerStarted","Data":"386c36c8d8bc9e0197762c5701ea291f02205ae320147db2e5a5a3f6fa3a9e09"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.193390 4939 generic.go:334] "Generic (PLEG): container finished" podID="a3256064-cf90-48a9-a395-727d51ded17b" containerID="4e39f3ded14f69181450a818ae5f35337c1f3560a133287b698e6539c066d697" exitCode=0 Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.194114 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" event={"ID":"a3256064-cf90-48a9-a395-727d51ded17b","Type":"ContainerDied","Data":"4e39f3ded14f69181450a818ae5f35337c1f3560a133287b698e6539c066d697"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.194160 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" event={"ID":"a3256064-cf90-48a9-a395-727d51ded17b","Type":"ContainerStarted","Data":"64db100e3bd65419786610a034dd9fb8e22f115e76df431933dc785441ab22b0"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.197360 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5cd5655f5f-5w68p" event={"ID":"d5c95d52-2520-4a9f-b32b-9023caca3572","Type":"ContainerStarted","Data":"f4b601a33d44a45760678b817c2b70a292e25d31db65ada2343395a86713598c"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.197396 4939 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5cd5655f5f-5w68p" event={"ID":"d5c95d52-2520-4a9f-b32b-9023caca3572","Type":"ContainerStarted","Data":"684e2811fdae1674954ed1eecc26d2358d61bc8366f9566ac0c9fe24d2084784"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.197408 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5cd5655f5f-5w68p" event={"ID":"d5c95d52-2520-4a9f-b32b-9023caca3572","Type":"ContainerStarted","Data":"1d96285e5fa59b5df3ca775cbeee8cf86e639096704fbc9b23262caf43c1031d"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.200134 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7447b48946-z557b" event={"ID":"06d0f80a-f850-4078-8522-5e750e5d58eb","Type":"ContainerStarted","Data":"5140c726081438a2915198ee842534c9a8510aeff1d3551f3987af6ec70f79f8"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.200161 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7447b48946-z557b" event={"ID":"06d0f80a-f850-4078-8522-5e750e5d58eb","Type":"ContainerStarted","Data":"f09b0034c43fc6d2b7827bf878f83b0f3091c1584d28d3acc1d93a8252e7e347"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.200174 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7447b48946-z557b" event={"ID":"06d0f80a-f850-4078-8522-5e750e5d58eb","Type":"ContainerStarted","Data":"f120dacfe9e312978d8708a9080876ced0fbffd0b5da72d6ee56696c835c5df1"} Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.200701 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.200726 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7447b48946-z557b" Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.220907 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-68c6d787b6-lttfq" podStartSLOduration=2.220880775 podStartE2EDuration="2.220880775s" podCreationTimestamp="2026-03-18 17:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:52.210766138 +0000 UTC m=+5496.809953759" watchObservedRunningTime="2026-03-18 17:08:52.220880775 +0000 UTC m=+5496.820068396" Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.235139 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5cd5655f5f-5w68p" podStartSLOduration=2.235118009 podStartE2EDuration="2.235118009s" podCreationTimestamp="2026-03-18 17:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:52.235109249 +0000 UTC m=+5496.834296880" watchObservedRunningTime="2026-03-18 17:08:52.235118009 +0000 UTC m=+5496.834305630" Mar 18 17:08:52 crc kubenswrapper[4939]: I0318 17:08:52.286680 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7447b48946-z557b" podStartSLOduration=2.286653371 podStartE2EDuration="2.286653371s" podCreationTimestamp="2026-03-18 17:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:52.282037571 +0000 UTC m=+5496.881225202" 
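
Each "Observed pod startup duration" record above carries podStartSLOduration in seconds alongside podCreationTimestamp and observedRunningTime; the pull timestamps are the zero value ("0001-01-01 ...") because no image pull was needed for these containers. A minimal sketch for tabulating startup latency per pod, under the same kubelet.log filename assumption as above:

    import re

    PAT = re.compile(r'"Observed pod startup duration" pod="(?P<pod>[^"]+)" podStartSLOduration=(?P<secs>[0-9.]+)')

    with open("kubelet.log") as fh:  # hypothetical filename
        rows = [(float(m["secs"]), m["pod"]) for m in map(PAT.search, fh) if m]

    # Slowest pods first.
    for secs, pod in sorted(rows, reverse=True):
        print(f"{secs:8.3f}s  {pod}")
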
Mar 18 17:08:53 crc kubenswrapper[4939]: I0318 17:08:53.133411 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"
Mar 18 17:08:53 crc kubenswrapper[4939]: E0318 17:08:53.133885 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:08:53 crc kubenswrapper[4939]: I0318 17:08:53.210098 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" event={"ID":"a3256064-cf90-48a9-a395-727d51ded17b","Type":"ContainerStarted","Data":"e35c986d7f11897b733e6a9819879eef97d8cea434cd0c9890f47c9588fa1d16"}
Mar 18 17:08:53 crc kubenswrapper[4939]: I0318 17:08:53.235837 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" podStartSLOduration=3.235816317 podStartE2EDuration="3.235816317s" podCreationTimestamp="2026-03-18 17:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:53.229607551 +0000 UTC m=+5497.828795172" watchObservedRunningTime="2026-03-18 17:08:53.235816317 +0000 UTC m=+5497.835003938"
Mar 18 17:08:54 crc kubenswrapper[4939]: I0318 17:08:54.221663 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77f8c95469-59cqh"
Mar 18 17:09:00 crc kubenswrapper[4939]: I0318 17:09:00.825997 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77f8c95469-59cqh"
Mar 18 17:09:00 crc kubenswrapper[4939]: I0318 17:09:00.851540 4939 scope.go:117] "RemoveContainer" containerID="9c9e1a136379cebac60edec2b07b8959fc2de2c7897b29c909c1c5dbce87bd97"
Mar 18 17:09:00 crc kubenswrapper[4939]: I0318 17:09:00.922829 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-7m2l7"]
Mar 18 17:09:00 crc kubenswrapper[4939]: I0318 17:09:00.923088 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" podUID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerName="dnsmasq-dns" containerID="cri-o://86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1" gracePeriod=10
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.411767 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.520519 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-config\") pod \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") "
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.520597 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-nb\") pod \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") "
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.520848 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v77kc\" (UniqueName: \"kubernetes.io/projected/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-kube-api-access-v77kc\") pod \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") "
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.520877 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-dns-svc\") pod \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") "
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.520923 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-sb\") pod \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\" (UID: \"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5\") "
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.534557 4939 generic.go:334] "Generic (PLEG): container finished" podID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerID="86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1" exitCode=0
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.534607 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" event={"ID":"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5","Type":"ContainerDied","Data":"86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1"}
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.534642 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7" event={"ID":"2c9c59f0-a37f-4f2c-bcde-c5197897d8a5","Type":"ContainerDied","Data":"57bd78511837b97261bff27d3b721248932ccd8e43f24e84926f974d82aaec16"}
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.534667 4939 scope.go:117] "RemoveContainer" containerID="86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.534851 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-7m2l7"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.549097 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-kube-api-access-v77kc" (OuterVolumeSpecName: "kube-api-access-v77kc") pod "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" (UID: "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5"). InnerVolumeSpecName "kube-api-access-v77kc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.567043 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" (UID: "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.567241 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" (UID: "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.572707 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-config" (OuterVolumeSpecName: "config") pod "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" (UID: "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.573397 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" (UID: "2c9c59f0-a37f-4f2c-bcde-c5197897d8a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.581250 4939 scope.go:117] "RemoveContainer" containerID="9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.608751 4939 scope.go:117] "RemoveContainer" containerID="86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1"
Mar 18 17:09:01 crc kubenswrapper[4939]: E0318 17:09:01.609143 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1\": container with ID starting with 86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1 not found: ID does not exist" containerID="86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.609187 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1"} err="failed to get container status \"86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1\": rpc error: code = NotFound desc = could not find container \"86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1\": container with ID starting with 86cd257f0aa3bd6cc8f5f8745d9975f974d965d173249bfee2a120f6bbf249d1 not found: ID does not exist"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.609216 4939 scope.go:117] "RemoveContainer" containerID="9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af"
Mar 18 17:09:01 crc kubenswrapper[4939]: E0318 17:09:01.609522 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af\": container with ID starting with 9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af not found: ID does not exist" containerID="9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.609545 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af"} err="failed to get container status \"9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af\": rpc error: code = NotFound desc = could not find container \"9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af\": container with ID starting with 9f606a56fc54aac1c72dee9986a974a4644247c5fe0b2f9b5e434cec0c2c24af not found: ID does not exist"
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.622343 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v77kc\" (UniqueName: \"kubernetes.io/projected/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-kube-api-access-v77kc\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.622389 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.622402 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.622411 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-config\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.622419 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.870116 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-7m2l7"]
Mar 18 17:09:01 crc kubenswrapper[4939]: I0318 17:09:01.876676 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-7m2l7"]
Mar 18 17:09:02 crc kubenswrapper[4939]: I0318 17:09:02.146345 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" path="/var/lib/kubelet/pods/2c9c59f0-a37f-4f2c-bcde-c5197897d8a5/volumes"
Mar 18 17:09:02 crc kubenswrapper[4939]: I0318 17:09:02.388046 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7447b48946-z557b"
Mar 18 17:09:02 crc kubenswrapper[4939]: I0318 17:09:02.415072 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7447b48946-z557b"
Mar 18 17:09:07 crc kubenswrapper[4939]: I0318 17:09:07.133260 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c"
Mar 18 17:09:07 crc kubenswrapper[4939]: E0318 17:09:07.134146 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.830087 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-s8576"]
Mar 18 17:09:13 crc kubenswrapper[4939]: E0318 17:09:13.831940 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerName="dnsmasq-dns"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.831976 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerName="dnsmasq-dns"
Mar 18 17:09:13 crc kubenswrapper[4939]: E0318 17:09:13.832008 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerName="init"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.832020 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerName="init"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.832295 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9c59f0-a37f-4f2c-bcde-c5197897d8a5" containerName="dnsmasq-dns"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.833286 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.839907 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s8576"]
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.930942 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f33d-account-create-update-ph9qq"]
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.931997 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.937110 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.942335 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f33d-account-create-update-ph9qq"]
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.952767 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d143b2a0-8e7a-429f-b7ac-48969f1d48da-operator-scripts\") pod \"neutron-db-create-s8576\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") " pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:13 crc kubenswrapper[4939]: I0318 17:09:13.952920 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjl9x\" (UniqueName: \"kubernetes.io/projected/d143b2a0-8e7a-429f-b7ac-48969f1d48da-kube-api-access-wjl9x\") pod \"neutron-db-create-s8576\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") " pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.054174 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjl9x\" (UniqueName: \"kubernetes.io/projected/d143b2a0-8e7a-429f-b7ac-48969f1d48da-kube-api-access-wjl9x\") pod \"neutron-db-create-s8576\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") " pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.054306 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d143b2a0-8e7a-429f-b7ac-48969f1d48da-operator-scripts\") pod \"neutron-db-create-s8576\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") " pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.054333 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325614c7-c425-4bfa-864b-142d5012d86e-operator-scripts\") pod \"neutron-f33d-account-create-update-ph9qq\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") " pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.054384 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmxs\" (UniqueName: \"kubernetes.io/projected/325614c7-c425-4bfa-864b-142d5012d86e-kube-api-access-zzmxs\") pod \"neutron-f33d-account-create-update-ph9qq\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") " pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.055191 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d143b2a0-8e7a-429f-b7ac-48969f1d48da-operator-scripts\") pod \"neutron-db-create-s8576\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") " pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.073129 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjl9x\" (UniqueName: \"kubernetes.io/projected/d143b2a0-8e7a-429f-b7ac-48969f1d48da-kube-api-access-wjl9x\") pod \"neutron-db-create-s8576\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") " pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.152254 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.155875 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzmxs\" (UniqueName: \"kubernetes.io/projected/325614c7-c425-4bfa-864b-142d5012d86e-kube-api-access-zzmxs\") pod \"neutron-f33d-account-create-update-ph9qq\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") " pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.156018 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325614c7-c425-4bfa-864b-142d5012d86e-operator-scripts\") pod \"neutron-f33d-account-create-update-ph9qq\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") " pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.156833 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325614c7-c425-4bfa-864b-142d5012d86e-operator-scripts\") pod \"neutron-f33d-account-create-update-ph9qq\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") " pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.172185 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzmxs\" (UniqueName: \"kubernetes.io/projected/325614c7-c425-4bfa-864b-142d5012d86e-kube-api-access-zzmxs\") pod \"neutron-f33d-account-create-update-ph9qq\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") " pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.249380 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.668580 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s8576"]
Mar 18 17:09:14 crc kubenswrapper[4939]: W0318 17:09:14.680376 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd143b2a0_8e7a_429f_b7ac_48969f1d48da.slice/crio-e3d4104c3493d72cc8b88f5ba7f4b43e053ca5eace61676c3568d5c9b936bef9 WatchSource:0}: Error finding container e3d4104c3493d72cc8b88f5ba7f4b43e053ca5eace61676c3568d5c9b936bef9: Status 404 returned error can't find the container with id e3d4104c3493d72cc8b88f5ba7f4b43e053ca5eace61676c3568d5c9b936bef9
Mar 18 17:09:14 crc kubenswrapper[4939]: I0318 17:09:14.788017 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f33d-account-create-update-ph9qq"]
Mar 18 17:09:14 crc kubenswrapper[4939]: W0318 17:09:14.790833 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod325614c7_c425_4bfa_864b_142d5012d86e.slice/crio-cb3018c168645d148b3ee722c9876d80da6dd12afe7fb64d37ef83f6df322c3e WatchSource:0}: Error finding container cb3018c168645d148b3ee722c9876d80da6dd12afe7fb64d37ef83f6df322c3e: Status 404 returned error can't find the container with id cb3018c168645d148b3ee722c9876d80da6dd12afe7fb64d37ef83f6df322c3e
Mar 18 17:09:15 crc kubenswrapper[4939]: I0318 17:09:15.656945 4939 generic.go:334] "Generic (PLEG): container finished" podID="d143b2a0-8e7a-429f-b7ac-48969f1d48da" containerID="371859a9a1978286ff7b4748b354e0da4888881a4cf487f9d10229626d6ffe77" exitCode=0
Mar 18 17:09:15 crc kubenswrapper[4939]: I0318 17:09:15.657371 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s8576" event={"ID":"d143b2a0-8e7a-429f-b7ac-48969f1d48da","Type":"ContainerDied","Data":"371859a9a1978286ff7b4748b354e0da4888881a4cf487f9d10229626d6ffe77"}
Mar 18 17:09:15 crc kubenswrapper[4939]: I0318 17:09:15.657397 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s8576" event={"ID":"d143b2a0-8e7a-429f-b7ac-48969f1d48da","Type":"ContainerStarted","Data":"e3d4104c3493d72cc8b88f5ba7f4b43e053ca5eace61676c3568d5c9b936bef9"}
Mar 18 17:09:15 crc kubenswrapper[4939]: I0318 17:09:15.659110 4939 generic.go:334] "Generic (PLEG): container finished" podID="325614c7-c425-4bfa-864b-142d5012d86e" containerID="43a9b19a8415798264b80b3011b291ee0f50c31f66ac74c7ba12c96bb6f3e3ac" exitCode=0
Mar 18 17:09:15 crc kubenswrapper[4939]: I0318 17:09:15.659135 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f33d-account-create-update-ph9qq" event={"ID":"325614c7-c425-4bfa-864b-142d5012d86e","Type":"ContainerDied","Data":"43a9b19a8415798264b80b3011b291ee0f50c31f66ac74c7ba12c96bb6f3e3ac"}
Mar 18 17:09:15 crc kubenswrapper[4939]: I0318 17:09:15.659152 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f33d-account-create-update-ph9qq" event={"ID":"325614c7-c425-4bfa-864b-142d5012d86e","Type":"ContainerStarted","Data":"cb3018c168645d148b3ee722c9876d80da6dd12afe7fb64d37ef83f6df322c3e"}
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.026122 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f33d-account-create-update-ph9qq"
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.031818 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s8576"
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.132287 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjl9x\" (UniqueName: \"kubernetes.io/projected/d143b2a0-8e7a-429f-b7ac-48969f1d48da-kube-api-access-wjl9x\") pod \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") "
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.132375 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d143b2a0-8e7a-429f-b7ac-48969f1d48da-operator-scripts\") pod \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\" (UID: \"d143b2a0-8e7a-429f-b7ac-48969f1d48da\") "
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.132535 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325614c7-c425-4bfa-864b-142d5012d86e-operator-scripts\") pod \"325614c7-c425-4bfa-864b-142d5012d86e\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") "
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.132579 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzmxs\" (UniqueName: \"kubernetes.io/projected/325614c7-c425-4bfa-864b-142d5012d86e-kube-api-access-zzmxs\") pod \"325614c7-c425-4bfa-864b-142d5012d86e\" (UID: \"325614c7-c425-4bfa-864b-142d5012d86e\") "
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.132930 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d143b2a0-8e7a-429f-b7ac-48969f1d48da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d143b2a0-8e7a-429f-b7ac-48969f1d48da" (UID: "d143b2a0-8e7a-429f-b7ac-48969f1d48da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.133275 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/325614c7-c425-4bfa-864b-142d5012d86e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "325614c7-c425-4bfa-864b-142d5012d86e" (UID: "325614c7-c425-4bfa-864b-142d5012d86e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.140639 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d143b2a0-8e7a-429f-b7ac-48969f1d48da-kube-api-access-wjl9x" (OuterVolumeSpecName: "kube-api-access-wjl9x") pod "d143b2a0-8e7a-429f-b7ac-48969f1d48da" (UID: "d143b2a0-8e7a-429f-b7ac-48969f1d48da"). InnerVolumeSpecName "kube-api-access-wjl9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.140758 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325614c7-c425-4bfa-864b-142d5012d86e-kube-api-access-zzmxs" (OuterVolumeSpecName: "kube-api-access-zzmxs") pod "325614c7-c425-4bfa-864b-142d5012d86e" (UID: "325614c7-c425-4bfa-864b-142d5012d86e"). InnerVolumeSpecName "kube-api-access-zzmxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.234661 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjl9x\" (UniqueName: \"kubernetes.io/projected/d143b2a0-8e7a-429f-b7ac-48969f1d48da-kube-api-access-wjl9x\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.234713 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d143b2a0-8e7a-429f-b7ac-48969f1d48da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.234727 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325614c7-c425-4bfa-864b-142d5012d86e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.234738 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzmxs\" (UniqueName: \"kubernetes.io/projected/325614c7-c425-4bfa-864b-142d5012d86e-kube-api-access-zzmxs\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.677917 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s8576" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.677909 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s8576" event={"ID":"d143b2a0-8e7a-429f-b7ac-48969f1d48da","Type":"ContainerDied","Data":"e3d4104c3493d72cc8b88f5ba7f4b43e053ca5eace61676c3568d5c9b936bef9"} Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.679217 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3d4104c3493d72cc8b88f5ba7f4b43e053ca5eace61676c3568d5c9b936bef9" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.679742 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f33d-account-create-update-ph9qq" event={"ID":"325614c7-c425-4bfa-864b-142d5012d86e","Type":"ContainerDied","Data":"cb3018c168645d148b3ee722c9876d80da6dd12afe7fb64d37ef83f6df322c3e"} Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.679785 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3018c168645d148b3ee722c9876d80da6dd12afe7fb64d37ef83f6df322c3e" Mar 18 17:09:17 crc kubenswrapper[4939]: I0318 17:09:17.679789 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f33d-account-create-update-ph9qq" Mar 18 17:09:18 crc kubenswrapper[4939]: I0318 17:09:18.133399 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:09:18 crc kubenswrapper[4939]: E0318 17:09:18.133664 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.227834 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pdcb8"] Mar 18 17:09:19 crc kubenswrapper[4939]: E0318 17:09:19.228499 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d143b2a0-8e7a-429f-b7ac-48969f1d48da" containerName="mariadb-database-create" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.228537 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d143b2a0-8e7a-429f-b7ac-48969f1d48da" containerName="mariadb-database-create" Mar 18 17:09:19 crc kubenswrapper[4939]: E0318 17:09:19.228575 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325614c7-c425-4bfa-864b-142d5012d86e" containerName="mariadb-account-create-update" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.228583 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="325614c7-c425-4bfa-864b-142d5012d86e" containerName="mariadb-account-create-update" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.228780 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="325614c7-c425-4bfa-864b-142d5012d86e" containerName="mariadb-account-create-update" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.228805 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d143b2a0-8e7a-429f-b7ac-48969f1d48da" containerName="mariadb-database-create" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.229434 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.231583 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lv8gn" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.233066 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.234563 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.243580 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pdcb8"] Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.375018 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-config\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.375060 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-combined-ca-bundle\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.375174 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb57b\" (UniqueName: \"kubernetes.io/projected/b78edb94-7549-4392-b989-2373d238e37b-kube-api-access-kb57b\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.477307 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb57b\" (UniqueName: \"kubernetes.io/projected/b78edb94-7549-4392-b989-2373d238e37b-kube-api-access-kb57b\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.478218 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-config\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.478254 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-combined-ca-bundle\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.488802 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-config\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.488908 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-combined-ca-bundle\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.498188 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb57b\" (UniqueName: \"kubernetes.io/projected/b78edb94-7549-4392-b989-2373d238e37b-kube-api-access-kb57b\") pod \"neutron-db-sync-pdcb8\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:19 crc kubenswrapper[4939]: I0318 17:09:19.550134 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:20 crc kubenswrapper[4939]: I0318 17:09:20.002596 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pdcb8"] Mar 18 17:09:20 crc kubenswrapper[4939]: I0318 17:09:20.719262 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pdcb8" event={"ID":"b78edb94-7549-4392-b989-2373d238e37b","Type":"ContainerStarted","Data":"9868d634f98d06c0da552b82eccc0186d5727b6173d485d382ba024b5b2274fc"} Mar 18 17:09:20 crc kubenswrapper[4939]: I0318 17:09:20.719575 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pdcb8" event={"ID":"b78edb94-7549-4392-b989-2373d238e37b","Type":"ContainerStarted","Data":"3c177cc17517bcd55fbf7b61c29cfe449987a12bf2a4da5f195d84eab5183980"} Mar 18 17:09:20 crc kubenswrapper[4939]: I0318 17:09:20.739253 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pdcb8" podStartSLOduration=1.7392344149999999 podStartE2EDuration="1.739234415s" podCreationTimestamp="2026-03-18 17:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:09:20.731384892 +0000 UTC m=+5525.330572523" watchObservedRunningTime="2026-03-18 17:09:20.739234415 +0000 UTC m=+5525.338422046" Mar 18 17:09:25 crc kubenswrapper[4939]: I0318 17:09:25.764759 4939 generic.go:334] "Generic (PLEG): container finished" podID="b78edb94-7549-4392-b989-2373d238e37b" containerID="9868d634f98d06c0da552b82eccc0186d5727b6173d485d382ba024b5b2274fc" exitCode=0 Mar 18 17:09:25 crc kubenswrapper[4939]: I0318 17:09:25.764895 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pdcb8" event={"ID":"b78edb94-7549-4392-b989-2373d238e37b","Type":"ContainerDied","Data":"9868d634f98d06c0da552b82eccc0186d5727b6173d485d382ba024b5b2274fc"} Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.169208 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.229620 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-combined-ca-bundle\") pod \"b78edb94-7549-4392-b989-2373d238e37b\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.229827 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-config\") pod \"b78edb94-7549-4392-b989-2373d238e37b\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.229970 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb57b\" (UniqueName: \"kubernetes.io/projected/b78edb94-7549-4392-b989-2373d238e37b-kube-api-access-kb57b\") pod \"b78edb94-7549-4392-b989-2373d238e37b\" (UID: \"b78edb94-7549-4392-b989-2373d238e37b\") " Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.234398 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78edb94-7549-4392-b989-2373d238e37b-kube-api-access-kb57b" (OuterVolumeSpecName: "kube-api-access-kb57b") pod "b78edb94-7549-4392-b989-2373d238e37b" (UID: "b78edb94-7549-4392-b989-2373d238e37b"). InnerVolumeSpecName "kube-api-access-kb57b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.251619 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-config" (OuterVolumeSpecName: "config") pod "b78edb94-7549-4392-b989-2373d238e37b" (UID: "b78edb94-7549-4392-b989-2373d238e37b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.255178 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b78edb94-7549-4392-b989-2373d238e37b" (UID: "b78edb94-7549-4392-b989-2373d238e37b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.332258 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.332293 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b78edb94-7549-4392-b989-2373d238e37b-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.332307 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb57b\" (UniqueName: \"kubernetes.io/projected/b78edb94-7549-4392-b989-2373d238e37b-kube-api-access-kb57b\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.787013 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pdcb8" event={"ID":"b78edb94-7549-4392-b989-2373d238e37b","Type":"ContainerDied","Data":"3c177cc17517bcd55fbf7b61c29cfe449987a12bf2a4da5f195d84eab5183980"} Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.787089 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c177cc17517bcd55fbf7b61c29cfe449987a12bf2a4da5f195d84eab5183980" Mar 18 17:09:27 crc kubenswrapper[4939]: I0318 17:09:27.787098 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pdcb8" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.069018 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-nnhzk"] Mar 18 17:09:28 crc kubenswrapper[4939]: E0318 17:09:28.069433 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78edb94-7549-4392-b989-2373d238e37b" containerName="neutron-db-sync" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.069456 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78edb94-7549-4392-b989-2373d238e37b" containerName="neutron-db-sync" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.069666 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78edb94-7549-4392-b989-2373d238e37b" containerName="neutron-db-sync" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.071962 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.083883 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-nnhzk"] Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.149693 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.149768 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-config\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.149824 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.149862 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-dns-svc\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.149970 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52rkt\" (UniqueName: \"kubernetes.io/projected/ca6647b2-4fde-4b54-8ca0-768948dee0be-kube-api-access-52rkt\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.251757 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-config\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.252465 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.252668 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-dns-svc\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.252956 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-52rkt\" (UniqueName: \"kubernetes.io/projected/ca6647b2-4fde-4b54-8ca0-768948dee0be-kube-api-access-52rkt\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.252836 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-config\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.253148 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.253491 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-dns-svc\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.256526 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.256598 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.290951 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bcf85947c-k99kg"] Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.292352 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.298280 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.298642 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.299354 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lv8gn" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.299648 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52rkt\" (UniqueName: \"kubernetes.io/projected/ca6647b2-4fde-4b54-8ca0-768948dee0be-kube-api-access-52rkt\") pod \"dnsmasq-dns-7c7fcc54fc-nnhzk\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.304854 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bcf85947c-k99kg"] Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.355104 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-httpd-config\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.355202 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-combined-ca-bundle\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.355339 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-config\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.355372 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnzjq\" (UniqueName: \"kubernetes.io/projected/859a941c-b861-4d33-9bcc-816f56b24c41-kube-api-access-hnzjq\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.423918 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.456577 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-config\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.457756 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnzjq\" (UniqueName: \"kubernetes.io/projected/859a941c-b861-4d33-9bcc-816f56b24c41-kube-api-access-hnzjq\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.457804 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-httpd-config\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.457916 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-combined-ca-bundle\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.462872 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-httpd-config\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.466106 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-config\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.469080 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/859a941c-b861-4d33-9bcc-816f56b24c41-combined-ca-bundle\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.479119 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnzjq\" (UniqueName: \"kubernetes.io/projected/859a941c-b861-4d33-9bcc-816f56b24c41-kube-api-access-hnzjq\") pod \"neutron-7bcf85947c-k99kg\" (UID: \"859a941c-b861-4d33-9bcc-816f56b24c41\") " pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.641994 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:28 crc kubenswrapper[4939]: I0318 17:09:28.889517 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-nnhzk"] Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.219477 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bcf85947c-k99kg"] Mar 18 17:09:29 crc kubenswrapper[4939]: W0318 17:09:29.221453 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod859a941c_b861_4d33_9bcc_816f56b24c41.slice/crio-41a4ce2dd951478981e5be4faf5ba18ad0c0ae7eaa15a1f6fc18e115a4e1fdcf WatchSource:0}: Error finding container 41a4ce2dd951478981e5be4faf5ba18ad0c0ae7eaa15a1f6fc18e115a4e1fdcf: Status 404 returned error can't find the container with id 41a4ce2dd951478981e5be4faf5ba18ad0c0ae7eaa15a1f6fc18e115a4e1fdcf Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.816214 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcf85947c-k99kg" event={"ID":"859a941c-b861-4d33-9bcc-816f56b24c41","Type":"ContainerStarted","Data":"89a4fbf75b032a77f428a190c165c84e1214f08a91d7c5d9f81cd4c3d69f892d"} Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.816556 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcf85947c-k99kg" event={"ID":"859a941c-b861-4d33-9bcc-816f56b24c41","Type":"ContainerStarted","Data":"82172609c00fe0cf7301ea2395edb7644f3a0833c9bbd3368261c49bc431db99"} Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.816571 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcf85947c-k99kg" event={"ID":"859a941c-b861-4d33-9bcc-816f56b24c41","Type":"ContainerStarted","Data":"41a4ce2dd951478981e5be4faf5ba18ad0c0ae7eaa15a1f6fc18e115a4e1fdcf"} Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.816612 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.820451 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerID="82363eeeb9e4cb68352def813b5535d5bcfbcaa232ccd17f35126b0bb02055ba" exitCode=0 Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.820607 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" event={"ID":"ca6647b2-4fde-4b54-8ca0-768948dee0be","Type":"ContainerDied","Data":"82363eeeb9e4cb68352def813b5535d5bcfbcaa232ccd17f35126b0bb02055ba"} Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.820670 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" event={"ID":"ca6647b2-4fde-4b54-8ca0-768948dee0be","Type":"ContainerStarted","Data":"fe8aed7b3e38c59dc274d1c39afcc7391a9227535853375df8ba49eead317a4f"} Mar 18 17:09:29 crc kubenswrapper[4939]: I0318 17:09:29.842386 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bcf85947c-k99kg" podStartSLOduration=1.842367223 podStartE2EDuration="1.842367223s" podCreationTimestamp="2026-03-18 17:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:09:29.833050869 +0000 UTC m=+5534.432238490" watchObservedRunningTime="2026-03-18 17:09:29.842367223 +0000 UTC m=+5534.441554844" Mar 18 17:09:30 crc kubenswrapper[4939]: I0318 17:09:30.829677 4939 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" event={"ID":"ca6647b2-4fde-4b54-8ca0-768948dee0be","Type":"ContainerStarted","Data":"7df96ab182e89ae8cd1ed7cb454b7c0711c60c788d05723b0dbbca95c321e19c"} Mar 18 17:09:30 crc kubenswrapper[4939]: I0318 17:09:30.830346 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:30 crc kubenswrapper[4939]: I0318 17:09:30.855594 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" podStartSLOduration=2.855570786 podStartE2EDuration="2.855570786s" podCreationTimestamp="2026-03-18 17:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:09:30.848450094 +0000 UTC m=+5535.447637715" watchObservedRunningTime="2026-03-18 17:09:30.855570786 +0000 UTC m=+5535.454758407" Mar 18 17:09:31 crc kubenswrapper[4939]: I0318 17:09:31.133094 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:09:31 crc kubenswrapper[4939]: I0318 17:09:31.838384 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"75eabee95412ed2c13066257546f720d013a6e1cbe6d5b5074d1fd4e3adf5aa1"} Mar 18 17:09:38 crc kubenswrapper[4939]: I0318 17:09:38.425489 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:09:38 crc kubenswrapper[4939]: I0318 17:09:38.514750 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-59cqh"] Mar 18 17:09:38 crc kubenswrapper[4939]: I0318 17:09:38.515019 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" podUID="a3256064-cf90-48a9-a395-727d51ded17b" containerName="dnsmasq-dns" containerID="cri-o://e35c986d7f11897b733e6a9819879eef97d8cea434cd0c9890f47c9588fa1d16" gracePeriod=10 Mar 18 17:09:38 crc kubenswrapper[4939]: I0318 17:09:38.906028 4939 generic.go:334] "Generic (PLEG): container finished" podID="a3256064-cf90-48a9-a395-727d51ded17b" containerID="e35c986d7f11897b733e6a9819879eef97d8cea434cd0c9890f47c9588fa1d16" exitCode=0 Mar 18 17:09:38 crc kubenswrapper[4939]: I0318 17:09:38.906067 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" event={"ID":"a3256064-cf90-48a9-a395-727d51ded17b","Type":"ContainerDied","Data":"e35c986d7f11897b733e6a9819879eef97d8cea434cd0c9890f47c9588fa1d16"} Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.013648 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.158351 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdxsw\" (UniqueName: \"kubernetes.io/projected/a3256064-cf90-48a9-a395-727d51ded17b-kube-api-access-fdxsw\") pod \"a3256064-cf90-48a9-a395-727d51ded17b\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.158695 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-sb\") pod \"a3256064-cf90-48a9-a395-727d51ded17b\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.158814 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-nb\") pod \"a3256064-cf90-48a9-a395-727d51ded17b\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.158965 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-dns-svc\") pod \"a3256064-cf90-48a9-a395-727d51ded17b\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.159119 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-config\") pod \"a3256064-cf90-48a9-a395-727d51ded17b\" (UID: \"a3256064-cf90-48a9-a395-727d51ded17b\") " Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.165122 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3256064-cf90-48a9-a395-727d51ded17b-kube-api-access-fdxsw" (OuterVolumeSpecName: "kube-api-access-fdxsw") pod "a3256064-cf90-48a9-a395-727d51ded17b" (UID: "a3256064-cf90-48a9-a395-727d51ded17b"). InnerVolumeSpecName "kube-api-access-fdxsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.196727 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-config" (OuterVolumeSpecName: "config") pod "a3256064-cf90-48a9-a395-727d51ded17b" (UID: "a3256064-cf90-48a9-a395-727d51ded17b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.201696 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3256064-cf90-48a9-a395-727d51ded17b" (UID: "a3256064-cf90-48a9-a395-727d51ded17b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.202070 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3256064-cf90-48a9-a395-727d51ded17b" (UID: "a3256064-cf90-48a9-a395-727d51ded17b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.212902 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3256064-cf90-48a9-a395-727d51ded17b" (UID: "a3256064-cf90-48a9-a395-727d51ded17b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.261296 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.261338 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.261349 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.261360 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdxsw\" (UniqueName: \"kubernetes.io/projected/a3256064-cf90-48a9-a395-727d51ded17b-kube-api-access-fdxsw\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.261374 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3256064-cf90-48a9-a395-727d51ded17b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.916469 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" event={"ID":"a3256064-cf90-48a9-a395-727d51ded17b","Type":"ContainerDied","Data":"64db100e3bd65419786610a034dd9fb8e22f115e76df431933dc785441ab22b0"} Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.916566 4939 scope.go:117] "RemoveContainer" containerID="e35c986d7f11897b733e6a9819879eef97d8cea434cd0c9890f47c9588fa1d16" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.916577 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-59cqh" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.954717 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-59cqh"] Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.956198 4939 scope.go:117] "RemoveContainer" containerID="4e39f3ded14f69181450a818ae5f35337c1f3560a133287b698e6539c066d697" Mar 18 17:09:39 crc kubenswrapper[4939]: I0318 17:09:39.962098 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-59cqh"] Mar 18 17:09:40 crc kubenswrapper[4939]: I0318 17:09:40.144963 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3256064-cf90-48a9-a395-727d51ded17b" path="/var/lib/kubelet/pods/a3256064-cf90-48a9-a395-727d51ded17b/volumes" Mar 18 17:09:58 crc kubenswrapper[4939]: I0318 17:09:58.652023 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bcf85947c-k99kg" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.196351 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564230-d559w"] Mar 18 17:10:00 crc kubenswrapper[4939]: E0318 17:10:00.197136 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3256064-cf90-48a9-a395-727d51ded17b" containerName="dnsmasq-dns" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.197152 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3256064-cf90-48a9-a395-727d51ded17b" containerName="dnsmasq-dns" Mar 18 17:10:00 crc kubenswrapper[4939]: E0318 17:10:00.197165 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3256064-cf90-48a9-a395-727d51ded17b" containerName="init" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.197173 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3256064-cf90-48a9-a395-727d51ded17b" containerName="init" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.197396 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3256064-cf90-48a9-a395-727d51ded17b" containerName="dnsmasq-dns" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.198245 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-d559w" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.200463 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.200758 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.200892 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.211333 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-d559w"] Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.312203 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkpt\" (UniqueName: \"kubernetes.io/projected/6dc59a41-1322-4779-b5db-07b3d58253ff-kube-api-access-vpkpt\") pod \"auto-csr-approver-29564230-d559w\" (UID: \"6dc59a41-1322-4779-b5db-07b3d58253ff\") " pod="openshift-infra/auto-csr-approver-29564230-d559w" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.413230 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpkpt\" (UniqueName: \"kubernetes.io/projected/6dc59a41-1322-4779-b5db-07b3d58253ff-kube-api-access-vpkpt\") pod \"auto-csr-approver-29564230-d559w\" (UID: \"6dc59a41-1322-4779-b5db-07b3d58253ff\") " pod="openshift-infra/auto-csr-approver-29564230-d559w" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.455012 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpkpt\" (UniqueName: \"kubernetes.io/projected/6dc59a41-1322-4779-b5db-07b3d58253ff-kube-api-access-vpkpt\") pod \"auto-csr-approver-29564230-d559w\" (UID: \"6dc59a41-1322-4779-b5db-07b3d58253ff\") " pod="openshift-infra/auto-csr-approver-29564230-d559w" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.521686 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-d559w" Mar 18 17:10:00 crc kubenswrapper[4939]: I0318 17:10:00.953462 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-d559w"] Mar 18 17:10:01 crc kubenswrapper[4939]: I0318 17:10:01.140494 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564230-d559w" event={"ID":"6dc59a41-1322-4779-b5db-07b3d58253ff","Type":"ContainerStarted","Data":"9838a90cd2734fb49833fd43a95bd7da6f3d858f3be18155ef81e9c5948867e3"} Mar 18 17:10:02 crc kubenswrapper[4939]: E0318 17:10:02.988070 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dc59a41_1322_4779_b5db_07b3d58253ff.slice/crio-581fcb91747891802ddd8b601ff1828e2326878661ff58b49f79189540eeefc5.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:10:03 crc kubenswrapper[4939]: I0318 17:10:03.166127 4939 generic.go:334] "Generic (PLEG): container finished" podID="6dc59a41-1322-4779-b5db-07b3d58253ff" containerID="581fcb91747891802ddd8b601ff1828e2326878661ff58b49f79189540eeefc5" exitCode=0 Mar 18 17:10:03 crc kubenswrapper[4939]: I0318 17:10:03.166180 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564230-d559w" event={"ID":"6dc59a41-1322-4779-b5db-07b3d58253ff","Type":"ContainerDied","Data":"581fcb91747891802ddd8b601ff1828e2326878661ff58b49f79189540eeefc5"} Mar 18 17:10:04 crc kubenswrapper[4939]: E0318 17:10:04.262491 4939 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.227:34614->38.102.83.227:41597: read tcp 38.102.83.227:34614->38.102.83.227:41597: read: connection reset by peer Mar 18 17:10:04 crc kubenswrapper[4939]: I0318 17:10:04.474616 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-d559w" Mar 18 17:10:04 crc kubenswrapper[4939]: I0318 17:10:04.590010 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpkpt\" (UniqueName: \"kubernetes.io/projected/6dc59a41-1322-4779-b5db-07b3d58253ff-kube-api-access-vpkpt\") pod \"6dc59a41-1322-4779-b5db-07b3d58253ff\" (UID: \"6dc59a41-1322-4779-b5db-07b3d58253ff\") " Mar 18 17:10:04 crc kubenswrapper[4939]: I0318 17:10:04.596159 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc59a41-1322-4779-b5db-07b3d58253ff-kube-api-access-vpkpt" (OuterVolumeSpecName: "kube-api-access-vpkpt") pod "6dc59a41-1322-4779-b5db-07b3d58253ff" (UID: "6dc59a41-1322-4779-b5db-07b3d58253ff"). InnerVolumeSpecName "kube-api-access-vpkpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:04 crc kubenswrapper[4939]: I0318 17:10:04.691576 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpkpt\" (UniqueName: \"kubernetes.io/projected/6dc59a41-1322-4779-b5db-07b3d58253ff-kube-api-access-vpkpt\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.187987 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564230-d559w" event={"ID":"6dc59a41-1322-4779-b5db-07b3d58253ff","Type":"ContainerDied","Data":"9838a90cd2734fb49833fd43a95bd7da6f3d858f3be18155ef81e9c5948867e3"} Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.188027 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9838a90cd2734fb49833fd43a95bd7da6f3d858f3be18155ef81e9c5948867e3" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.188027 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-d559w" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.546697 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-l54gq"] Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.555125 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-l54gq"] Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.637797 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-xkw2j"] Mar 18 17:10:05 crc kubenswrapper[4939]: E0318 17:10:05.638197 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc59a41-1322-4779-b5db-07b3d58253ff" containerName="oc" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.638209 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc59a41-1322-4779-b5db-07b3d58253ff" containerName="oc" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.638354 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc59a41-1322-4779-b5db-07b3d58253ff" containerName="oc" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.638946 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.645186 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xkw2j"] Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.738933 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-77de-account-create-update-67wfn"] Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.740365 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.742619 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.751674 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-77de-account-create-update-67wfn"] Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.810226 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5880775f-1842-448f-b622-2805f5ac64f8-operator-scripts\") pod \"glance-db-create-xkw2j\" (UID: \"5880775f-1842-448f-b622-2805f5ac64f8\") " pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.810364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8r8\" (UniqueName: \"kubernetes.io/projected/5880775f-1842-448f-b622-2805f5ac64f8-kube-api-access-7j8r8\") pod \"glance-db-create-xkw2j\" (UID: \"5880775f-1842-448f-b622-2805f5ac64f8\") " pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.912232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g92ms\" (UniqueName: \"kubernetes.io/projected/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-kube-api-access-g92ms\") pod \"glance-77de-account-create-update-67wfn\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.912305 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-operator-scripts\") pod \"glance-77de-account-create-update-67wfn\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.912458 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5880775f-1842-448f-b622-2805f5ac64f8-operator-scripts\") pod \"glance-db-create-xkw2j\" (UID: \"5880775f-1842-448f-b622-2805f5ac64f8\") " pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.912566 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8r8\" (UniqueName: \"kubernetes.io/projected/5880775f-1842-448f-b622-2805f5ac64f8-kube-api-access-7j8r8\") pod \"glance-db-create-xkw2j\" (UID: \"5880775f-1842-448f-b622-2805f5ac64f8\") " pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.913776 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5880775f-1842-448f-b622-2805f5ac64f8-operator-scripts\") pod \"glance-db-create-xkw2j\" (UID: \"5880775f-1842-448f-b622-2805f5ac64f8\") " pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.938942 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8r8\" (UniqueName: \"kubernetes.io/projected/5880775f-1842-448f-b622-2805f5ac64f8-kube-api-access-7j8r8\") pod \"glance-db-create-xkw2j\" (UID: 
\"5880775f-1842-448f-b622-2805f5ac64f8\") " pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:05 crc kubenswrapper[4939]: I0318 17:10:05.967808 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.014981 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g92ms\" (UniqueName: \"kubernetes.io/projected/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-kube-api-access-g92ms\") pod \"glance-77de-account-create-update-67wfn\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.015096 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-operator-scripts\") pod \"glance-77de-account-create-update-67wfn\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.016402 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-operator-scripts\") pod \"glance-77de-account-create-update-67wfn\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.034012 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g92ms\" (UniqueName: \"kubernetes.io/projected/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-kube-api-access-g92ms\") pod \"glance-77de-account-create-update-67wfn\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.064481 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.152982 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b244dd-f9c0-477d-9e0e-04d9ce5231f5" path="/var/lib/kubelet/pods/40b244dd-f9c0-477d-9e0e-04d9ce5231f5/volumes" Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.452040 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xkw2j"] Mar 18 17:10:06 crc kubenswrapper[4939]: W0318 17:10:06.455233 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5880775f_1842_448f_b622_2805f5ac64f8.slice/crio-ac3fdc43ddaa330a822505ee89ff987f4a66ba42ff7322877ffeaa165555917e WatchSource:0}: Error finding container ac3fdc43ddaa330a822505ee89ff987f4a66ba42ff7322877ffeaa165555917e: Status 404 returned error can't find the container with id ac3fdc43ddaa330a822505ee89ff987f4a66ba42ff7322877ffeaa165555917e Mar 18 17:10:06 crc kubenswrapper[4939]: I0318 17:10:06.565906 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-77de-account-create-update-67wfn"] Mar 18 17:10:06 crc kubenswrapper[4939]: W0318 17:10:06.568195 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0da004_51dd_4cd0_85a2_318e5ecf9f26.slice/crio-8cb331003415cbcfb5c798b61c4213fb66584362dcdea459896c176f06a6e368 WatchSource:0}: Error finding container 8cb331003415cbcfb5c798b61c4213fb66584362dcdea459896c176f06a6e368: Status 404 returned error can't find the container with id 8cb331003415cbcfb5c798b61c4213fb66584362dcdea459896c176f06a6e368 Mar 18 17:10:07 crc kubenswrapper[4939]: I0318 17:10:07.205937 4939 generic.go:334] "Generic (PLEG): container finished" podID="5880775f-1842-448f-b622-2805f5ac64f8" containerID="6d8dd1ef6381262cca50a5203e5c57d18962649a333bfc06807dcfe57c32c6e3" exitCode=0 Mar 18 17:10:07 crc kubenswrapper[4939]: I0318 17:10:07.205992 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xkw2j" event={"ID":"5880775f-1842-448f-b622-2805f5ac64f8","Type":"ContainerDied","Data":"6d8dd1ef6381262cca50a5203e5c57d18962649a333bfc06807dcfe57c32c6e3"} Mar 18 17:10:07 crc kubenswrapper[4939]: I0318 17:10:07.206065 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xkw2j" event={"ID":"5880775f-1842-448f-b622-2805f5ac64f8","Type":"ContainerStarted","Data":"ac3fdc43ddaa330a822505ee89ff987f4a66ba42ff7322877ffeaa165555917e"} Mar 18 17:10:07 crc kubenswrapper[4939]: I0318 17:10:07.208584 4939 generic.go:334] "Generic (PLEG): container finished" podID="2a0da004-51dd-4cd0-85a2-318e5ecf9f26" containerID="7c986c7291a3c277d8a16a0e697f02a66767bf7db9e66e3429c6e3499438f4ad" exitCode=0 Mar 18 17:10:07 crc kubenswrapper[4939]: I0318 17:10:07.208620 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-77de-account-create-update-67wfn" event={"ID":"2a0da004-51dd-4cd0-85a2-318e5ecf9f26","Type":"ContainerDied","Data":"7c986c7291a3c277d8a16a0e697f02a66767bf7db9e66e3429c6e3499438f4ad"} Mar 18 17:10:07 crc kubenswrapper[4939]: I0318 17:10:07.208649 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-77de-account-create-update-67wfn" event={"ID":"2a0da004-51dd-4cd0-85a2-318e5ecf9f26","Type":"ContainerStarted","Data":"8cb331003415cbcfb5c798b61c4213fb66584362dcdea459896c176f06a6e368"} Mar 18 17:10:08 crc 
kubenswrapper[4939]: I0318 17:10:08.605589 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.611671 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.773563 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g92ms\" (UniqueName: \"kubernetes.io/projected/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-kube-api-access-g92ms\") pod \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.773657 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5880775f-1842-448f-b622-2805f5ac64f8-operator-scripts\") pod \"5880775f-1842-448f-b622-2805f5ac64f8\" (UID: \"5880775f-1842-448f-b622-2805f5ac64f8\") " Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.773748 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-operator-scripts\") pod \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\" (UID: \"2a0da004-51dd-4cd0-85a2-318e5ecf9f26\") " Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.773782 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j8r8\" (UniqueName: \"kubernetes.io/projected/5880775f-1842-448f-b622-2805f5ac64f8-kube-api-access-7j8r8\") pod \"5880775f-1842-448f-b622-2805f5ac64f8\" (UID: \"5880775f-1842-448f-b622-2805f5ac64f8\") " Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.774517 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5880775f-1842-448f-b622-2805f5ac64f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5880775f-1842-448f-b622-2805f5ac64f8" (UID: "5880775f-1842-448f-b622-2805f5ac64f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.774615 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a0da004-51dd-4cd0-85a2-318e5ecf9f26" (UID: "2a0da004-51dd-4cd0-85a2-318e5ecf9f26"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.779127 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-kube-api-access-g92ms" (OuterVolumeSpecName: "kube-api-access-g92ms") pod "2a0da004-51dd-4cd0-85a2-318e5ecf9f26" (UID: "2a0da004-51dd-4cd0-85a2-318e5ecf9f26"). InnerVolumeSpecName "kube-api-access-g92ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.786680 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5880775f-1842-448f-b622-2805f5ac64f8-kube-api-access-7j8r8" (OuterVolumeSpecName: "kube-api-access-7j8r8") pod "5880775f-1842-448f-b622-2805f5ac64f8" (UID: "5880775f-1842-448f-b622-2805f5ac64f8"). 
InnerVolumeSpecName "kube-api-access-7j8r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.875642 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5880775f-1842-448f-b622-2805f5ac64f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.875681 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.875699 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j8r8\" (UniqueName: \"kubernetes.io/projected/5880775f-1842-448f-b622-2805f5ac64f8-kube-api-access-7j8r8\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:08 crc kubenswrapper[4939]: I0318 17:10:08.875890 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g92ms\" (UniqueName: \"kubernetes.io/projected/2a0da004-51dd-4cd0-85a2-318e5ecf9f26-kube-api-access-g92ms\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:09 crc kubenswrapper[4939]: I0318 17:10:09.231688 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-77de-account-create-update-67wfn" event={"ID":"2a0da004-51dd-4cd0-85a2-318e5ecf9f26","Type":"ContainerDied","Data":"8cb331003415cbcfb5c798b61c4213fb66584362dcdea459896c176f06a6e368"} Mar 18 17:10:09 crc kubenswrapper[4939]: I0318 17:10:09.231795 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cb331003415cbcfb5c798b61c4213fb66584362dcdea459896c176f06a6e368" Mar 18 17:10:09 crc kubenswrapper[4939]: I0318 17:10:09.231748 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-77de-account-create-update-67wfn" Mar 18 17:10:09 crc kubenswrapper[4939]: I0318 17:10:09.233068 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xkw2j" event={"ID":"5880775f-1842-448f-b622-2805f5ac64f8","Type":"ContainerDied","Data":"ac3fdc43ddaa330a822505ee89ff987f4a66ba42ff7322877ffeaa165555917e"} Mar 18 17:10:09 crc kubenswrapper[4939]: I0318 17:10:09.233091 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-xkw2j" Mar 18 17:10:09 crc kubenswrapper[4939]: I0318 17:10:09.233094 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac3fdc43ddaa330a822505ee89ff987f4a66ba42ff7322877ffeaa165555917e" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.907896 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8h7zl"] Mar 18 17:10:10 crc kubenswrapper[4939]: E0318 17:10:10.908450 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5880775f-1842-448f-b622-2805f5ac64f8" containerName="mariadb-database-create" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.908469 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5880775f-1842-448f-b622-2805f5ac64f8" containerName="mariadb-database-create" Mar 18 17:10:10 crc kubenswrapper[4939]: E0318 17:10:10.908484 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0da004-51dd-4cd0-85a2-318e5ecf9f26" containerName="mariadb-account-create-update" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.908491 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0da004-51dd-4cd0-85a2-318e5ecf9f26" containerName="mariadb-account-create-update" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.908694 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0da004-51dd-4cd0-85a2-318e5ecf9f26" containerName="mariadb-account-create-update" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.908719 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5880775f-1842-448f-b622-2805f5ac64f8" containerName="mariadb-database-create" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.909697 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.913744 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.914258 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sb77r" Mar 18 17:10:10 crc kubenswrapper[4939]: I0318 17:10:10.921821 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8h7zl"] Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.112830 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-config-data\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.113532 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvvxq\" (UniqueName: \"kubernetes.io/projected/38a5ef95-f146-4eec-ba54-12a925005e5f-kube-api-access-zvvxq\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.113740 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-combined-ca-bundle\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.113964 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-db-sync-config-data\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.215515 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-config-data\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.215604 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvvxq\" (UniqueName: \"kubernetes.io/projected/38a5ef95-f146-4eec-ba54-12a925005e5f-kube-api-access-zvvxq\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.215684 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-combined-ca-bundle\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.215785 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-db-sync-config-data\") pod 
\"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.236944 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-combined-ca-bundle\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.237568 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-db-sync-config-data\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.237786 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-config-data\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.241850 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvvxq\" (UniqueName: \"kubernetes.io/projected/38a5ef95-f146-4eec-ba54-12a925005e5f-kube-api-access-zvvxq\") pod \"glance-db-sync-8h7zl\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:11 crc kubenswrapper[4939]: I0318 17:10:11.529280 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:12 crc kubenswrapper[4939]: I0318 17:10:12.096650 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gztwr"] Mar 18 17:10:12 crc kubenswrapper[4939]: I0318 17:10:12.107491 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gztwr"] Mar 18 17:10:12 crc kubenswrapper[4939]: I0318 17:10:12.146090 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcbdd35a-4e99-4d2a-adf1-111ced1b750a" path="/var/lib/kubelet/pods/bcbdd35a-4e99-4d2a-adf1-111ced1b750a/volumes" Mar 18 17:10:12 crc kubenswrapper[4939]: I0318 17:10:12.174183 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8h7zl"] Mar 18 17:10:12 crc kubenswrapper[4939]: I0318 17:10:12.276898 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8h7zl" event={"ID":"38a5ef95-f146-4eec-ba54-12a925005e5f","Type":"ContainerStarted","Data":"fa77bb127ce3920c623d8dbce018ea4c38da465c01585ac7d271c955b572f127"} Mar 18 17:10:13 crc kubenswrapper[4939]: I0318 17:10:13.287789 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8h7zl" event={"ID":"38a5ef95-f146-4eec-ba54-12a925005e5f","Type":"ContainerStarted","Data":"f4ed2db25ef819027a234186a46e03c0d2ce37164fa1e77a7d79be485ad5d4c5"} Mar 18 17:10:13 crc kubenswrapper[4939]: I0318 17:10:13.311919 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8h7zl" podStartSLOduration=3.3118912959999998 podStartE2EDuration="3.311891296s" podCreationTimestamp="2026-03-18 17:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 17:10:13.303836517 +0000 UTC m=+5577.903024158" watchObservedRunningTime="2026-03-18 17:10:13.311891296 +0000 UTC m=+5577.911078927" Mar 18 17:10:16 crc kubenswrapper[4939]: I0318 17:10:16.312813 4939 generic.go:334] "Generic (PLEG): container finished" podID="38a5ef95-f146-4eec-ba54-12a925005e5f" containerID="f4ed2db25ef819027a234186a46e03c0d2ce37164fa1e77a7d79be485ad5d4c5" exitCode=0 Mar 18 17:10:16 crc kubenswrapper[4939]: I0318 17:10:16.312923 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8h7zl" event={"ID":"38a5ef95-f146-4eec-ba54-12a925005e5f","Type":"ContainerDied","Data":"f4ed2db25ef819027a234186a46e03c0d2ce37164fa1e77a7d79be485ad5d4c5"} Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.727006 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.847438 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-combined-ca-bundle\") pod \"38a5ef95-f146-4eec-ba54-12a925005e5f\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.847632 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvvxq\" (UniqueName: \"kubernetes.io/projected/38a5ef95-f146-4eec-ba54-12a925005e5f-kube-api-access-zvvxq\") pod \"38a5ef95-f146-4eec-ba54-12a925005e5f\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.847669 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-config-data\") pod \"38a5ef95-f146-4eec-ba54-12a925005e5f\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.847711 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-db-sync-config-data\") pod \"38a5ef95-f146-4eec-ba54-12a925005e5f\" (UID: \"38a5ef95-f146-4eec-ba54-12a925005e5f\") " Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.853415 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "38a5ef95-f146-4eec-ba54-12a925005e5f" (UID: "38a5ef95-f146-4eec-ba54-12a925005e5f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.853482 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a5ef95-f146-4eec-ba54-12a925005e5f-kube-api-access-zvvxq" (OuterVolumeSpecName: "kube-api-access-zvvxq") pod "38a5ef95-f146-4eec-ba54-12a925005e5f" (UID: "38a5ef95-f146-4eec-ba54-12a925005e5f"). InnerVolumeSpecName "kube-api-access-zvvxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.872097 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38a5ef95-f146-4eec-ba54-12a925005e5f" (UID: "38a5ef95-f146-4eec-ba54-12a925005e5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.899151 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-config-data" (OuterVolumeSpecName: "config-data") pod "38a5ef95-f146-4eec-ba54-12a925005e5f" (UID: "38a5ef95-f146-4eec-ba54-12a925005e5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.950637 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvvxq\" (UniqueName: \"kubernetes.io/projected/38a5ef95-f146-4eec-ba54-12a925005e5f-kube-api-access-zvvxq\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.950675 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.950686 4939 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:17 crc kubenswrapper[4939]: I0318 17:10:17.950695 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5ef95-f146-4eec-ba54-12a925005e5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.330148 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8h7zl" event={"ID":"38a5ef95-f146-4eec-ba54-12a925005e5f","Type":"ContainerDied","Data":"fa77bb127ce3920c623d8dbce018ea4c38da465c01585ac7d271c955b572f127"} Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.330431 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa77bb127ce3920c623d8dbce018ea4c38da465c01585ac7d271c955b572f127" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.330204 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8h7zl" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.673695 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:18 crc kubenswrapper[4939]: E0318 17:10:18.674337 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a5ef95-f146-4eec-ba54-12a925005e5f" containerName="glance-db-sync" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.674358 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a5ef95-f146-4eec-ba54-12a925005e5f" containerName="glance-db-sync" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.674644 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a5ef95-f146-4eec-ba54-12a925005e5f" containerName="glance-db-sync" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.675720 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.689113 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sb77r" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.689452 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.689688 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.689927 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.706665 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.844332 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx"] Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.846102 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.869892 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.869930 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-ceph\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.869954 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-logs\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.869970 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.869992 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.870018 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.870055 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8w8\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-kube-api-access-7f8w8\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.874480 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx"] Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.959695 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.961273 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.966524 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972117 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972192 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-dns-svc\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-logs\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972261 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972295 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972321 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-ceph\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972356 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-logs\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972378 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972400 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972425 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-ceph\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972449 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972614 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972707 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-config\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972743 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972801 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8w8\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-kube-api-access-7f8w8\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972911 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx4zk\" (UniqueName: \"kubernetes.io/projected/9193eb4a-4546-4c27-a575-05fd5bddacd9-kube-api-access-dx4zk\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.972942 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.973000 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.973310 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-logs\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.973322 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.977297 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-ceph\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.979905 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.991394 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:18 crc kubenswrapper[4939]: I0318 17:10:18.992928 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.002205 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.007105 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8w8\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-kube-api-access-7f8w8\") pod \"glance-default-external-api-0\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075571 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-logs\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075617 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075644 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075658 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-ceph\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075691 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-config\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075711 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075755 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx4zk\" (UniqueName: \"kubernetes.io/projected/9193eb4a-4546-4c27-a575-05fd5bddacd9-kube-api-access-dx4zk\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075770 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075794 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075815 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075839 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq5q5\" (UniqueName: 
\"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-kube-api-access-zq5q5\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.075872 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-dns-svc\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.076727 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-dns-svc\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.077025 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-logs\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.077234 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.078240 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.078843 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.079385 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-config\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.086990 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.089580 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.090724 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.099116 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-ceph\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.124314 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx4zk\" (UniqueName: \"kubernetes.io/projected/9193eb4a-4546-4c27-a575-05fd5bddacd9-kube-api-access-dx4zk\") pod \"dnsmasq-dns-7c8c4d4d9c-ljhbx\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.165693 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.178094 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq5q5\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-kube-api-access-zq5q5\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.236380 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq5q5\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-kube-api-access-zq5q5\") pod \"glance-default-internal-api-0\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.307931 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.353837 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.712056 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx"] Mar 18 17:10:19 crc kubenswrapper[4939]: W0318 17:10:19.714847 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9193eb4a_4546_4c27_a575_05fd5bddacd9.slice/crio-45d914f55ed177f7e0359a1a027ef5b07da276a2cd461b6fc229271f211856af WatchSource:0}: Error finding container 45d914f55ed177f7e0359a1a027ef5b07da276a2cd461b6fc229271f211856af: Status 404 returned error can't find the container with id 45d914f55ed177f7e0359a1a027ef5b07da276a2cd461b6fc229271f211856af Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.915103 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:19 crc kubenswrapper[4939]: W0318 17:10:19.920993 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ff12186_94b7_4301_af56_f6959b82d462.slice/crio-bebb4c25d330ca613a695fbd5c21e830d3804d9c0d7dcd0034ac9085606b41b9 WatchSource:0}: Error finding container bebb4c25d330ca613a695fbd5c21e830d3804d9c0d7dcd0034ac9085606b41b9: Status 404 returned error can't find the container with id bebb4c25d330ca613a695fbd5c21e830d3804d9c0d7dcd0034ac9085606b41b9 Mar 18 17:10:19 crc kubenswrapper[4939]: I0318 17:10:19.950485 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:20 crc kubenswrapper[4939]: I0318 17:10:20.009413 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:20 crc kubenswrapper[4939]: I0318 17:10:20.361048 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ff12186-94b7-4301-af56-f6959b82d462","Type":"ContainerStarted","Data":"bebb4c25d330ca613a695fbd5c21e830d3804d9c0d7dcd0034ac9085606b41b9"} Mar 18 17:10:20 crc kubenswrapper[4939]: I0318 17:10:20.362727 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59eda7de-6661-418d-80ce-0607e133a218","Type":"ContainerStarted","Data":"7560ee135e1da6a33cd94506f4c2991d841245cd864d3f35e1b0ed8e8b561127"} Mar 18 17:10:20 crc kubenswrapper[4939]: I0318 17:10:20.364142 4939 generic.go:334] "Generic (PLEG): container finished" podID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerID="aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9" exitCode=0 Mar 18 17:10:20 crc kubenswrapper[4939]: I0318 17:10:20.364179 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" event={"ID":"9193eb4a-4546-4c27-a575-05fd5bddacd9","Type":"ContainerDied","Data":"aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9"} Mar 18 17:10:20 crc kubenswrapper[4939]: I0318 17:10:20.364198 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" event={"ID":"9193eb4a-4546-4c27-a575-05fd5bddacd9","Type":"ContainerStarted","Data":"45d914f55ed177f7e0359a1a027ef5b07da276a2cd461b6fc229271f211856af"} Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.373085 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" 
event={"ID":"9193eb4a-4546-4c27-a575-05fd5bddacd9","Type":"ContainerStarted","Data":"f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1"} Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.374635 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.376455 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ff12186-94b7-4301-af56-f6959b82d462","Type":"ContainerStarted","Data":"603817e142630d2851c5b5174dc5a61ec539fc4231700131a5079491288bbd72"} Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.376479 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ff12186-94b7-4301-af56-f6959b82d462","Type":"ContainerStarted","Data":"467b8d7600c33817045c72486f707ade5e572b7012fa358e6f5bc4d0bfbb5fe4"} Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.376590 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-log" containerID="cri-o://467b8d7600c33817045c72486f707ade5e572b7012fa358e6f5bc4d0bfbb5fe4" gracePeriod=30 Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.376832 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-httpd" containerID="cri-o://603817e142630d2851c5b5174dc5a61ec539fc4231700131a5079491288bbd72" gracePeriod=30 Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.386525 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59eda7de-6661-418d-80ce-0607e133a218","Type":"ContainerStarted","Data":"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9"} Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.386731 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59eda7de-6661-418d-80ce-0607e133a218","Type":"ContainerStarted","Data":"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583"} Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.423077 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" podStartSLOduration=3.423058903 podStartE2EDuration="3.423058903s" podCreationTimestamp="2026-03-18 17:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:21.400117092 +0000 UTC m=+5585.999304703" watchObservedRunningTime="2026-03-18 17:10:21.423058903 +0000 UTC m=+5586.022246524" Mar 18 17:10:21 crc kubenswrapper[4939]: I0318 17:10:21.425215 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.425203134 podStartE2EDuration="3.425203134s" podCreationTimestamp="2026-03-18 17:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:21.418542365 +0000 UTC m=+5586.017729986" watchObservedRunningTime="2026-03-18 17:10:21.425203134 +0000 UTC m=+5586.024390755" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.511737 4939 generic.go:334] "Generic (PLEG): 
container finished" podID="4ff12186-94b7-4301-af56-f6959b82d462" containerID="603817e142630d2851c5b5174dc5a61ec539fc4231700131a5079491288bbd72" exitCode=0 Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.512118 4939 generic.go:334] "Generic (PLEG): container finished" podID="4ff12186-94b7-4301-af56-f6959b82d462" containerID="467b8d7600c33817045c72486f707ade5e572b7012fa358e6f5bc4d0bfbb5fe4" exitCode=143 Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.511898 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ff12186-94b7-4301-af56-f6959b82d462","Type":"ContainerDied","Data":"603817e142630d2851c5b5174dc5a61ec539fc4231700131a5079491288bbd72"} Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.512221 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ff12186-94b7-4301-af56-f6959b82d462","Type":"ContainerDied","Data":"467b8d7600c33817045c72486f707ade5e572b7012fa358e6f5bc4d0bfbb5fe4"} Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.757649 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.784887 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.784872238 podStartE2EDuration="4.784872238s" podCreationTimestamp="2026-03-18 17:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:21.457322895 +0000 UTC m=+5586.056510516" watchObservedRunningTime="2026-03-18 17:10:22.784872238 +0000 UTC m=+5587.384059849" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.799481 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-logs\") pod \"4ff12186-94b7-4301-af56-f6959b82d462\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.799569 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-combined-ca-bundle\") pod \"4ff12186-94b7-4301-af56-f6959b82d462\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.799624 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-httpd-run\") pod \"4ff12186-94b7-4301-af56-f6959b82d462\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.799749 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-config-data\") pod \"4ff12186-94b7-4301-af56-f6959b82d462\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.799801 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-scripts\") pod \"4ff12186-94b7-4301-af56-f6959b82d462\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " Mar 18 17:10:22 crc 
kubenswrapper[4939]: I0318 17:10:22.799869 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-ceph\") pod \"4ff12186-94b7-4301-af56-f6959b82d462\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.799940 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f8w8\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-kube-api-access-7f8w8\") pod \"4ff12186-94b7-4301-af56-f6959b82d462\" (UID: \"4ff12186-94b7-4301-af56-f6959b82d462\") " Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.800040 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-logs" (OuterVolumeSpecName: "logs") pod "4ff12186-94b7-4301-af56-f6959b82d462" (UID: "4ff12186-94b7-4301-af56-f6959b82d462"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.800081 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4ff12186-94b7-4301-af56-f6959b82d462" (UID: "4ff12186-94b7-4301-af56-f6959b82d462"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.800368 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.800388 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ff12186-94b7-4301-af56-f6959b82d462-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.806032 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-ceph" (OuterVolumeSpecName: "ceph") pod "4ff12186-94b7-4301-af56-f6959b82d462" (UID: "4ff12186-94b7-4301-af56-f6959b82d462"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.811709 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-scripts" (OuterVolumeSpecName: "scripts") pod "4ff12186-94b7-4301-af56-f6959b82d462" (UID: "4ff12186-94b7-4301-af56-f6959b82d462"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.822166 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-kube-api-access-7f8w8" (OuterVolumeSpecName: "kube-api-access-7f8w8") pod "4ff12186-94b7-4301-af56-f6959b82d462" (UID: "4ff12186-94b7-4301-af56-f6959b82d462"). InnerVolumeSpecName "kube-api-access-7f8w8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.828936 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.859425 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ff12186-94b7-4301-af56-f6959b82d462" (UID: "4ff12186-94b7-4301-af56-f6959b82d462"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.892338 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-config-data" (OuterVolumeSpecName: "config-data") pod "4ff12186-94b7-4301-af56-f6959b82d462" (UID: "4ff12186-94b7-4301-af56-f6959b82d462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.901591 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.901624 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f8w8\" (UniqueName: \"kubernetes.io/projected/4ff12186-94b7-4301-af56-f6959b82d462-kube-api-access-7f8w8\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.901636 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.901645 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:22 crc kubenswrapper[4939]: I0318 17:10:22.901654 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff12186-94b7-4301-af56-f6959b82d462-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.521054 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4ff12186-94b7-4301-af56-f6959b82d462","Type":"ContainerDied","Data":"bebb4c25d330ca613a695fbd5c21e830d3804d9c0d7dcd0034ac9085606b41b9"} Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.521416 4939 scope.go:117] "RemoveContainer" containerID="603817e142630d2851c5b5174dc5a61ec539fc4231700131a5079491288bbd72" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.521112 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-log" containerID="cri-o://171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583" gracePeriod=30 Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.521173 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.521211 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-httpd" containerID="cri-o://8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9" gracePeriod=30 Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.556769 4939 scope.go:117] "RemoveContainer" containerID="467b8d7600c33817045c72486f707ade5e572b7012fa358e6f5bc4d0bfbb5fe4" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.562620 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.572117 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.581980 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:23 crc kubenswrapper[4939]: E0318 17:10:23.582397 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-httpd" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.582416 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-httpd" Mar 18 17:10:23 crc kubenswrapper[4939]: E0318 17:10:23.582440 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-log" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.582448 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-log" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.582684 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-httpd" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.582705 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff12186-94b7-4301-af56-f6959b82d462" containerName="glance-log" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.583827 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.588033 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.598152 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.750720 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.751497 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.751626 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-logs\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.751766 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.751875 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.752057 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64mg8\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-kube-api-access-64mg8\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.752232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.854848 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " 
pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855156 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-logs\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855223 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855262 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855280 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64mg8\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-kube-api-access-64mg8\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855318 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855365 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855715 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-logs\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.855755 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.859343 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.859590 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.860151 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.860368 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.877018 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64mg8\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-kube-api-access-64mg8\") pod \"glance-default-external-api-0\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " pod="openstack/glance-default-external-api-0" Mar 18 17:10:23 crc kubenswrapper[4939]: I0318 17:10:23.968876 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.145722 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff12186-94b7-4301-af56-f6959b82d462" path="/var/lib/kubelet/pods/4ff12186-94b7-4301-af56-f6959b82d462/volumes" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.146196 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.268320 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-scripts\") pod \"59eda7de-6661-418d-80ce-0607e133a218\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.268451 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-combined-ca-bundle\") pod \"59eda7de-6661-418d-80ce-0607e133a218\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.268522 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-config-data\") pod \"59eda7de-6661-418d-80ce-0607e133a218\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.268561 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq5q5\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-kube-api-access-zq5q5\") pod \"59eda7de-6661-418d-80ce-0607e133a218\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.268592 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-httpd-run\") pod \"59eda7de-6661-418d-80ce-0607e133a218\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.268634 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-logs\") pod \"59eda7de-6661-418d-80ce-0607e133a218\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.268656 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-ceph\") pod \"59eda7de-6661-418d-80ce-0607e133a218\" (UID: \"59eda7de-6661-418d-80ce-0607e133a218\") " Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.269125 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "59eda7de-6661-418d-80ce-0607e133a218" (UID: "59eda7de-6661-418d-80ce-0607e133a218"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.269262 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.269461 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-logs" (OuterVolumeSpecName: "logs") pod "59eda7de-6661-418d-80ce-0607e133a218" (UID: "59eda7de-6661-418d-80ce-0607e133a218"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.274110 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-scripts" (OuterVolumeSpecName: "scripts") pod "59eda7de-6661-418d-80ce-0607e133a218" (UID: "59eda7de-6661-418d-80ce-0607e133a218"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.274543 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-kube-api-access-zq5q5" (OuterVolumeSpecName: "kube-api-access-zq5q5") pod "59eda7de-6661-418d-80ce-0607e133a218" (UID: "59eda7de-6661-418d-80ce-0607e133a218"). InnerVolumeSpecName "kube-api-access-zq5q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.275995 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-ceph" (OuterVolumeSpecName: "ceph") pod "59eda7de-6661-418d-80ce-0607e133a218" (UID: "59eda7de-6661-418d-80ce-0607e133a218"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.302580 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59eda7de-6661-418d-80ce-0607e133a218" (UID: "59eda7de-6661-418d-80ce-0607e133a218"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.314224 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-config-data" (OuterVolumeSpecName: "config-data") pod "59eda7de-6661-418d-80ce-0607e133a218" (UID: "59eda7de-6661-418d-80ce-0607e133a218"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.370331 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.370364 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.370375 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59eda7de-6661-418d-80ce-0607e133a218-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.370386 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq5q5\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-kube-api-access-zq5q5\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.370394 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59eda7de-6661-418d-80ce-0607e133a218-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.370402 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/59eda7de-6661-418d-80ce-0607e133a218-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.536358 4939 generic.go:334] "Generic (PLEG): container finished" podID="59eda7de-6661-418d-80ce-0607e133a218" containerID="8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9" exitCode=0 Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.536391 4939 generic.go:334] "Generic (PLEG): container finished" podID="59eda7de-6661-418d-80ce-0607e133a218" containerID="171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583" exitCode=143 Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.536408 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.536413 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59eda7de-6661-418d-80ce-0607e133a218","Type":"ContainerDied","Data":"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9"} Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.536528 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59eda7de-6661-418d-80ce-0607e133a218","Type":"ContainerDied","Data":"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583"} Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.536544 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59eda7de-6661-418d-80ce-0607e133a218","Type":"ContainerDied","Data":"7560ee135e1da6a33cd94506f4c2991d841245cd864d3f35e1b0ed8e8b561127"} Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.536562 4939 scope.go:117] "RemoveContainer" containerID="8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.554051 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.575484 4939 scope.go:117] "RemoveContainer" containerID="171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.577623 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.588622 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.598921 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:24 crc kubenswrapper[4939]: E0318 17:10:24.599712 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-log" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.599810 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-log" Mar 18 17:10:24 crc kubenswrapper[4939]: E0318 17:10:24.599890 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-httpd" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.599996 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-httpd" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.600275 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-httpd" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.600367 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="59eda7de-6661-418d-80ce-0607e133a218" containerName="glance-log" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.601603 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.605004 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.615731 4939 scope.go:117] "RemoveContainer" containerID="8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.616903 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:24 crc kubenswrapper[4939]: E0318 17:10:24.621342 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9\": container with ID starting with 8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9 not found: ID does not exist" containerID="8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.621393 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9"} err="failed to get container status \"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9\": rpc error: code = NotFound desc = could not find container \"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9\": container with ID starting with 8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9 not found: ID does not exist" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.621423 4939 scope.go:117] "RemoveContainer" containerID="171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583" Mar 18 17:10:24 crc kubenswrapper[4939]: E0318 17:10:24.622024 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583\": container with ID starting with 171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583 not found: ID does not exist" containerID="171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.622050 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583"} err="failed to get container status \"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583\": rpc error: code = NotFound desc = could not find container \"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583\": container with ID starting with 171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583 not found: ID does not exist" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.622066 4939 scope.go:117] "RemoveContainer" containerID="8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.623779 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9"} err="failed to get container status \"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9\": rpc error: code = NotFound desc = could not find container \"8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9\": container with ID 
starting with 8dfb80ae7716394eba69f9fb5b00ebdd186f7b77cc71143a48ea7b40fc8643a9 not found: ID does not exist" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.623823 4939 scope.go:117] "RemoveContainer" containerID="171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.624342 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583"} err="failed to get container status \"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583\": rpc error: code = NotFound desc = could not find container \"171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583\": container with ID starting with 171a36ccde01d97e9cb852ea78dc7a51de595511ac3ea39004e1f013813ac583 not found: ID does not exist" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.783949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-logs\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.784028 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv9vh\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-kube-api-access-bv9vh\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.784178 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.784583 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.784651 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.784680 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.784699 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.886395 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.886468 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.886498 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.886538 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.886599 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-logs\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.886631 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9vh\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-kube-api-access-bv9vh\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.886716 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.887485 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.891149 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.891440 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-logs\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.894559 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.895354 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.896096 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.908339 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9vh\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-kube-api-access-bv9vh\") pod \"glance-default-internal-api-0\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:10:24 crc kubenswrapper[4939]: I0318 17:10:24.929320 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:25 crc kubenswrapper[4939]: I0318 17:10:25.442738 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:10:25 crc kubenswrapper[4939]: I0318 17:10:25.563404 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c18dafe7-8cf2-4296-bb22-e2fd12152ddf","Type":"ContainerStarted","Data":"f2a0c57dc83fbe57e495473d4d0c888f542eb05c66959752b736c4347585e1f6"} Mar 18 17:10:25 crc kubenswrapper[4939]: I0318 17:10:25.568007 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e677c8e-d5d5-40d9-94d7-334189284333","Type":"ContainerStarted","Data":"23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945"} Mar 18 17:10:25 crc kubenswrapper[4939]: I0318 17:10:25.568048 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e677c8e-d5d5-40d9-94d7-334189284333","Type":"ContainerStarted","Data":"115de82933ed262f765cdd96cfec7cea4ff58b8ac2235705a22477c58167eab8"} Mar 18 17:10:26 crc kubenswrapper[4939]: I0318 17:10:26.157290 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59eda7de-6661-418d-80ce-0607e133a218" path="/var/lib/kubelet/pods/59eda7de-6661-418d-80ce-0607e133a218/volumes" Mar 18 17:10:26 crc kubenswrapper[4939]: I0318 17:10:26.577769 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e677c8e-d5d5-40d9-94d7-334189284333","Type":"ContainerStarted","Data":"9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375"} Mar 18 17:10:26 crc kubenswrapper[4939]: I0318 17:10:26.580350 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c18dafe7-8cf2-4296-bb22-e2fd12152ddf","Type":"ContainerStarted","Data":"fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b"} Mar 18 17:10:26 crc kubenswrapper[4939]: I0318 17:10:26.605687 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.605666625 podStartE2EDuration="3.605666625s" podCreationTimestamp="2026-03-18 17:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:26.600829927 +0000 UTC m=+5591.200017558" watchObservedRunningTime="2026-03-18 17:10:26.605666625 +0000 UTC m=+5591.204854246" Mar 18 17:10:27 crc kubenswrapper[4939]: I0318 17:10:27.590660 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c18dafe7-8cf2-4296-bb22-e2fd12152ddf","Type":"ContainerStarted","Data":"bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618"} Mar 18 17:10:27 crc kubenswrapper[4939]: I0318 17:10:27.615720 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.615702078 podStartE2EDuration="3.615702078s" podCreationTimestamp="2026-03-18 17:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:27.60591605 +0000 UTC m=+5592.205103681" watchObservedRunningTime="2026-03-18 17:10:27.615702078 +0000 UTC m=+5592.214889699" Mar 18 17:10:29 crc 
kubenswrapper[4939]: I0318 17:10:29.173607 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.227852 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-nnhzk"] Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.228104 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" podUID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerName="dnsmasq-dns" containerID="cri-o://7df96ab182e89ae8cd1ed7cb454b7c0711c60c788d05723b0dbbca95c321e19c" gracePeriod=10 Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.611999 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerID="7df96ab182e89ae8cd1ed7cb454b7c0711c60c788d05723b0dbbca95c321e19c" exitCode=0 Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.612176 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" event={"ID":"ca6647b2-4fde-4b54-8ca0-768948dee0be","Type":"ContainerDied","Data":"7df96ab182e89ae8cd1ed7cb454b7c0711c60c788d05723b0dbbca95c321e19c"} Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.694185 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.879142 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-nb\") pod \"ca6647b2-4fde-4b54-8ca0-768948dee0be\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.879180 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-dns-svc\") pod \"ca6647b2-4fde-4b54-8ca0-768948dee0be\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.879325 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-config\") pod \"ca6647b2-4fde-4b54-8ca0-768948dee0be\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.879390 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-sb\") pod \"ca6647b2-4fde-4b54-8ca0-768948dee0be\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.879409 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52rkt\" (UniqueName: \"kubernetes.io/projected/ca6647b2-4fde-4b54-8ca0-768948dee0be-kube-api-access-52rkt\") pod \"ca6647b2-4fde-4b54-8ca0-768948dee0be\" (UID: \"ca6647b2-4fde-4b54-8ca0-768948dee0be\") " Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.923729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6647b2-4fde-4b54-8ca0-768948dee0be-kube-api-access-52rkt" (OuterVolumeSpecName: "kube-api-access-52rkt") pod "ca6647b2-4fde-4b54-8ca0-768948dee0be" (UID: 
"ca6647b2-4fde-4b54-8ca0-768948dee0be"). InnerVolumeSpecName "kube-api-access-52rkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.925117 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca6647b2-4fde-4b54-8ca0-768948dee0be" (UID: "ca6647b2-4fde-4b54-8ca0-768948dee0be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.926915 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca6647b2-4fde-4b54-8ca0-768948dee0be" (UID: "ca6647b2-4fde-4b54-8ca0-768948dee0be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.943027 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca6647b2-4fde-4b54-8ca0-768948dee0be" (UID: "ca6647b2-4fde-4b54-8ca0-768948dee0be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.946241 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-config" (OuterVolumeSpecName: "config") pod "ca6647b2-4fde-4b54-8ca0-768948dee0be" (UID: "ca6647b2-4fde-4b54-8ca0-768948dee0be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.980806 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.981026 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.981152 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52rkt\" (UniqueName: \"kubernetes.io/projected/ca6647b2-4fde-4b54-8ca0-768948dee0be-kube-api-access-52rkt\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.981231 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:29 crc kubenswrapper[4939]: I0318 17:10:29.981339 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca6647b2-4fde-4b54-8ca0-768948dee0be-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:30 crc kubenswrapper[4939]: I0318 17:10:30.629179 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" event={"ID":"ca6647b2-4fde-4b54-8ca0-768948dee0be","Type":"ContainerDied","Data":"fe8aed7b3e38c59dc274d1c39afcc7391a9227535853375df8ba49eead317a4f"} Mar 18 17:10:30 crc kubenswrapper[4939]: I0318 17:10:30.629241 4939 scope.go:117] "RemoveContainer" containerID="7df96ab182e89ae8cd1ed7cb454b7c0711c60c788d05723b0dbbca95c321e19c" Mar 18 17:10:30 crc kubenswrapper[4939]: I0318 17:10:30.629313 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-nnhzk" Mar 18 17:10:30 crc kubenswrapper[4939]: I0318 17:10:30.649569 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-nnhzk"] Mar 18 17:10:30 crc kubenswrapper[4939]: I0318 17:10:30.656387 4939 scope.go:117] "RemoveContainer" containerID="82363eeeb9e4cb68352def813b5535d5bcfbcaa232ccd17f35126b0bb02055ba" Mar 18 17:10:30 crc kubenswrapper[4939]: I0318 17:10:30.657121 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-nnhzk"] Mar 18 17:10:32 crc kubenswrapper[4939]: I0318 17:10:32.145843 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6647b2-4fde-4b54-8ca0-768948dee0be" path="/var/lib/kubelet/pods/ca6647b2-4fde-4b54-8ca0-768948dee0be/volumes" Mar 18 17:10:33 crc kubenswrapper[4939]: I0318 17:10:33.970421 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 17:10:33 crc kubenswrapper[4939]: I0318 17:10:33.970753 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.023439 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.035908 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.670356 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.670396 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.929549 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.929594 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.960287 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:34 crc kubenswrapper[4939]: I0318 17:10:34.975738 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:35 crc kubenswrapper[4939]: I0318 17:10:35.678133 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:35 crc kubenswrapper[4939]: I0318 17:10:35.678210 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:36 crc kubenswrapper[4939]: I0318 17:10:36.680471 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 17:10:36 crc kubenswrapper[4939]: I0318 17:10:36.681975 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 17:10:37 crc kubenswrapper[4939]: I0318 17:10:37.657613 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Mar 18 17:10:37 crc kubenswrapper[4939]: I0318 17:10:37.696748 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 17:10:37 crc kubenswrapper[4939]: I0318 17:10:37.707592 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.465208 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8fbbj"] Mar 18 17:10:45 crc kubenswrapper[4939]: E0318 17:10:45.466152 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerName="init" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.466170 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerName="init" Mar 18 17:10:45 crc kubenswrapper[4939]: E0318 17:10:45.466194 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerName="dnsmasq-dns" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.466202 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerName="dnsmasq-dns" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.466383 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6647b2-4fde-4b54-8ca0-768948dee0be" containerName="dnsmasq-dns" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.467089 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.478857 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fba0-account-create-update-824gc"] Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.480234 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.482943 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.488838 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8fbbj"] Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.499211 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fba0-account-create-update-824gc"] Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.575899 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glzmm\" (UniqueName: \"kubernetes.io/projected/de0e0d64-19f5-4bce-abb1-8b6344d04681-kube-api-access-glzmm\") pod \"placement-fba0-account-create-update-824gc\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.576421 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a9023-e60c-4234-9691-3dc7cde07c64-operator-scripts\") pod \"placement-db-create-8fbbj\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.576456 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zqb\" (UniqueName: \"kubernetes.io/projected/323a9023-e60c-4234-9691-3dc7cde07c64-kube-api-access-s7zqb\") pod \"placement-db-create-8fbbj\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.576706 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de0e0d64-19f5-4bce-abb1-8b6344d04681-operator-scripts\") pod \"placement-fba0-account-create-update-824gc\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.678613 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glzmm\" (UniqueName: \"kubernetes.io/projected/de0e0d64-19f5-4bce-abb1-8b6344d04681-kube-api-access-glzmm\") pod \"placement-fba0-account-create-update-824gc\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.678760 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a9023-e60c-4234-9691-3dc7cde07c64-operator-scripts\") pod \"placement-db-create-8fbbj\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.678806 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zqb\" (UniqueName: \"kubernetes.io/projected/323a9023-e60c-4234-9691-3dc7cde07c64-kube-api-access-s7zqb\") pod \"placement-db-create-8fbbj\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: 
I0318 17:10:45.678929 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de0e0d64-19f5-4bce-abb1-8b6344d04681-operator-scripts\") pod \"placement-fba0-account-create-update-824gc\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.679521 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a9023-e60c-4234-9691-3dc7cde07c64-operator-scripts\") pod \"placement-db-create-8fbbj\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.680663 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de0e0d64-19f5-4bce-abb1-8b6344d04681-operator-scripts\") pod \"placement-fba0-account-create-update-824gc\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.696202 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zqb\" (UniqueName: \"kubernetes.io/projected/323a9023-e60c-4234-9691-3dc7cde07c64-kube-api-access-s7zqb\") pod \"placement-db-create-8fbbj\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.706245 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glzmm\" (UniqueName: \"kubernetes.io/projected/de0e0d64-19f5-4bce-abb1-8b6344d04681-kube-api-access-glzmm\") pod \"placement-fba0-account-create-update-824gc\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.788193 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:45 crc kubenswrapper[4939]: I0318 17:10:45.797982 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.263555 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fba0-account-create-update-824gc"] Mar 18 17:10:46 crc kubenswrapper[4939]: W0318 17:10:46.265706 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde0e0d64_19f5_4bce_abb1_8b6344d04681.slice/crio-0abb87403930f01e29bd40e1078538a0bc24cbaf5c11fe608ba7f609c2619067 WatchSource:0}: Error finding container 0abb87403930f01e29bd40e1078538a0bc24cbaf5c11fe608ba7f609c2619067: Status 404 returned error can't find the container with id 0abb87403930f01e29bd40e1078538a0bc24cbaf5c11fe608ba7f609c2619067 Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.321325 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8fbbj"] Mar 18 17:10:46 crc kubenswrapper[4939]: W0318 17:10:46.330055 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod323a9023_e60c_4234_9691_3dc7cde07c64.slice/crio-39e390c1e168f83a4c4412b796defbc3296da767c36a373e35f7421e9c9e8c6b WatchSource:0}: Error finding container 39e390c1e168f83a4c4412b796defbc3296da767c36a373e35f7421e9c9e8c6b: Status 404 returned error can't find the container with id 39e390c1e168f83a4c4412b796defbc3296da767c36a373e35f7421e9c9e8c6b Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.783071 4939 generic.go:334] "Generic (PLEG): container finished" podID="323a9023-e60c-4234-9691-3dc7cde07c64" containerID="0551dd988eddc46237cad1fb9190841d60322a8305b60824485b944f3830b0d8" exitCode=0 Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.783121 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8fbbj" event={"ID":"323a9023-e60c-4234-9691-3dc7cde07c64","Type":"ContainerDied","Data":"0551dd988eddc46237cad1fb9190841d60322a8305b60824485b944f3830b0d8"} Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.783351 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8fbbj" event={"ID":"323a9023-e60c-4234-9691-3dc7cde07c64","Type":"ContainerStarted","Data":"39e390c1e168f83a4c4412b796defbc3296da767c36a373e35f7421e9c9e8c6b"} Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.785203 4939 generic.go:334] "Generic (PLEG): container finished" podID="de0e0d64-19f5-4bce-abb1-8b6344d04681" containerID="b22fa23240c312f5f2d23a5baf604492b3cab3b9ddfaa45e3045b84b3d8ae534" exitCode=0 Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.785240 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fba0-account-create-update-824gc" event={"ID":"de0e0d64-19f5-4bce-abb1-8b6344d04681","Type":"ContainerDied","Data":"b22fa23240c312f5f2d23a5baf604492b3cab3b9ddfaa45e3045b84b3d8ae534"} Mar 18 17:10:46 crc kubenswrapper[4939]: I0318 17:10:46.785262 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fba0-account-create-update-824gc" event={"ID":"de0e0d64-19f5-4bce-abb1-8b6344d04681","Type":"ContainerStarted","Data":"0abb87403930f01e29bd40e1078538a0bc24cbaf5c11fe608ba7f609c2619067"} Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.294890 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.303552 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.424856 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a9023-e60c-4234-9691-3dc7cde07c64-operator-scripts\") pod \"323a9023-e60c-4234-9691-3dc7cde07c64\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.424896 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glzmm\" (UniqueName: \"kubernetes.io/projected/de0e0d64-19f5-4bce-abb1-8b6344d04681-kube-api-access-glzmm\") pod \"de0e0d64-19f5-4bce-abb1-8b6344d04681\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.425006 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de0e0d64-19f5-4bce-abb1-8b6344d04681-operator-scripts\") pod \"de0e0d64-19f5-4bce-abb1-8b6344d04681\" (UID: \"de0e0d64-19f5-4bce-abb1-8b6344d04681\") " Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.425100 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zqb\" (UniqueName: \"kubernetes.io/projected/323a9023-e60c-4234-9691-3dc7cde07c64-kube-api-access-s7zqb\") pod \"323a9023-e60c-4234-9691-3dc7cde07c64\" (UID: \"323a9023-e60c-4234-9691-3dc7cde07c64\") " Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.425999 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/323a9023-e60c-4234-9691-3dc7cde07c64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "323a9023-e60c-4234-9691-3dc7cde07c64" (UID: "323a9023-e60c-4234-9691-3dc7cde07c64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.426064 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de0e0d64-19f5-4bce-abb1-8b6344d04681-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de0e0d64-19f5-4bce-abb1-8b6344d04681" (UID: "de0e0d64-19f5-4bce-abb1-8b6344d04681"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.430762 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de0e0d64-19f5-4bce-abb1-8b6344d04681-kube-api-access-glzmm" (OuterVolumeSpecName: "kube-api-access-glzmm") pod "de0e0d64-19f5-4bce-abb1-8b6344d04681" (UID: "de0e0d64-19f5-4bce-abb1-8b6344d04681"). InnerVolumeSpecName "kube-api-access-glzmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.431192 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323a9023-e60c-4234-9691-3dc7cde07c64-kube-api-access-s7zqb" (OuterVolumeSpecName: "kube-api-access-s7zqb") pod "323a9023-e60c-4234-9691-3dc7cde07c64" (UID: "323a9023-e60c-4234-9691-3dc7cde07c64"). InnerVolumeSpecName "kube-api-access-s7zqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.528110 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323a9023-e60c-4234-9691-3dc7cde07c64-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.528162 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glzmm\" (UniqueName: \"kubernetes.io/projected/de0e0d64-19f5-4bce-abb1-8b6344d04681-kube-api-access-glzmm\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.528181 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de0e0d64-19f5-4bce-abb1-8b6344d04681-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.528195 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zqb\" (UniqueName: \"kubernetes.io/projected/323a9023-e60c-4234-9691-3dc7cde07c64-kube-api-access-s7zqb\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.805121 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fba0-account-create-update-824gc" event={"ID":"de0e0d64-19f5-4bce-abb1-8b6344d04681","Type":"ContainerDied","Data":"0abb87403930f01e29bd40e1078538a0bc24cbaf5c11fe608ba7f609c2619067"} Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.805429 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abb87403930f01e29bd40e1078538a0bc24cbaf5c11fe608ba7f609c2619067" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.805165 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fba0-account-create-update-824gc" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.807696 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8fbbj" event={"ID":"323a9023-e60c-4234-9691-3dc7cde07c64","Type":"ContainerDied","Data":"39e390c1e168f83a4c4412b796defbc3296da767c36a373e35f7421e9c9e8c6b"} Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.807724 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e390c1e168f83a4c4412b796defbc3296da767c36a373e35f7421e9c9e8c6b" Mar 18 17:10:48 crc kubenswrapper[4939]: I0318 17:10:48.807759 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8fbbj" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.838587 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-wlhm9"] Mar 18 17:10:50 crc kubenswrapper[4939]: E0318 17:10:50.839343 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323a9023-e60c-4234-9691-3dc7cde07c64" containerName="mariadb-database-create" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.839365 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="323a9023-e60c-4234-9691-3dc7cde07c64" containerName="mariadb-database-create" Mar 18 17:10:50 crc kubenswrapper[4939]: E0318 17:10:50.839382 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de0e0d64-19f5-4bce-abb1-8b6344d04681" containerName="mariadb-account-create-update" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.839392 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0e0d64-19f5-4bce-abb1-8b6344d04681" containerName="mariadb-account-create-update" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.839617 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="323a9023-e60c-4234-9691-3dc7cde07c64" containerName="mariadb-database-create" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.839653 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="de0e0d64-19f5-4bce-abb1-8b6344d04681" containerName="mariadb-account-create-update" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.840758 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.852031 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-znt4w"] Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.853455 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.855161 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.855797 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7gfxv" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.857843 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.865738 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-wlhm9"] Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.875170 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-znt4w"] Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976067 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-nb\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976265 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-sb\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976308 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-scripts\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976356 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-dns-svc\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976471 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-config\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976696 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsjj\" (UniqueName: \"kubernetes.io/projected/c1852a30-ade7-4a7a-a96c-a8098b875b30-kube-api-access-kdsjj\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976857 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7ndx\" (UniqueName: 
\"kubernetes.io/projected/da8231f7-455b-4a7f-98bb-2cde842a5332-kube-api-access-x7ndx\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.976960 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1852a30-ade7-4a7a-a96c-a8098b875b30-logs\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.977171 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-config-data\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:50 crc kubenswrapper[4939]: I0318 17:10:50.977232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-combined-ca-bundle\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079439 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-config\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079540 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsjj\" (UniqueName: \"kubernetes.io/projected/c1852a30-ade7-4a7a-a96c-a8098b875b30-kube-api-access-kdsjj\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079575 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7ndx\" (UniqueName: \"kubernetes.io/projected/da8231f7-455b-4a7f-98bb-2cde842a5332-kube-api-access-x7ndx\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079600 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1852a30-ade7-4a7a-a96c-a8098b875b30-logs\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079653 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-config-data\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079674 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-combined-ca-bundle\") pod 
\"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079722 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-nb\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079751 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-sb\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079769 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-scripts\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.079789 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-dns-svc\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.080578 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-dns-svc\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.080665 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-config\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.081074 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1852a30-ade7-4a7a-a96c-a8098b875b30-logs\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.082150 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-nb\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.082774 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-sb\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.086028 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-config-data\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.087957 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-scripts\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.088458 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-combined-ca-bundle\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.100068 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsjj\" (UniqueName: \"kubernetes.io/projected/c1852a30-ade7-4a7a-a96c-a8098b875b30-kube-api-access-kdsjj\") pod \"placement-db-sync-znt4w\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.101125 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7ndx\" (UniqueName: \"kubernetes.io/projected/da8231f7-455b-4a7f-98bb-2cde842a5332-kube-api-access-x7ndx\") pod \"dnsmasq-dns-6977c95bf9-wlhm9\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.175231 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.192576 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.696884 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-wlhm9"] Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.840565 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-znt4w"] Mar 18 17:10:51 crc kubenswrapper[4939]: I0318 17:10:51.856186 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" event={"ID":"da8231f7-455b-4a7f-98bb-2cde842a5332","Type":"ContainerStarted","Data":"abc8edacabd61c8aa5720df5426eb0c7da33e8307bd14d71f0f7e08cc410362b"} Mar 18 17:10:52 crc kubenswrapper[4939]: I0318 17:10:52.864057 4939 generic.go:334] "Generic (PLEG): container finished" podID="da8231f7-455b-4a7f-98bb-2cde842a5332" containerID="309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41" exitCode=0 Mar 18 17:10:52 crc kubenswrapper[4939]: I0318 17:10:52.864165 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" event={"ID":"da8231f7-455b-4a7f-98bb-2cde842a5332","Type":"ContainerDied","Data":"309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41"} Mar 18 17:10:52 crc kubenswrapper[4939]: I0318 17:10:52.865854 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-znt4w" event={"ID":"c1852a30-ade7-4a7a-a96c-a8098b875b30","Type":"ContainerStarted","Data":"8ba491de89683c345bb160686ba7f82be7d6682d7cbbde6b4a07d7ca73dfa37e"} Mar 18 17:10:52 crc kubenswrapper[4939]: I0318 17:10:52.865902 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-znt4w" event={"ID":"c1852a30-ade7-4a7a-a96c-a8098b875b30","Type":"ContainerStarted","Data":"d295da2cf25d2b598cf0a3ad5d19a878f385e9625c3f966f0c6d10be8119e57f"} Mar 18 17:10:52 crc kubenswrapper[4939]: I0318 17:10:52.903381 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-znt4w" podStartSLOduration=2.9033433669999997 podStartE2EDuration="2.903343367s" podCreationTimestamp="2026-03-18 17:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:52.902101822 +0000 UTC m=+5617.501289483" watchObservedRunningTime="2026-03-18 17:10:52.903343367 +0000 UTC m=+5617.502530988" Mar 18 17:10:53 crc kubenswrapper[4939]: I0318 17:10:53.877436 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" event={"ID":"da8231f7-455b-4a7f-98bb-2cde842a5332","Type":"ContainerStarted","Data":"32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134"} Mar 18 17:10:53 crc kubenswrapper[4939]: I0318 17:10:53.877844 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:10:53 crc kubenswrapper[4939]: I0318 17:10:53.880422 4939 generic.go:334] "Generic (PLEG): container finished" podID="c1852a30-ade7-4a7a-a96c-a8098b875b30" containerID="8ba491de89683c345bb160686ba7f82be7d6682d7cbbde6b4a07d7ca73dfa37e" exitCode=0 Mar 18 17:10:53 crc kubenswrapper[4939]: I0318 17:10:53.880456 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-znt4w" event={"ID":"c1852a30-ade7-4a7a-a96c-a8098b875b30","Type":"ContainerDied","Data":"8ba491de89683c345bb160686ba7f82be7d6682d7cbbde6b4a07d7ca73dfa37e"} Mar 18 17:10:53 crc 
kubenswrapper[4939]: I0318 17:10:53.911920 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" podStartSLOduration=3.911889137 podStartE2EDuration="3.911889137s" podCreationTimestamp="2026-03-18 17:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:53.902364147 +0000 UTC m=+5618.501551768" watchObservedRunningTime="2026-03-18 17:10:53.911889137 +0000 UTC m=+5618.511076788" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.327531 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.384948 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1852a30-ade7-4a7a-a96c-a8098b875b30-logs\") pod \"c1852a30-ade7-4a7a-a96c-a8098b875b30\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.385027 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-combined-ca-bundle\") pod \"c1852a30-ade7-4a7a-a96c-a8098b875b30\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.385095 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdsjj\" (UniqueName: \"kubernetes.io/projected/c1852a30-ade7-4a7a-a96c-a8098b875b30-kube-api-access-kdsjj\") pod \"c1852a30-ade7-4a7a-a96c-a8098b875b30\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.385181 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-config-data\") pod \"c1852a30-ade7-4a7a-a96c-a8098b875b30\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.385303 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-scripts\") pod \"c1852a30-ade7-4a7a-a96c-a8098b875b30\" (UID: \"c1852a30-ade7-4a7a-a96c-a8098b875b30\") " Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.388969 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1852a30-ade7-4a7a-a96c-a8098b875b30-logs" (OuterVolumeSpecName: "logs") pod "c1852a30-ade7-4a7a-a96c-a8098b875b30" (UID: "c1852a30-ade7-4a7a-a96c-a8098b875b30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.396682 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1852a30-ade7-4a7a-a96c-a8098b875b30-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.440172 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1852a30-ade7-4a7a-a96c-a8098b875b30-kube-api-access-kdsjj" (OuterVolumeSpecName: "kube-api-access-kdsjj") pod "c1852a30-ade7-4a7a-a96c-a8098b875b30" (UID: "c1852a30-ade7-4a7a-a96c-a8098b875b30"). InnerVolumeSpecName "kube-api-access-kdsjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.443556 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-scripts" (OuterVolumeSpecName: "scripts") pod "c1852a30-ade7-4a7a-a96c-a8098b875b30" (UID: "c1852a30-ade7-4a7a-a96c-a8098b875b30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.449022 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1852a30-ade7-4a7a-a96c-a8098b875b30" (UID: "c1852a30-ade7-4a7a-a96c-a8098b875b30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.458694 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-config-data" (OuterVolumeSpecName: "config-data") pod "c1852a30-ade7-4a7a-a96c-a8098b875b30" (UID: "c1852a30-ade7-4a7a-a96c-a8098b875b30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.459407 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2r4wm"] Mar 18 17:10:55 crc kubenswrapper[4939]: E0318 17:10:55.460930 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1852a30-ade7-4a7a-a96c-a8098b875b30" containerName="placement-db-sync" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.460958 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1852a30-ade7-4a7a-a96c-a8098b875b30" containerName="placement-db-sync" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.461417 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1852a30-ade7-4a7a-a96c-a8098b875b30" containerName="placement-db-sync" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.464235 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.486024 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2r4wm"] Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.500493 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.500613 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdsjj\" (UniqueName: \"kubernetes.io/projected/c1852a30-ade7-4a7a-a96c-a8098b875b30-kube-api-access-kdsjj\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.500630 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.500642 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1852a30-ade7-4a7a-a96c-a8098b875b30-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.601993 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-catalog-content\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.602056 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-utilities\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.602078 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmkp6\" (UniqueName: \"kubernetes.io/projected/496b72d9-1f33-4d5e-a2df-afc1716da61c-kube-api-access-gmkp6\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.704309 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-catalog-content\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.704379 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-utilities\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.704416 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmkp6\" (UniqueName: 
\"kubernetes.io/projected/496b72d9-1f33-4d5e-a2df-afc1716da61c-kube-api-access-gmkp6\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.705085 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-catalog-content\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.705158 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-utilities\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.745733 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmkp6\" (UniqueName: \"kubernetes.io/projected/496b72d9-1f33-4d5e-a2df-afc1716da61c-kube-api-access-gmkp6\") pod \"redhat-operators-2r4wm\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.819388 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.898645 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-znt4w" event={"ID":"c1852a30-ade7-4a7a-a96c-a8098b875b30","Type":"ContainerDied","Data":"d295da2cf25d2b598cf0a3ad5d19a878f385e9625c3f966f0c6d10be8119e57f"} Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.898979 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d295da2cf25d2b598cf0a3ad5d19a878f385e9625c3f966f0c6d10be8119e57f" Mar 18 17:10:55 crc kubenswrapper[4939]: I0318 17:10:55.898861 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-znt4w" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.011495 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5c5dbfc5cb-x2dfn"] Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.012836 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.021158 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7gfxv" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.021323 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.021429 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.026636 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c5dbfc5cb-x2dfn"] Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.113148 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-logs\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.113238 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-combined-ca-bundle\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.113283 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v84tw\" (UniqueName: \"kubernetes.io/projected/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-kube-api-access-v84tw\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.113335 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-scripts\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.113356 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-config-data\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.214490 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v84tw\" (UniqueName: \"kubernetes.io/projected/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-kube-api-access-v84tw\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.214848 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-scripts\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 
17:10:56.214877 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-config-data\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.214924 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-logs\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.214994 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-combined-ca-bundle\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.215464 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-logs\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.219603 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-scripts\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.220579 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-combined-ca-bundle\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.220806 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-config-data\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.232198 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v84tw\" (UniqueName: \"kubernetes.io/projected/fac1b0c6-6b39-4db2-9a6c-2429e3e17e52-kube-api-access-v84tw\") pod \"placement-5c5dbfc5cb-x2dfn\" (UID: \"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52\") " pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.349207 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.349595 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2r4wm"] Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.844365 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c5dbfc5cb-x2dfn"] Mar 18 17:10:56 crc kubenswrapper[4939]: W0318 17:10:56.856071 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfac1b0c6_6b39_4db2_9a6c_2429e3e17e52.slice/crio-a6122d981a3acbfe9c18a84ecd76caebd7b0c2ab9c879da62639d09f10cdbca0 WatchSource:0}: Error finding container a6122d981a3acbfe9c18a84ecd76caebd7b0c2ab9c879da62639d09f10cdbca0: Status 404 returned error can't find the container with id a6122d981a3acbfe9c18a84ecd76caebd7b0c2ab9c879da62639d09f10cdbca0 Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.909664 4939 generic.go:334] "Generic (PLEG): container finished" podID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerID="18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7" exitCode=0 Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.909710 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r4wm" event={"ID":"496b72d9-1f33-4d5e-a2df-afc1716da61c","Type":"ContainerDied","Data":"18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7"} Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.909759 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r4wm" event={"ID":"496b72d9-1f33-4d5e-a2df-afc1716da61c","Type":"ContainerStarted","Data":"7f8bc865afc442452d588fc68c5c19319f21f921280cfd383f177aeffabd8367"} Mar 18 17:10:56 crc kubenswrapper[4939]: I0318 17:10:56.913315 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c5dbfc5cb-x2dfn" event={"ID":"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52","Type":"ContainerStarted","Data":"a6122d981a3acbfe9c18a84ecd76caebd7b0c2ab9c879da62639d09f10cdbca0"} Mar 18 17:10:57 crc kubenswrapper[4939]: I0318 17:10:57.923619 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r4wm" event={"ID":"496b72d9-1f33-4d5e-a2df-afc1716da61c","Type":"ContainerStarted","Data":"42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2"} Mar 18 17:10:57 crc kubenswrapper[4939]: I0318 17:10:57.925871 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c5dbfc5cb-x2dfn" event={"ID":"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52","Type":"ContainerStarted","Data":"5c8b7de480d14610ec80ca81b0ca50bf7d19426b29f2ed9707cab5aad1face2c"} Mar 18 17:10:57 crc kubenswrapper[4939]: I0318 17:10:57.925903 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c5dbfc5cb-x2dfn" event={"ID":"fac1b0c6-6b39-4db2-9a6c-2429e3e17e52","Type":"ContainerStarted","Data":"8eac07204c1f2ce35414fc0837f35a3ea5f0c65768979b3bfea219d8fb7f6138"} Mar 18 17:10:57 crc kubenswrapper[4939]: I0318 17:10:57.926413 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:57 crc kubenswrapper[4939]: I0318 17:10:57.926449 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:10:57 crc kubenswrapper[4939]: I0318 17:10:57.962029 4939 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/placement-5c5dbfc5cb-x2dfn" podStartSLOduration=2.962004201 podStartE2EDuration="2.962004201s" podCreationTimestamp="2026-03-18 17:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:57.959285494 +0000 UTC m=+5622.558473115" watchObservedRunningTime="2026-03-18 17:10:57.962004201 +0000 UTC m=+5622.561191812" Mar 18 17:10:58 crc kubenswrapper[4939]: I0318 17:10:58.936850 4939 generic.go:334] "Generic (PLEG): container finished" podID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerID="42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2" exitCode=0 Mar 18 17:10:58 crc kubenswrapper[4939]: I0318 17:10:58.936929 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r4wm" event={"ID":"496b72d9-1f33-4d5e-a2df-afc1716da61c","Type":"ContainerDied","Data":"42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2"} Mar 18 17:10:59 crc kubenswrapper[4939]: I0318 17:10:59.946575 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r4wm" event={"ID":"496b72d9-1f33-4d5e-a2df-afc1716da61c","Type":"ContainerStarted","Data":"b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94"} Mar 18 17:10:59 crc kubenswrapper[4939]: I0318 17:10:59.965659 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2r4wm" podStartSLOduration=2.319480138 podStartE2EDuration="4.96563799s" podCreationTimestamp="2026-03-18 17:10:55 +0000 UTC" firstStartedPulling="2026-03-18 17:10:56.913434785 +0000 UTC m=+5621.512622396" lastFinishedPulling="2026-03-18 17:10:59.559592637 +0000 UTC m=+5624.158780248" observedRunningTime="2026-03-18 17:10:59.961590155 +0000 UTC m=+5624.560777776" watchObservedRunningTime="2026-03-18 17:10:59.96563799 +0000 UTC m=+5624.564825611" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.036422 4939 scope.go:117] "RemoveContainer" containerID="59c94bfa1ffe712f4d5ea350da6ddad9bcf5b6086d2564361780aed58f93bac0" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.053584 4939 scope.go:117] "RemoveContainer" containerID="283b523446ee5bfe06bc81b94086b4e5cf135e2a7a17d84c2afcad944d512453" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.095426 4939 scope.go:117] "RemoveContainer" containerID="d4a058bea7da8f9cba51cbbc5120786772d44ff98761baa7ce5ff1c31fc83830" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.157534 4939 scope.go:117] "RemoveContainer" containerID="46c790e2cab895c66874bedb067cba29b1d3ddc88748a4dcc5ff1d0bda7a98f6" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.176720 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.243197 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx"] Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.243779 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" podUID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerName="dnsmasq-dns" containerID="cri-o://f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1" gracePeriod=10 Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.678188 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.712020 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-nb\") pod \"9193eb4a-4546-4c27-a575-05fd5bddacd9\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.712071 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-sb\") pod \"9193eb4a-4546-4c27-a575-05fd5bddacd9\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.712148 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-dns-svc\") pod \"9193eb4a-4546-4c27-a575-05fd5bddacd9\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.712208 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-config\") pod \"9193eb4a-4546-4c27-a575-05fd5bddacd9\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.712326 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx4zk\" (UniqueName: \"kubernetes.io/projected/9193eb4a-4546-4c27-a575-05fd5bddacd9-kube-api-access-dx4zk\") pod \"9193eb4a-4546-4c27-a575-05fd5bddacd9\" (UID: \"9193eb4a-4546-4c27-a575-05fd5bddacd9\") " Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.725230 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9193eb4a-4546-4c27-a575-05fd5bddacd9-kube-api-access-dx4zk" (OuterVolumeSpecName: "kube-api-access-dx4zk") pod "9193eb4a-4546-4c27-a575-05fd5bddacd9" (UID: "9193eb4a-4546-4c27-a575-05fd5bddacd9"). InnerVolumeSpecName "kube-api-access-dx4zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.760077 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-config" (OuterVolumeSpecName: "config") pod "9193eb4a-4546-4c27-a575-05fd5bddacd9" (UID: "9193eb4a-4546-4c27-a575-05fd5bddacd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.762520 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9193eb4a-4546-4c27-a575-05fd5bddacd9" (UID: "9193eb4a-4546-4c27-a575-05fd5bddacd9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.768243 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9193eb4a-4546-4c27-a575-05fd5bddacd9" (UID: "9193eb4a-4546-4c27-a575-05fd5bddacd9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.768882 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9193eb4a-4546-4c27-a575-05fd5bddacd9" (UID: "9193eb4a-4546-4c27-a575-05fd5bddacd9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.814027 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx4zk\" (UniqueName: \"kubernetes.io/projected/9193eb4a-4546-4c27-a575-05fd5bddacd9-kube-api-access-dx4zk\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.814059 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.814069 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.814078 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.814088 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9193eb4a-4546-4c27-a575-05fd5bddacd9-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.963350 4939 generic.go:334] "Generic (PLEG): container finished" podID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerID="f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1" exitCode=0 Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.963433 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" event={"ID":"9193eb4a-4546-4c27-a575-05fd5bddacd9","Type":"ContainerDied","Data":"f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1"} Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.963443 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.963467 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx" event={"ID":"9193eb4a-4546-4c27-a575-05fd5bddacd9","Type":"ContainerDied","Data":"45d914f55ed177f7e0359a1a027ef5b07da276a2cd461b6fc229271f211856af"} Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.963497 4939 scope.go:117] "RemoveContainer" containerID="f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.991064 4939 scope.go:117] "RemoveContainer" containerID="aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9" Mar 18 17:11:01 crc kubenswrapper[4939]: I0318 17:11:01.999784 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx"] Mar 18 17:11:02 crc kubenswrapper[4939]: I0318 17:11:02.007806 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-ljhbx"] Mar 18 17:11:02 crc kubenswrapper[4939]: I0318 17:11:02.028338 4939 scope.go:117] "RemoveContainer" containerID="f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1" Mar 18 17:11:02 crc kubenswrapper[4939]: E0318 17:11:02.028788 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1\": container with ID starting with f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1 not found: ID does not exist" containerID="f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1" Mar 18 17:11:02 crc kubenswrapper[4939]: I0318 17:11:02.028830 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1"} err="failed to get container status \"f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1\": rpc error: code = NotFound desc = could not find container \"f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1\": container with ID starting with f5172045feebd96e2c848637b9a6a92c9766386a88bd50e92b0559bbaf01f1b1 not found: ID does not exist" Mar 18 17:11:02 crc kubenswrapper[4939]: I0318 17:11:02.028855 4939 scope.go:117] "RemoveContainer" containerID="aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9" Mar 18 17:11:02 crc kubenswrapper[4939]: E0318 17:11:02.029143 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9\": container with ID starting with aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9 not found: ID does not exist" containerID="aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9" Mar 18 17:11:02 crc kubenswrapper[4939]: I0318 17:11:02.029186 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9"} err="failed to get container status \"aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9\": rpc error: code = NotFound desc = could not find container \"aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9\": container with ID starting with aa4931617264f5b20f08e039154433cdadeebcdbb953c3651ce4e42707d7bed9 not found: ID does not exist" Mar 18 
17:11:02 crc kubenswrapper[4939]: I0318 17:11:02.144061 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9193eb4a-4546-4c27-a575-05fd5bddacd9" path="/var/lib/kubelet/pods/9193eb4a-4546-4c27-a575-05fd5bddacd9/volumes" Mar 18 17:11:05 crc kubenswrapper[4939]: I0318 17:11:05.820375 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:11:05 crc kubenswrapper[4939]: I0318 17:11:05.820781 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:11:06 crc kubenswrapper[4939]: I0318 17:11:06.867148 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2r4wm" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="registry-server" probeResult="failure" output=< Mar 18 17:11:06 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:11:06 crc kubenswrapper[4939]: > Mar 18 17:11:15 crc kubenswrapper[4939]: I0318 17:11:15.867828 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:11:15 crc kubenswrapper[4939]: I0318 17:11:15.922152 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:11:16 crc kubenswrapper[4939]: I0318 17:11:16.113088 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2r4wm"] Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.105595 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2r4wm" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="registry-server" containerID="cri-o://b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94" gracePeriod=2 Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.672301 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.796934 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmkp6\" (UniqueName: \"kubernetes.io/projected/496b72d9-1f33-4d5e-a2df-afc1716da61c-kube-api-access-gmkp6\") pod \"496b72d9-1f33-4d5e-a2df-afc1716da61c\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.797254 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-catalog-content\") pod \"496b72d9-1f33-4d5e-a2df-afc1716da61c\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.797413 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-utilities\") pod \"496b72d9-1f33-4d5e-a2df-afc1716da61c\" (UID: \"496b72d9-1f33-4d5e-a2df-afc1716da61c\") " Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.798463 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-utilities" (OuterVolumeSpecName: "utilities") pod "496b72d9-1f33-4d5e-a2df-afc1716da61c" (UID: "496b72d9-1f33-4d5e-a2df-afc1716da61c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.809789 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496b72d9-1f33-4d5e-a2df-afc1716da61c-kube-api-access-gmkp6" (OuterVolumeSpecName: "kube-api-access-gmkp6") pod "496b72d9-1f33-4d5e-a2df-afc1716da61c" (UID: "496b72d9-1f33-4d5e-a2df-afc1716da61c"). InnerVolumeSpecName "kube-api-access-gmkp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.899701 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.899742 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmkp6\" (UniqueName: \"kubernetes.io/projected/496b72d9-1f33-4d5e-a2df-afc1716da61c-kube-api-access-gmkp6\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:17 crc kubenswrapper[4939]: I0318 17:11:17.956348 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "496b72d9-1f33-4d5e-a2df-afc1716da61c" (UID: "496b72d9-1f33-4d5e-a2df-afc1716da61c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.003102 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496b72d9-1f33-4d5e-a2df-afc1716da61c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.117861 4939 generic.go:334] "Generic (PLEG): container finished" podID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerID="b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94" exitCode=0 Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.117908 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r4wm" event={"ID":"496b72d9-1f33-4d5e-a2df-afc1716da61c","Type":"ContainerDied","Data":"b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94"} Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.117937 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r4wm" event={"ID":"496b72d9-1f33-4d5e-a2df-afc1716da61c","Type":"ContainerDied","Data":"7f8bc865afc442452d588fc68c5c19319f21f921280cfd383f177aeffabd8367"} Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.117957 4939 scope.go:117] "RemoveContainer" containerID="b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.118089 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2r4wm" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.148548 4939 scope.go:117] "RemoveContainer" containerID="42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.175393 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2r4wm"] Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.183411 4939 scope.go:117] "RemoveContainer" containerID="18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.185694 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2r4wm"] Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.223284 4939 scope.go:117] "RemoveContainer" containerID="b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94" Mar 18 17:11:18 crc kubenswrapper[4939]: E0318 17:11:18.224012 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94\": container with ID starting with b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94 not found: ID does not exist" containerID="b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.224075 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94"} err="failed to get container status \"b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94\": rpc error: code = NotFound desc = could not find container \"b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94\": container with ID starting with b8cdc6362bfe7fac43eee1dbe927da4de67bcfff9e63ce1421c10507c5c00b94 not found: ID does not exist" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.224112 4939 scope.go:117] "RemoveContainer" containerID="42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2" Mar 18 17:11:18 crc kubenswrapper[4939]: E0318 17:11:18.224605 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2\": container with ID starting with 42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2 not found: ID does not exist" containerID="42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.224640 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2"} err="failed to get container status \"42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2\": rpc error: code = NotFound desc = could not find container \"42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2\": container with ID starting with 42bc52e84edffc389122ef9193b4441be05b04ea766053c3f5795f761a9d86c2 not found: ID does not exist" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.224661 4939 scope.go:117] "RemoveContainer" containerID="18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7" Mar 18 17:11:18 crc kubenswrapper[4939]: E0318 17:11:18.225009 4939 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7\": container with ID starting with 18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7 not found: ID does not exist" containerID="18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7" Mar 18 17:11:18 crc kubenswrapper[4939]: I0318 17:11:18.225035 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7"} err="failed to get container status \"18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7\": rpc error: code = NotFound desc = could not find container \"18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7\": container with ID starting with 18e708776e78faf94c93cbf857a9fc77208cce9806c00b57d23c187b4b5812c7 not found: ID does not exist" Mar 18 17:11:20 crc kubenswrapper[4939]: I0318 17:11:20.153758 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" path="/var/lib/kubelet/pods/496b72d9-1f33-4d5e-a2df-afc1716da61c/volumes" Mar 18 17:11:27 crc kubenswrapper[4939]: I0318 17:11:27.500995 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:11:27 crc kubenswrapper[4939]: I0318 17:11:27.510796 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5c5dbfc5cb-x2dfn" Mar 18 17:11:41 crc kubenswrapper[4939]: E0318 17:11:41.025804 4939 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.227:43092->38.102.83.227:41597: write tcp 38.102.83.227:43092->38.102.83.227:41597: write: broken pipe Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.394900 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wgltd"] Mar 18 17:11:47 crc kubenswrapper[4939]: E0318 17:11:47.395759 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerName="dnsmasq-dns" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.395769 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerName="dnsmasq-dns" Mar 18 17:11:47 crc kubenswrapper[4939]: E0318 17:11:47.395784 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="extract-content" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.395790 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="extract-content" Mar 18 17:11:47 crc kubenswrapper[4939]: E0318 17:11:47.395802 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="registry-server" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.395808 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="registry-server" Mar 18 17:11:47 crc kubenswrapper[4939]: E0318 17:11:47.395829 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerName="init" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.395835 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerName="init" Mar 18 17:11:47 crc 
kubenswrapper[4939]: E0318 17:11:47.395844 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="extract-utilities" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.395849 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="extract-utilities" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.396017 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="496b72d9-1f33-4d5e-a2df-afc1716da61c" containerName="registry-server" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.396048 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9193eb4a-4546-4c27-a575-05fd5bddacd9" containerName="dnsmasq-dns" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.396612 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.406817 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wgltd"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.444704 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j574z\" (UniqueName: \"kubernetes.io/projected/dd432bf0-b938-41fe-93d2-58613baa56aa-kube-api-access-j574z\") pod \"nova-api-db-create-wgltd\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.444949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd432bf0-b938-41fe-93d2-58613baa56aa-operator-scripts\") pod \"nova-api-db-create-wgltd\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.547275 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd432bf0-b938-41fe-93d2-58613baa56aa-operator-scripts\") pod \"nova-api-db-create-wgltd\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.547413 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j574z\" (UniqueName: \"kubernetes.io/projected/dd432bf0-b938-41fe-93d2-58613baa56aa-kube-api-access-j574z\") pod \"nova-api-db-create-wgltd\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.548114 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd432bf0-b938-41fe-93d2-58613baa56aa-operator-scripts\") pod \"nova-api-db-create-wgltd\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.576374 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j574z\" (UniqueName: \"kubernetes.io/projected/dd432bf0-b938-41fe-93d2-58613baa56aa-kube-api-access-j574z\") pod \"nova-api-db-create-wgltd\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.621659 4939 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6a0e-account-create-update-v88nc"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.623093 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.626553 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.632649 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9glh5"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.633971 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.640098 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9glh5"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.648705 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a95dc7-b051-4cb1-986f-7f85876e0168-operator-scripts\") pod \"nova-api-6a0e-account-create-update-v88nc\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.648833 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8gsc\" (UniqueName: \"kubernetes.io/projected/97ef70d1-94b3-4941-adda-70d5d62b25a5-kube-api-access-x8gsc\") pod \"nova-cell0-db-create-9glh5\" (UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.648879 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ef70d1-94b3-4941-adda-70d5d62b25a5-operator-scripts\") pod \"nova-cell0-db-create-9glh5\" (UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.648936 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tpf\" (UniqueName: \"kubernetes.io/projected/38a95dc7-b051-4cb1-986f-7f85876e0168-kube-api-access-v6tpf\") pod \"nova-api-6a0e-account-create-update-v88nc\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.649163 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6a0e-account-create-update-v88nc"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.750759 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8gsc\" (UniqueName: \"kubernetes.io/projected/97ef70d1-94b3-4941-adda-70d5d62b25a5-kube-api-access-x8gsc\") pod \"nova-cell0-db-create-9glh5\" (UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.750852 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ef70d1-94b3-4941-adda-70d5d62b25a5-operator-scripts\") pod \"nova-cell0-db-create-9glh5\" 
(UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.750896 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tpf\" (UniqueName: \"kubernetes.io/projected/38a95dc7-b051-4cb1-986f-7f85876e0168-kube-api-access-v6tpf\") pod \"nova-api-6a0e-account-create-update-v88nc\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.750940 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a95dc7-b051-4cb1-986f-7f85876e0168-operator-scripts\") pod \"nova-api-6a0e-account-create-update-v88nc\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.751646 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a95dc7-b051-4cb1-986f-7f85876e0168-operator-scripts\") pod \"nova-api-6a0e-account-create-update-v88nc\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.752325 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ef70d1-94b3-4941-adda-70d5d62b25a5-operator-scripts\") pod \"nova-cell0-db-create-9glh5\" (UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.772810 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.805089 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8gsc\" (UniqueName: \"kubernetes.io/projected/97ef70d1-94b3-4941-adda-70d5d62b25a5-kube-api-access-x8gsc\") pod \"nova-cell0-db-create-9glh5\" (UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.805208 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tpf\" (UniqueName: \"kubernetes.io/projected/38a95dc7-b051-4cb1-986f-7f85876e0168-kube-api-access-v6tpf\") pod \"nova-api-6a0e-account-create-update-v88nc\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.824196 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-2gnt7"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.825330 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.853145 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2gnt7"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.872842 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ab44-account-create-update-p6cph"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.873862 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.881858 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.883550 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ab44-account-create-update-p6cph"] Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.954477 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91381de6-d500-4d85-b3de-4665b1b2e023-operator-scripts\") pod \"nova-cell1-db-create-2gnt7\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.954970 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5qxf\" (UniqueName: \"kubernetes.io/projected/91381de6-d500-4d85-b3de-4665b1b2e023-kube-api-access-v5qxf\") pod \"nova-cell1-db-create-2gnt7\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.955963 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:47 crc kubenswrapper[4939]: I0318 17:11:47.970570 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.029606 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1773-account-create-update-7b2m6"] Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.031008 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.033253 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.040112 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1773-account-create-update-7b2m6"] Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.056547 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdp9\" (UniqueName: \"kubernetes.io/projected/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-kube-api-access-8mdp9\") pod \"nova-cell0-ab44-account-create-update-p6cph\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.056618 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91381de6-d500-4d85-b3de-4665b1b2e023-operator-scripts\") pod \"nova-cell1-db-create-2gnt7\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.056685 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5qxf\" (UniqueName: \"kubernetes.io/projected/91381de6-d500-4d85-b3de-4665b1b2e023-kube-api-access-v5qxf\") pod \"nova-cell1-db-create-2gnt7\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.056738 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-operator-scripts\") pod \"nova-cell0-ab44-account-create-update-p6cph\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.057389 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91381de6-d500-4d85-b3de-4665b1b2e023-operator-scripts\") pod \"nova-cell1-db-create-2gnt7\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.076561 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5qxf\" (UniqueName: \"kubernetes.io/projected/91381de6-d500-4d85-b3de-4665b1b2e023-kube-api-access-v5qxf\") pod \"nova-cell1-db-create-2gnt7\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.158818 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdp9\" (UniqueName: \"kubernetes.io/projected/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-kube-api-access-8mdp9\") pod \"nova-cell0-ab44-account-create-update-p6cph\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.158864 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b8d3b859-107b-4311-bb6a-74236113699d-operator-scripts\") pod \"nova-cell1-1773-account-create-update-7b2m6\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.158918 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjh77\" (UniqueName: \"kubernetes.io/projected/b8d3b859-107b-4311-bb6a-74236113699d-kube-api-access-zjh77\") pod \"nova-cell1-1773-account-create-update-7b2m6\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.159009 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-operator-scripts\") pod \"nova-cell0-ab44-account-create-update-p6cph\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.159805 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-operator-scripts\") pod \"nova-cell0-ab44-account-create-update-p6cph\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.178397 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mdp9\" (UniqueName: \"kubernetes.io/projected/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-kube-api-access-8mdp9\") pod \"nova-cell0-ab44-account-create-update-p6cph\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.259014 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.260612 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d3b859-107b-4311-bb6a-74236113699d-operator-scripts\") pod \"nova-cell1-1773-account-create-update-7b2m6\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.260737 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjh77\" (UniqueName: \"kubernetes.io/projected/b8d3b859-107b-4311-bb6a-74236113699d-kube-api-access-zjh77\") pod \"nova-cell1-1773-account-create-update-7b2m6\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.262201 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d3b859-107b-4311-bb6a-74236113699d-operator-scripts\") pod \"nova-cell1-1773-account-create-update-7b2m6\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.279270 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.280049 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjh77\" (UniqueName: \"kubernetes.io/projected/b8d3b859-107b-4311-bb6a-74236113699d-kube-api-access-zjh77\") pod \"nova-cell1-1773-account-create-update-7b2m6\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.307325 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wgltd"] Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.356358 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.450195 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wgltd" event={"ID":"dd432bf0-b938-41fe-93d2-58613baa56aa","Type":"ContainerStarted","Data":"e88e018dabed166b109ec84516c57fef0f1476f3eb935e3e7aa8e961ca7d3295"} Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.489376 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6a0e-account-create-update-v88nc"] Mar 18 17:11:48 crc kubenswrapper[4939]: W0318 17:11:48.497647 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38a95dc7_b051_4cb1_986f_7f85876e0168.slice/crio-b894c948cd1093804d2bab205dbe2fd2f129e4de02f3703583b5ad15f84567f2 WatchSource:0}: Error finding container b894c948cd1093804d2bab205dbe2fd2f129e4de02f3703583b5ad15f84567f2: Status 404 returned error can't find the container with id b894c948cd1093804d2bab205dbe2fd2f129e4de02f3703583b5ad15f84567f2 Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.531301 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9glh5"] Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.540095 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2gnt7"] Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.832105 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ab44-account-create-update-p6cph"] Mar 18 17:11:48 crc kubenswrapper[4939]: I0318 17:11:48.989311 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1773-account-create-update-7b2m6"] Mar 18 17:11:48 crc kubenswrapper[4939]: W0318 17:11:48.993796 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d3b859_107b_4311_bb6a_74236113699d.slice/crio-b9834d72ab5eafb39fffd9b35780cf65b63fb567e462ce25c4ff5ba7b05dfd62 WatchSource:0}: Error finding container b9834d72ab5eafb39fffd9b35780cf65b63fb567e462ce25c4ff5ba7b05dfd62: Status 404 returned error can't find the container with id b9834d72ab5eafb39fffd9b35780cf65b63fb567e462ce25c4ff5ba7b05dfd62 Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.459869 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" event={"ID":"b8d3b859-107b-4311-bb6a-74236113699d","Type":"ContainerStarted","Data":"4d988e6cc5115d69e8c98ce8898c4469c663c841f82cf558fbe09b53ccdff6d3"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.460224 4939 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" event={"ID":"b8d3b859-107b-4311-bb6a-74236113699d","Type":"ContainerStarted","Data":"b9834d72ab5eafb39fffd9b35780cf65b63fb567e462ce25c4ff5ba7b05dfd62"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.466168 4939 generic.go:334] "Generic (PLEG): container finished" podID="97ef70d1-94b3-4941-adda-70d5d62b25a5" containerID="df8c8c9e8368314f778878c496e6a0e3fe614565bab2ab15edad28da6bc457f2" exitCode=0 Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.466347 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9glh5" event={"ID":"97ef70d1-94b3-4941-adda-70d5d62b25a5","Type":"ContainerDied","Data":"df8c8c9e8368314f778878c496e6a0e3fe614565bab2ab15edad28da6bc457f2"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.466415 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9glh5" event={"ID":"97ef70d1-94b3-4941-adda-70d5d62b25a5","Type":"ContainerStarted","Data":"b4ca7b2b0e752be299727f5d8e734d6d1e636bf909401f40cf67c853b85715b0"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.469925 4939 generic.go:334] "Generic (PLEG): container finished" podID="38a95dc7-b051-4cb1-986f-7f85876e0168" containerID="79251cf53ec5a26cb34df02d6aaeda87a39ae4ba6a8e28b9654b2674a34bda64" exitCode=0 Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.469991 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6a0e-account-create-update-v88nc" event={"ID":"38a95dc7-b051-4cb1-986f-7f85876e0168","Type":"ContainerDied","Data":"79251cf53ec5a26cb34df02d6aaeda87a39ae4ba6a8e28b9654b2674a34bda64"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.470021 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6a0e-account-create-update-v88nc" event={"ID":"38a95dc7-b051-4cb1-986f-7f85876e0168","Type":"ContainerStarted","Data":"b894c948cd1093804d2bab205dbe2fd2f129e4de02f3703583b5ad15f84567f2"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.475258 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" podStartSLOduration=2.475235707 podStartE2EDuration="2.475235707s" podCreationTimestamp="2026-03-18 17:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:11:49.472603292 +0000 UTC m=+5674.071790933" watchObservedRunningTime="2026-03-18 17:11:49.475235707 +0000 UTC m=+5674.074423348" Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.476177 4939 generic.go:334] "Generic (PLEG): container finished" podID="11ab6999-cfdd-4db2-8c1b-73f3e29d38eb" containerID="294e5fd4db2158939becc2e22eefd5648a11a2c5e3ffbf936c2c399728fe1742" exitCode=0 Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.476262 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ab44-account-create-update-p6cph" event={"ID":"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb","Type":"ContainerDied","Data":"294e5fd4db2158939becc2e22eefd5648a11a2c5e3ffbf936c2c399728fe1742"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.476295 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ab44-account-create-update-p6cph" event={"ID":"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb","Type":"ContainerStarted","Data":"2033324ede36fbd2afceab7717045af24abb67ddbb1ffb7685731afbd01a37a2"} Mar 18 17:11:49 crc kubenswrapper[4939]: 
I0318 17:11:49.478307 4939 generic.go:334] "Generic (PLEG): container finished" podID="dd432bf0-b938-41fe-93d2-58613baa56aa" containerID="3461ac15b77094dc51b83b42b0177e246bfa10279c9f5b9fa849d77057122716" exitCode=0 Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.478535 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wgltd" event={"ID":"dd432bf0-b938-41fe-93d2-58613baa56aa","Type":"ContainerDied","Data":"3461ac15b77094dc51b83b42b0177e246bfa10279c9f5b9fa849d77057122716"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.481087 4939 generic.go:334] "Generic (PLEG): container finished" podID="91381de6-d500-4d85-b3de-4665b1b2e023" containerID="6d3abacc1c70322bfc4090aac1971eb71c8c5ed082d3c8fd1d66a345225707f4" exitCode=0 Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.481126 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2gnt7" event={"ID":"91381de6-d500-4d85-b3de-4665b1b2e023","Type":"ContainerDied","Data":"6d3abacc1c70322bfc4090aac1971eb71c8c5ed082d3c8fd1d66a345225707f4"} Mar 18 17:11:49 crc kubenswrapper[4939]: I0318 17:11:49.481147 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2gnt7" event={"ID":"91381de6-d500-4d85-b3de-4665b1b2e023","Type":"ContainerStarted","Data":"4fec1f126fbe00366e1ab32e8e9ca6edfdaa04d83b726f5eadf0a2793b08398d"} Mar 18 17:11:50 crc kubenswrapper[4939]: I0318 17:11:50.499555 4939 generic.go:334] "Generic (PLEG): container finished" podID="b8d3b859-107b-4311-bb6a-74236113699d" containerID="4d988e6cc5115d69e8c98ce8898c4469c663c841f82cf558fbe09b53ccdff6d3" exitCode=0 Mar 18 17:11:50 crc kubenswrapper[4939]: I0318 17:11:50.499767 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" event={"ID":"b8d3b859-107b-4311-bb6a-74236113699d","Type":"ContainerDied","Data":"4d988e6cc5115d69e8c98ce8898c4469c663c841f82cf558fbe09b53ccdff6d3"} Mar 18 17:11:50 crc kubenswrapper[4939]: I0318 17:11:50.891025 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.032302 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd432bf0-b938-41fe-93d2-58613baa56aa-operator-scripts\") pod \"dd432bf0-b938-41fe-93d2-58613baa56aa\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.032368 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j574z\" (UniqueName: \"kubernetes.io/projected/dd432bf0-b938-41fe-93d2-58613baa56aa-kube-api-access-j574z\") pod \"dd432bf0-b938-41fe-93d2-58613baa56aa\" (UID: \"dd432bf0-b938-41fe-93d2-58613baa56aa\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.034098 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd432bf0-b938-41fe-93d2-58613baa56aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd432bf0-b938-41fe-93d2-58613baa56aa" (UID: "dd432bf0-b938-41fe-93d2-58613baa56aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.040299 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd432bf0-b938-41fe-93d2-58613baa56aa-kube-api-access-j574z" (OuterVolumeSpecName: "kube-api-access-j574z") pod "dd432bf0-b938-41fe-93d2-58613baa56aa" (UID: "dd432bf0-b938-41fe-93d2-58613baa56aa"). InnerVolumeSpecName "kube-api-access-j574z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.087334 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.096380 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.112809 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.121627 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.137736 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd432bf0-b938-41fe-93d2-58613baa56aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.137776 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j574z\" (UniqueName: \"kubernetes.io/projected/dd432bf0-b938-41fe-93d2-58613baa56aa-kube-api-access-j574z\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239048 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-operator-scripts\") pod \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239217 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5qxf\" (UniqueName: \"kubernetes.io/projected/91381de6-d500-4d85-b3de-4665b1b2e023-kube-api-access-v5qxf\") pod \"91381de6-d500-4d85-b3de-4665b1b2e023\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239377 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mdp9\" (UniqueName: \"kubernetes.io/projected/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-kube-api-access-8mdp9\") pod \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\" (UID: \"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239416 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6tpf\" (UniqueName: \"kubernetes.io/projected/38a95dc7-b051-4cb1-986f-7f85876e0168-kube-api-access-v6tpf\") pod \"38a95dc7-b051-4cb1-986f-7f85876e0168\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239462 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/38a95dc7-b051-4cb1-986f-7f85876e0168-operator-scripts\") pod \"38a95dc7-b051-4cb1-986f-7f85876e0168\" (UID: \"38a95dc7-b051-4cb1-986f-7f85876e0168\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239557 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8gsc\" (UniqueName: \"kubernetes.io/projected/97ef70d1-94b3-4941-adda-70d5d62b25a5-kube-api-access-x8gsc\") pod \"97ef70d1-94b3-4941-adda-70d5d62b25a5\" (UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239593 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ef70d1-94b3-4941-adda-70d5d62b25a5-operator-scripts\") pod \"97ef70d1-94b3-4941-adda-70d5d62b25a5\" (UID: \"97ef70d1-94b3-4941-adda-70d5d62b25a5\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239657 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91381de6-d500-4d85-b3de-4665b1b2e023-operator-scripts\") pod \"91381de6-d500-4d85-b3de-4665b1b2e023\" (UID: \"91381de6-d500-4d85-b3de-4665b1b2e023\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.239976 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11ab6999-cfdd-4db2-8c1b-73f3e29d38eb" (UID: "11ab6999-cfdd-4db2-8c1b-73f3e29d38eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.240341 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ef70d1-94b3-4941-adda-70d5d62b25a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97ef70d1-94b3-4941-adda-70d5d62b25a5" (UID: "97ef70d1-94b3-4941-adda-70d5d62b25a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.240845 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91381de6-d500-4d85-b3de-4665b1b2e023-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91381de6-d500-4d85-b3de-4665b1b2e023" (UID: "91381de6-d500-4d85-b3de-4665b1b2e023"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.241747 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ef70d1-94b3-4941-adda-70d5d62b25a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.241794 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91381de6-d500-4d85-b3de-4665b1b2e023-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.241818 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.242720 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a95dc7-b051-4cb1-986f-7f85876e0168-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38a95dc7-b051-4cb1-986f-7f85876e0168" (UID: "38a95dc7-b051-4cb1-986f-7f85876e0168"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.244036 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91381de6-d500-4d85-b3de-4665b1b2e023-kube-api-access-v5qxf" (OuterVolumeSpecName: "kube-api-access-v5qxf") pod "91381de6-d500-4d85-b3de-4665b1b2e023" (UID: "91381de6-d500-4d85-b3de-4665b1b2e023"). InnerVolumeSpecName "kube-api-access-v5qxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.244472 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a95dc7-b051-4cb1-986f-7f85876e0168-kube-api-access-v6tpf" (OuterVolumeSpecName: "kube-api-access-v6tpf") pod "38a95dc7-b051-4cb1-986f-7f85876e0168" (UID: "38a95dc7-b051-4cb1-986f-7f85876e0168"). InnerVolumeSpecName "kube-api-access-v6tpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.245170 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ef70d1-94b3-4941-adda-70d5d62b25a5-kube-api-access-x8gsc" (OuterVolumeSpecName: "kube-api-access-x8gsc") pod "97ef70d1-94b3-4941-adda-70d5d62b25a5" (UID: "97ef70d1-94b3-4941-adda-70d5d62b25a5"). InnerVolumeSpecName "kube-api-access-x8gsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.246530 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-kube-api-access-8mdp9" (OuterVolumeSpecName: "kube-api-access-8mdp9") pod "11ab6999-cfdd-4db2-8c1b-73f3e29d38eb" (UID: "11ab6999-cfdd-4db2-8c1b-73f3e29d38eb"). InnerVolumeSpecName "kube-api-access-8mdp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.343734 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8gsc\" (UniqueName: \"kubernetes.io/projected/97ef70d1-94b3-4941-adda-70d5d62b25a5-kube-api-access-x8gsc\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.343765 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5qxf\" (UniqueName: \"kubernetes.io/projected/91381de6-d500-4d85-b3de-4665b1b2e023-kube-api-access-v5qxf\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.343774 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mdp9\" (UniqueName: \"kubernetes.io/projected/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb-kube-api-access-8mdp9\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.343783 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6tpf\" (UniqueName: \"kubernetes.io/projected/38a95dc7-b051-4cb1-986f-7f85876e0168-kube-api-access-v6tpf\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.343794 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a95dc7-b051-4cb1-986f-7f85876e0168-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.515724 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wgltd" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.515709 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wgltd" event={"ID":"dd432bf0-b938-41fe-93d2-58613baa56aa","Type":"ContainerDied","Data":"e88e018dabed166b109ec84516c57fef0f1476f3eb935e3e7aa8e961ca7d3295"} Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.515950 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e88e018dabed166b109ec84516c57fef0f1476f3eb935e3e7aa8e961ca7d3295" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.518814 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2gnt7" event={"ID":"91381de6-d500-4d85-b3de-4665b1b2e023","Type":"ContainerDied","Data":"4fec1f126fbe00366e1ab32e8e9ca6edfdaa04d83b726f5eadf0a2793b08398d"} Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.518905 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fec1f126fbe00366e1ab32e8e9ca6edfdaa04d83b726f5eadf0a2793b08398d" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.519015 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2gnt7" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.523862 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9glh5" event={"ID":"97ef70d1-94b3-4941-adda-70d5d62b25a5","Type":"ContainerDied","Data":"b4ca7b2b0e752be299727f5d8e734d6d1e636bf909401f40cf67c853b85715b0"} Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.523923 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4ca7b2b0e752be299727f5d8e734d6d1e636bf909401f40cf67c853b85715b0" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.524017 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9glh5" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.535244 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6a0e-account-create-update-v88nc" event={"ID":"38a95dc7-b051-4cb1-986f-7f85876e0168","Type":"ContainerDied","Data":"b894c948cd1093804d2bab205dbe2fd2f129e4de02f3703583b5ad15f84567f2"} Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.535613 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b894c948cd1093804d2bab205dbe2fd2f129e4de02f3703583b5ad15f84567f2" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.535941 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6a0e-account-create-update-v88nc" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.553233 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ab44-account-create-update-p6cph" event={"ID":"11ab6999-cfdd-4db2-8c1b-73f3e29d38eb","Type":"ContainerDied","Data":"2033324ede36fbd2afceab7717045af24abb67ddbb1ffb7685731afbd01a37a2"} Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.553559 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2033324ede36fbd2afceab7717045af24abb67ddbb1ffb7685731afbd01a37a2" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.557482 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ab44-account-create-update-p6cph" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.884414 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.953057 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d3b859-107b-4311-bb6a-74236113699d-operator-scripts\") pod \"b8d3b859-107b-4311-bb6a-74236113699d\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.953225 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjh77\" (UniqueName: \"kubernetes.io/projected/b8d3b859-107b-4311-bb6a-74236113699d-kube-api-access-zjh77\") pod \"b8d3b859-107b-4311-bb6a-74236113699d\" (UID: \"b8d3b859-107b-4311-bb6a-74236113699d\") " Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.954234 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d3b859-107b-4311-bb6a-74236113699d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8d3b859-107b-4311-bb6a-74236113699d" (UID: "b8d3b859-107b-4311-bb6a-74236113699d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:11:51 crc kubenswrapper[4939]: I0318 17:11:51.957268 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d3b859-107b-4311-bb6a-74236113699d-kube-api-access-zjh77" (OuterVolumeSpecName: "kube-api-access-zjh77") pod "b8d3b859-107b-4311-bb6a-74236113699d" (UID: "b8d3b859-107b-4311-bb6a-74236113699d"). InnerVolumeSpecName "kube-api-access-zjh77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:11:52 crc kubenswrapper[4939]: I0318 17:11:52.055883 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8d3b859-107b-4311-bb6a-74236113699d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:52 crc kubenswrapper[4939]: I0318 17:11:52.055958 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjh77\" (UniqueName: \"kubernetes.io/projected/b8d3b859-107b-4311-bb6a-74236113699d-kube-api-access-zjh77\") on node \"crc\" DevicePath \"\"" Mar 18 17:11:52 crc kubenswrapper[4939]: I0318 17:11:52.563409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" event={"ID":"b8d3b859-107b-4311-bb6a-74236113699d","Type":"ContainerDied","Data":"b9834d72ab5eafb39fffd9b35780cf65b63fb567e462ce25c4ff5ba7b05dfd62"} Mar 18 17:11:52 crc kubenswrapper[4939]: I0318 17:11:52.563442 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1773-account-create-update-7b2m6" Mar 18 17:11:52 crc kubenswrapper[4939]: I0318 17:11:52.563448 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9834d72ab5eafb39fffd9b35780cf65b63fb567e462ce25c4ff5ba7b05dfd62" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.215967 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xcwnv"] Mar 18 17:11:53 crc kubenswrapper[4939]: E0318 17:11:53.216837 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd432bf0-b938-41fe-93d2-58613baa56aa" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.216866 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd432bf0-b938-41fe-93d2-58613baa56aa" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: E0318 17:11:53.216882 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ab6999-cfdd-4db2-8c1b-73f3e29d38eb" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.216893 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ab6999-cfdd-4db2-8c1b-73f3e29d38eb" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: E0318 17:11:53.216928 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91381de6-d500-4d85-b3de-4665b1b2e023" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.216939 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="91381de6-d500-4d85-b3de-4665b1b2e023" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: E0318 17:11:53.217004 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ef70d1-94b3-4941-adda-70d5d62b25a5" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217029 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ef70d1-94b3-4941-adda-70d5d62b25a5" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: E0318 17:11:53.217048 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a95dc7-b051-4cb1-986f-7f85876e0168" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217058 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="38a95dc7-b051-4cb1-986f-7f85876e0168" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: E0318 17:11:53.217083 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d3b859-107b-4311-bb6a-74236113699d" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217093 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d3b859-107b-4311-bb6a-74236113699d" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217326 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d3b859-107b-4311-bb6a-74236113699d" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217355 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd432bf0-b938-41fe-93d2-58613baa56aa" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217375 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="91381de6-d500-4d85-b3de-4665b1b2e023" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217392 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ab6999-cfdd-4db2-8c1b-73f3e29d38eb" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217414 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a95dc7-b051-4cb1-986f-7f85876e0168" containerName="mariadb-account-create-update" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.217425 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ef70d1-94b3-4941-adda-70d5d62b25a5" containerName="mariadb-database-create" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.218649 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.221118 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mwdnm" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.230530 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.230825 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.242204 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xcwnv"] Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.377604 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nv49\" (UniqueName: \"kubernetes.io/projected/3d8eac24-43dc-491c-9aed-367fef1ff6fb-kube-api-access-4nv49\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.377653 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.377791 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-config-data\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.378020 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-scripts\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.479213 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nv49\" (UniqueName: \"kubernetes.io/projected/3d8eac24-43dc-491c-9aed-367fef1ff6fb-kube-api-access-4nv49\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.479267 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.479315 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-config-data\") pod \"nova-cell0-conductor-db-sync-xcwnv\" 
(UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.479365 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-scripts\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.485639 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.486162 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-config-data\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.486404 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-scripts\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.500157 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nv49\" (UniqueName: \"kubernetes.io/projected/3d8eac24-43dc-491c-9aed-367fef1ff6fb-kube-api-access-4nv49\") pod \"nova-cell0-conductor-db-sync-xcwnv\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.585135 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.687709 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:11:53 crc kubenswrapper[4939]: I0318 17:11:53.688349 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:11:54 crc kubenswrapper[4939]: I0318 17:11:54.072023 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xcwnv"] Mar 18 17:11:54 crc kubenswrapper[4939]: I0318 17:11:54.585702 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" event={"ID":"3d8eac24-43dc-491c-9aed-367fef1ff6fb","Type":"ContainerStarted","Data":"f0180b7f6dc28fab6e07761df562d3971873b3d9f705f062ae3c63114e47d6b8"} Mar 18 17:11:54 crc kubenswrapper[4939]: I0318 17:11:54.585836 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" event={"ID":"3d8eac24-43dc-491c-9aed-367fef1ff6fb","Type":"ContainerStarted","Data":"57d9354c59a7271a34f1dab27f44432b3dda6c5034da3f425f48a481d7716a48"} Mar 18 17:11:54 crc kubenswrapper[4939]: I0318 17:11:54.606490 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" podStartSLOduration=1.60646835 podStartE2EDuration="1.60646835s" podCreationTimestamp="2026-03-18 17:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:11:54.602628041 +0000 UTC m=+5679.201815672" watchObservedRunningTime="2026-03-18 17:11:54.60646835 +0000 UTC m=+5679.205655991" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.182139 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564232-5qxbt"] Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.197950 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-5qxbt" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.203043 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.203354 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.207088 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-5qxbt"] Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.213410 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.320399 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmbnb\" (UniqueName: \"kubernetes.io/projected/52a2d568-f1e4-4408-addb-9401388c46c1-kube-api-access-jmbnb\") pod \"auto-csr-approver-29564232-5qxbt\" (UID: \"52a2d568-f1e4-4408-addb-9401388c46c1\") " pod="openshift-infra/auto-csr-approver-29564232-5qxbt" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.422482 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmbnb\" (UniqueName: \"kubernetes.io/projected/52a2d568-f1e4-4408-addb-9401388c46c1-kube-api-access-jmbnb\") pod \"auto-csr-approver-29564232-5qxbt\" (UID: \"52a2d568-f1e4-4408-addb-9401388c46c1\") " pod="openshift-infra/auto-csr-approver-29564232-5qxbt" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.443867 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmbnb\" (UniqueName: \"kubernetes.io/projected/52a2d568-f1e4-4408-addb-9401388c46c1-kube-api-access-jmbnb\") pod \"auto-csr-approver-29564232-5qxbt\" (UID: \"52a2d568-f1e4-4408-addb-9401388c46c1\") " pod="openshift-infra/auto-csr-approver-29564232-5qxbt" Mar 18 17:12:00 crc kubenswrapper[4939]: I0318 17:12:00.521324 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-5qxbt" Mar 18 17:12:01 crc kubenswrapper[4939]: I0318 17:12:01.019459 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-5qxbt"] Mar 18 17:12:01 crc kubenswrapper[4939]: I0318 17:12:01.030799 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:12:01 crc kubenswrapper[4939]: I0318 17:12:01.189008 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564232-5qxbt" event={"ID":"52a2d568-f1e4-4408-addb-9401388c46c1","Type":"ContainerStarted","Data":"4f036a6e32f084b9dfa33f5e4b2888c69a4c69139766cd5a03364dfb25fdab96"} Mar 18 17:12:01 crc kubenswrapper[4939]: I0318 17:12:01.192083 4939 generic.go:334] "Generic (PLEG): container finished" podID="3d8eac24-43dc-491c-9aed-367fef1ff6fb" containerID="f0180b7f6dc28fab6e07761df562d3971873b3d9f705f062ae3c63114e47d6b8" exitCode=0 Mar 18 17:12:01 crc kubenswrapper[4939]: I0318 17:12:01.192138 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" event={"ID":"3d8eac24-43dc-491c-9aed-367fef1ff6fb","Type":"ContainerDied","Data":"f0180b7f6dc28fab6e07761df562d3971873b3d9f705f062ae3c63114e47d6b8"} Mar 18 17:12:01 crc kubenswrapper[4939]: I0318 17:12:01.342809 4939 scope.go:117] "RemoveContainer" containerID="15427c09b9481ce0a7cfb24176bb1863c7186e6fa3205aba6701446722978d93" Mar 18 17:12:01 crc kubenswrapper[4939]: I0318 17:12:01.361246 4939 scope.go:117] "RemoveContainer" containerID="7d9143295243a3b3b8ef1369a9fbb9e5c0e209be3a65e4d982bbbbf53c10e644" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.541534 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.666802 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-combined-ca-bundle\") pod \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.666906 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-config-data\") pod \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.666969 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-scripts\") pod \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.667027 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nv49\" (UniqueName: \"kubernetes.io/projected/3d8eac24-43dc-491c-9aed-367fef1ff6fb-kube-api-access-4nv49\") pod \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\" (UID: \"3d8eac24-43dc-491c-9aed-367fef1ff6fb\") " Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.674566 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-scripts" (OuterVolumeSpecName: "scripts") pod "3d8eac24-43dc-491c-9aed-367fef1ff6fb" (UID: "3d8eac24-43dc-491c-9aed-367fef1ff6fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.676742 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8eac24-43dc-491c-9aed-367fef1ff6fb-kube-api-access-4nv49" (OuterVolumeSpecName: "kube-api-access-4nv49") pod "3d8eac24-43dc-491c-9aed-367fef1ff6fb" (UID: "3d8eac24-43dc-491c-9aed-367fef1ff6fb"). InnerVolumeSpecName "kube-api-access-4nv49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.693329 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-config-data" (OuterVolumeSpecName: "config-data") pod "3d8eac24-43dc-491c-9aed-367fef1ff6fb" (UID: "3d8eac24-43dc-491c-9aed-367fef1ff6fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.698945 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d8eac24-43dc-491c-9aed-367fef1ff6fb" (UID: "3d8eac24-43dc-491c-9aed-367fef1ff6fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.769960 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nv49\" (UniqueName: \"kubernetes.io/projected/3d8eac24-43dc-491c-9aed-367fef1ff6fb-kube-api-access-4nv49\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.770018 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.770040 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:02 crc kubenswrapper[4939]: I0318 17:12:02.770060 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8eac24-43dc-491c-9aed-367fef1ff6fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.209705 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" event={"ID":"3d8eac24-43dc-491c-9aed-367fef1ff6fb","Type":"ContainerDied","Data":"57d9354c59a7271a34f1dab27f44432b3dda6c5034da3f425f48a481d7716a48"} Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.210119 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57d9354c59a7271a34f1dab27f44432b3dda6c5034da3f425f48a481d7716a48" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.209732 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xcwnv" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.211554 4939 generic.go:334] "Generic (PLEG): container finished" podID="52a2d568-f1e4-4408-addb-9401388c46c1" containerID="2278590aeb881433ea3bffff1a741927abf083e4ec69f28af73d0f1dc76e99fc" exitCode=0 Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.211601 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564232-5qxbt" event={"ID":"52a2d568-f1e4-4408-addb-9401388c46c1","Type":"ContainerDied","Data":"2278590aeb881433ea3bffff1a741927abf083e4ec69f28af73d0f1dc76e99fc"} Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.300936 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:12:03 crc kubenswrapper[4939]: E0318 17:12:03.301366 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8eac24-43dc-491c-9aed-367fef1ff6fb" containerName="nova-cell0-conductor-db-sync" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.301389 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8eac24-43dc-491c-9aed-367fef1ff6fb" containerName="nova-cell0-conductor-db-sync" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.301644 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d8eac24-43dc-491c-9aed-367fef1ff6fb" containerName="nova-cell0-conductor-db-sync" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.302362 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.304524 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mwdnm" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.305685 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.309535 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.379588 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.380064 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.380152 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrxdh\" (UniqueName: \"kubernetes.io/projected/750d2365-1615-47b7-a95d-178356744f89-kube-api-access-vrxdh\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.481932 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.482053 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.482095 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrxdh\" (UniqueName: \"kubernetes.io/projected/750d2365-1615-47b7-a95d-178356744f89-kube-api-access-vrxdh\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.488006 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.490856 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.498095 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrxdh\" (UniqueName: \"kubernetes.io/projected/750d2365-1615-47b7-a95d-178356744f89-kube-api-access-vrxdh\") pod \"nova-cell0-conductor-0\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:03 crc kubenswrapper[4939]: I0318 17:12:03.647425 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:04 crc kubenswrapper[4939]: I0318 17:12:04.168059 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:12:04 crc kubenswrapper[4939]: I0318 17:12:04.221900 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"750d2365-1615-47b7-a95d-178356744f89","Type":"ContainerStarted","Data":"8fc21185668beb436ce40b90f59d0ad4d27f16ff52a8b9dd03aca6dbb46e6f07"} Mar 18 17:12:04 crc kubenswrapper[4939]: I0318 17:12:04.587680 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-5qxbt" Mar 18 17:12:04 crc kubenswrapper[4939]: I0318 17:12:04.601681 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmbnb\" (UniqueName: \"kubernetes.io/projected/52a2d568-f1e4-4408-addb-9401388c46c1-kube-api-access-jmbnb\") pod \"52a2d568-f1e4-4408-addb-9401388c46c1\" (UID: \"52a2d568-f1e4-4408-addb-9401388c46c1\") " Mar 18 17:12:04 crc kubenswrapper[4939]: I0318 17:12:04.608828 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a2d568-f1e4-4408-addb-9401388c46c1-kube-api-access-jmbnb" (OuterVolumeSpecName: "kube-api-access-jmbnb") pod "52a2d568-f1e4-4408-addb-9401388c46c1" (UID: "52a2d568-f1e4-4408-addb-9401388c46c1"). InnerVolumeSpecName "kube-api-access-jmbnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:04 crc kubenswrapper[4939]: I0318 17:12:04.609136 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmbnb\" (UniqueName: \"kubernetes.io/projected/52a2d568-f1e4-4408-addb-9401388c46c1-kube-api-access-jmbnb\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.240164 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-5qxbt" Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.240159 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564232-5qxbt" event={"ID":"52a2d568-f1e4-4408-addb-9401388c46c1","Type":"ContainerDied","Data":"4f036a6e32f084b9dfa33f5e4b2888c69a4c69139766cd5a03364dfb25fdab96"} Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.240852 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f036a6e32f084b9dfa33f5e4b2888c69a4c69139766cd5a03364dfb25fdab96" Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.254079 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"750d2365-1615-47b7-a95d-178356744f89","Type":"ContainerStarted","Data":"3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77"} Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.255655 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.288491 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.288447472 podStartE2EDuration="2.288447472s" podCreationTimestamp="2026-03-18 17:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:05.283967515 +0000 UTC m=+5689.883155146" watchObservedRunningTime="2026-03-18 17:12:05.288447472 +0000 UTC m=+5689.887635093" Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.658461 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-pfx2h"] Mar 18 17:12:05 crc kubenswrapper[4939]: I0318 17:12:05.666679 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-pfx2h"] Mar 18 17:12:06 crc kubenswrapper[4939]: I0318 17:12:06.152320 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec9676a-ae7b-49f7-9edf-d1d8097ca445" path="/var/lib/kubelet/pods/bec9676a-ae7b-49f7-9edf-d1d8097ca445/volumes" Mar 18 17:12:13 crc kubenswrapper[4939]: I0318 17:12:13.685235 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.161812 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-26vcj"] Mar 18 17:12:14 crc kubenswrapper[4939]: E0318 17:12:14.162192 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a2d568-f1e4-4408-addb-9401388c46c1" containerName="oc" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.162206 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a2d568-f1e4-4408-addb-9401388c46c1" containerName="oc" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.162420 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a2d568-f1e4-4408-addb-9401388c46c1" containerName="oc" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.163520 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.170928 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.171026 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.178841 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-26vcj"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.285059 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-config-data\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.285144 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz5x9\" (UniqueName: \"kubernetes.io/projected/000a84e0-4410-4303-9bea-3ad28e8c9c99-kube-api-access-vz5x9\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.285193 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-scripts\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.285261 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.334646 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.336048 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.342450 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.364366 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.387586 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz5x9\" (UniqueName: \"kubernetes.io/projected/000a84e0-4410-4303-9bea-3ad28e8c9c99-kube-api-access-vz5x9\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.387644 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-scripts\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.387700 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.387790 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-config-data\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.395562 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.396939 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.399355 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.405386 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-config-data\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.429221 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.437523 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.451787 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.452990 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.455398 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.464270 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-scripts\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.465523 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz5x9\" (UniqueName: \"kubernetes.io/projected/000a84e0-4410-4303-9bea-3ad28e8c9c99-kube-api-access-vz5x9\") pod \"nova-cell0-cell-mapping-26vcj\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.518416 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.518496 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-config-data\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.524985 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69brh\" (UniqueName: \"kubernetes.io/projected/14cb6ca6-352c-4817-9c39-86716e05a3ca-kube-api-access-69brh\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " 
pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525077 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525129 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525145 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-config-data\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525330 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-logs\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525351 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m8l4\" (UniqueName: \"kubernetes.io/projected/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-kube-api-access-2m8l4\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525424 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525491 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzs62\" (UniqueName: \"kubernetes.io/projected/c9eee740-5028-41c2-b9ba-1c18218d131e-kube-api-access-mzs62\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.525722 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.530799 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.563222 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.600340 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.600468 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.644268 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.646181 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-logs\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.647429 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-logs\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668440 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m8l4\" (UniqueName: \"kubernetes.io/projected/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-kube-api-access-2m8l4\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668570 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668642 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzs62\" (UniqueName: \"kubernetes.io/projected/c9eee740-5028-41c2-b9ba-1c18218d131e-kube-api-access-mzs62\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668773 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddfd\" (UniqueName: \"kubernetes.io/projected/93ea7509-7241-4691-bb15-1a2d752a8733-kube-api-access-zddfd\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668821 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668843 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ea7509-7241-4691-bb15-1a2d752a8733-logs\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668868 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 
17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668895 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-config-data\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668918 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69brh\" (UniqueName: \"kubernetes.io/projected/14cb6ca6-352c-4817-9c39-86716e05a3ca-kube-api-access-69brh\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668948 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.668994 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.669020 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-config-data\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.669200 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-config-data\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.682805 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-zzdlk"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.683453 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-config-data\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.685476 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.689915 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.690358 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69brh\" (UniqueName: \"kubernetes.io/projected/14cb6ca6-352c-4817-9c39-86716e05a3ca-kube-api-access-69brh\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.691083 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-config-data\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.691940 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.698985 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-zzdlk"] Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.699759 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.699977 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.704453 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzs62\" (UniqueName: \"kubernetes.io/projected/c9eee740-5028-41c2-b9ba-1c18218d131e-kube-api-access-mzs62\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.710182 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m8l4\" (UniqueName: \"kubernetes.io/projected/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-kube-api-access-2m8l4\") pod \"nova-api-0\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " pod="openstack/nova-api-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.770802 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-dns-svc\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771345 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-config\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 
17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771371 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771416 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zddfd\" (UniqueName: \"kubernetes.io/projected/93ea7509-7241-4691-bb15-1a2d752a8733-kube-api-access-zddfd\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771452 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771479 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ea7509-7241-4691-bb15-1a2d752a8733-logs\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771574 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771600 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm88m\" (UniqueName: \"kubernetes.io/projected/1472528d-7a5a-497d-9413-6692a3f01ccd-kube-api-access-mm88m\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.771680 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-config-data\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.772793 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ea7509-7241-4691-bb15-1a2d752a8733-logs\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.778748 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.778752 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-config-data\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.791171 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddfd\" (UniqueName: \"kubernetes.io/projected/93ea7509-7241-4691-bb15-1a2d752a8733-kube-api-access-zddfd\") pod \"nova-metadata-0\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " pod="openstack/nova-metadata-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.862190 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.873232 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.873285 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm88m\" (UniqueName: \"kubernetes.io/projected/1472528d-7a5a-497d-9413-6692a3f01ccd-kube-api-access-mm88m\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.873419 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-dns-svc\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.873449 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-config\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.873472 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.874928 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-config\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.874959 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.875152 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-dns-svc\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.876821 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.893309 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm88m\" (UniqueName: \"kubernetes.io/projected/1472528d-7a5a-497d-9413-6692a3f01ccd-kube-api-access-mm88m\") pod \"dnsmasq-dns-64bb9495b5-zzdlk\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.900335 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:14 crc kubenswrapper[4939]: I0318 17:12:14.976830 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.080617 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.092201 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.118853 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-26vcj"] Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.244725 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5cpf4"] Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.245914 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.295336 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.295626 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.323709 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5cpf4"] Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.372193 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-26vcj" event={"ID":"000a84e0-4410-4303-9bea-3ad28e8c9c99","Type":"ContainerStarted","Data":"9d59f4e2a2b6b4468db769aa1ff6562d9719b7e3d23c209ccbb498bca631f742"} Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.400317 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crdxn\" (UniqueName: \"kubernetes.io/projected/1195c909-085a-458c-8366-46ef24721def-kube-api-access-crdxn\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.400426 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-scripts\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.400461 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-config-data\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.401068 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.410977 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.470836 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.513370 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-scripts\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.513442 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-config-data\") pod 
\"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.513590 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.513720 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crdxn\" (UniqueName: \"kubernetes.io/projected/1195c909-085a-458c-8366-46ef24721def-kube-api-access-crdxn\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.518301 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-scripts\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.518663 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-config-data\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.522513 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.537417 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crdxn\" (UniqueName: \"kubernetes.io/projected/1195c909-085a-458c-8366-46ef24721def-kube-api-access-crdxn\") pod \"nova-cell1-conductor-db-sync-5cpf4\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.618202 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.666752 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.758081 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-zzdlk"] Mar 18 17:12:15 crc kubenswrapper[4939]: W0318 17:12:15.800118 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1472528d_7a5a_497d_9413_6692a3f01ccd.slice/crio-fc5c55c12606b96cb4d29bfc0f19860c06acbadd1ed5b811024ed9b176ac76cb WatchSource:0}: Error finding container fc5c55c12606b96cb4d29bfc0f19860c06acbadd1ed5b811024ed9b176ac76cb: Status 404 returned error can't find the container with id fc5c55c12606b96cb4d29bfc0f19860c06acbadd1ed5b811024ed9b176ac76cb Mar 18 17:12:15 crc kubenswrapper[4939]: I0318 17:12:15.806378 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:15 crc kubenswrapper[4939]: W0318 17:12:15.810673 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ea7509_7241_4691_bb15_1a2d752a8733.slice/crio-635c9328e782b03dd196d2910c6a7292464c80763a673936fc22f472fa267dca WatchSource:0}: Error finding container 635c9328e782b03dd196d2910c6a7292464c80763a673936fc22f472fa267dca: Status 404 returned error can't find the container with id 635c9328e782b03dd196d2910c6a7292464c80763a673936fc22f472fa267dca Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.286401 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5cpf4"] Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.408470 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-26vcj" event={"ID":"000a84e0-4410-4303-9bea-3ad28e8c9c99","Type":"ContainerStarted","Data":"a336e182c56f4324ff226b86a44102fbe74de992b38d27033ef017d86893abc1"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.435131 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" event={"ID":"1472528d-7a5a-497d-9413-6692a3f01ccd","Type":"ContainerStarted","Data":"c6cdc70a5722a77636708b3dbb6f7b1925aaa5fb363d13a963133f9a7039494e"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.435425 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" event={"ID":"1472528d-7a5a-497d-9413-6692a3f01ccd","Type":"ContainerStarted","Data":"fc5c55c12606b96cb4d29bfc0f19860c06acbadd1ed5b811024ed9b176ac76cb"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.450448 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66d1eb23-ba89-41ba-a26d-a821fb7cd21b","Type":"ContainerStarted","Data":"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.450492 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66d1eb23-ba89-41ba-a26d-a821fb7cd21b","Type":"ContainerStarted","Data":"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.450515 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66d1eb23-ba89-41ba-a26d-a821fb7cd21b","Type":"ContainerStarted","Data":"143d661c5d2cd2b0a47876f20e3af96c73db6369c6ba86ee44acb293aabf3f1e"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.458568 4939 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14cb6ca6-352c-4817-9c39-86716e05a3ca","Type":"ContainerStarted","Data":"9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.458617 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14cb6ca6-352c-4817-9c39-86716e05a3ca","Type":"ContainerStarted","Data":"8c72086b7c12dd5c8f1981f46b7b83f0994b0f2d8976e42c6ef38fe2526c6e69"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.470278 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c9eee740-5028-41c2-b9ba-1c18218d131e","Type":"ContainerStarted","Data":"e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.470326 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c9eee740-5028-41c2-b9ba-1c18218d131e","Type":"ContainerStarted","Data":"e113c2fe754a54107b05513dae8c8ac8bcf397d92c09bce8f42ee7465b30500c"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.484018 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ea7509-7241-4691-bb15-1a2d752a8733","Type":"ContainerStarted","Data":"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.484081 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ea7509-7241-4691-bb15-1a2d752a8733","Type":"ContainerStarted","Data":"635c9328e782b03dd196d2910c6a7292464c80763a673936fc22f472fa267dca"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.498359 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" event={"ID":"1195c909-085a-458c-8366-46ef24721def","Type":"ContainerStarted","Data":"27c5c609ed420cec2a99e6954c797304f7b77564dff399afe5a2917158eb0086"} Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.561930 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-26vcj" podStartSLOduration=2.561909249 podStartE2EDuration="2.561909249s" podCreationTimestamp="2026-03-18 17:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:16.554687314 +0000 UTC m=+5701.153874935" watchObservedRunningTime="2026-03-18 17:12:16.561909249 +0000 UTC m=+5701.161096870" Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.590185 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5901624610000002 podStartE2EDuration="2.590162461s" podCreationTimestamp="2026-03-18 17:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:16.578718306 +0000 UTC m=+5701.177905937" watchObservedRunningTime="2026-03-18 17:12:16.590162461 +0000 UTC m=+5701.189350082" Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.630115 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.630093984 podStartE2EDuration="2.630093984s" podCreationTimestamp="2026-03-18 17:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:16.603109628 +0000 UTC m=+5701.202297249" watchObservedRunningTime="2026-03-18 17:12:16.630093984 +0000 UTC m=+5701.229281605" Mar 18 17:12:16 crc kubenswrapper[4939]: I0318 17:12:16.649240 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.649221867 podStartE2EDuration="2.649221867s" podCreationTimestamp="2026-03-18 17:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:16.640347775 +0000 UTC m=+5701.239535396" watchObservedRunningTime="2026-03-18 17:12:16.649221867 +0000 UTC m=+5701.248409478" Mar 18 17:12:17 crc kubenswrapper[4939]: I0318 17:12:17.508196 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" event={"ID":"1195c909-085a-458c-8366-46ef24721def","Type":"ContainerStarted","Data":"f9dcf6d8169612a82e1c310e93b99decd9b0cac4754a45dc9383ff633825186e"} Mar 18 17:12:17 crc kubenswrapper[4939]: I0318 17:12:17.511485 4939 generic.go:334] "Generic (PLEG): container finished" podID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerID="c6cdc70a5722a77636708b3dbb6f7b1925aaa5fb363d13a963133f9a7039494e" exitCode=0 Mar 18 17:12:17 crc kubenswrapper[4939]: I0318 17:12:17.511592 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" event={"ID":"1472528d-7a5a-497d-9413-6692a3f01ccd","Type":"ContainerDied","Data":"c6cdc70a5722a77636708b3dbb6f7b1925aaa5fb363d13a963133f9a7039494e"} Mar 18 17:12:17 crc kubenswrapper[4939]: I0318 17:12:17.514575 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ea7509-7241-4691-bb15-1a2d752a8733","Type":"ContainerStarted","Data":"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050"} Mar 18 17:12:17 crc kubenswrapper[4939]: I0318 17:12:17.533216 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" podStartSLOduration=2.533195752 podStartE2EDuration="2.533195752s" podCreationTimestamp="2026-03-18 17:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:17.529757495 +0000 UTC m=+5702.128945126" watchObservedRunningTime="2026-03-18 17:12:17.533195752 +0000 UTC m=+5702.132383373" Mar 18 17:12:17 crc kubenswrapper[4939]: I0318 17:12:17.553319 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.553300783 podStartE2EDuration="3.553300783s" podCreationTimestamp="2026-03-18 17:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:17.549347231 +0000 UTC m=+5702.148534852" watchObservedRunningTime="2026-03-18 17:12:17.553300783 +0000 UTC m=+5702.152488404" Mar 18 17:12:18 crc kubenswrapper[4939]: I0318 17:12:18.525718 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" event={"ID":"1472528d-7a5a-497d-9413-6692a3f01ccd","Type":"ContainerStarted","Data":"d298f32db5b93b53a10bbc4f67e3b805a3468d18c812b237dc30b4d7d53ff443"} Mar 18 17:12:18 crc kubenswrapper[4939]: I0318 17:12:18.526882 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:18 crc kubenswrapper[4939]: I0318 17:12:18.553865 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" podStartSLOduration=4.553839406 podStartE2EDuration="4.553839406s" podCreationTimestamp="2026-03-18 17:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:18.544317746 +0000 UTC m=+5703.143505367" watchObservedRunningTime="2026-03-18 17:12:18.553839406 +0000 UTC m=+5703.153027027" Mar 18 17:12:19 crc kubenswrapper[4939]: I0318 17:12:19.539371 4939 generic.go:334] "Generic (PLEG): container finished" podID="1195c909-085a-458c-8366-46ef24721def" containerID="f9dcf6d8169612a82e1c310e93b99decd9b0cac4754a45dc9383ff633825186e" exitCode=0 Mar 18 17:12:19 crc kubenswrapper[4939]: I0318 17:12:19.539483 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" event={"ID":"1195c909-085a-458c-8366-46ef24721def","Type":"ContainerDied","Data":"f9dcf6d8169612a82e1c310e93b99decd9b0cac4754a45dc9383ff633825186e"} Mar 18 17:12:19 crc kubenswrapper[4939]: I0318 17:12:19.863030 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 17:12:19 crc kubenswrapper[4939]: I0318 17:12:19.901360 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:20 crc kubenswrapper[4939]: I0318 17:12:20.553767 4939 generic.go:334] "Generic (PLEG): container finished" podID="000a84e0-4410-4303-9bea-3ad28e8c9c99" containerID="a336e182c56f4324ff226b86a44102fbe74de992b38d27033ef017d86893abc1" exitCode=0 Mar 18 17:12:20 crc kubenswrapper[4939]: I0318 17:12:20.553790 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-26vcj" event={"ID":"000a84e0-4410-4303-9bea-3ad28e8c9c99","Type":"ContainerDied","Data":"a336e182c56f4324ff226b86a44102fbe74de992b38d27033ef017d86893abc1"} Mar 18 17:12:20 crc kubenswrapper[4939]: I0318 17:12:20.972853 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.093234 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-config-data\") pod \"1195c909-085a-458c-8366-46ef24721def\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.093334 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crdxn\" (UniqueName: \"kubernetes.io/projected/1195c909-085a-458c-8366-46ef24721def-kube-api-access-crdxn\") pod \"1195c909-085a-458c-8366-46ef24721def\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.093363 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-combined-ca-bundle\") pod \"1195c909-085a-458c-8366-46ef24721def\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.093475 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-scripts\") pod \"1195c909-085a-458c-8366-46ef24721def\" (UID: \"1195c909-085a-458c-8366-46ef24721def\") " Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.099764 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1195c909-085a-458c-8366-46ef24721def-kube-api-access-crdxn" (OuterVolumeSpecName: "kube-api-access-crdxn") pod "1195c909-085a-458c-8366-46ef24721def" (UID: "1195c909-085a-458c-8366-46ef24721def"). InnerVolumeSpecName "kube-api-access-crdxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.100846 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-scripts" (OuterVolumeSpecName: "scripts") pod "1195c909-085a-458c-8366-46ef24721def" (UID: "1195c909-085a-458c-8366-46ef24721def"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.118362 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1195c909-085a-458c-8366-46ef24721def" (UID: "1195c909-085a-458c-8366-46ef24721def"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.119691 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-config-data" (OuterVolumeSpecName: "config-data") pod "1195c909-085a-458c-8366-46ef24721def" (UID: "1195c909-085a-458c-8366-46ef24721def"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.196130 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crdxn\" (UniqueName: \"kubernetes.io/projected/1195c909-085a-458c-8366-46ef24721def-kube-api-access-crdxn\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.196171 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.196185 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.196196 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1195c909-085a-458c-8366-46ef24721def-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.565533 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" event={"ID":"1195c909-085a-458c-8366-46ef24721def","Type":"ContainerDied","Data":"27c5c609ed420cec2a99e6954c797304f7b77564dff399afe5a2917158eb0086"} Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.565605 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c5c609ed420cec2a99e6954c797304f7b77564dff399afe5a2917158eb0086" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.565707 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5cpf4" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.656955 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:12:21 crc kubenswrapper[4939]: E0318 17:12:21.659132 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1195c909-085a-458c-8366-46ef24721def" containerName="nova-cell1-conductor-db-sync" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.659158 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1195c909-085a-458c-8366-46ef24721def" containerName="nova-cell1-conductor-db-sync" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.659391 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1195c909-085a-458c-8366-46ef24721def" containerName="nova-cell1-conductor-db-sync" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.660026 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.664085 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.671806 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.806348 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wd8\" (UniqueName: \"kubernetes.io/projected/69109a3f-f7d3-48db-a867-813bc5f6929d-kube-api-access-72wd8\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.806435 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.806461 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.908483 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wd8\" (UniqueName: \"kubernetes.io/projected/69109a3f-f7d3-48db-a867-813bc5f6929d-kube-api-access-72wd8\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.908949 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.908998 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.913671 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.914240 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.929771 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wd8\" (UniqueName: \"kubernetes.io/projected/69109a3f-f7d3-48db-a867-813bc5f6929d-kube-api-access-72wd8\") pod \"nova-cell1-conductor-0\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.987842 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:21 crc kubenswrapper[4939]: I0318 17:12:21.992476 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.011533 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-scripts\") pod \"000a84e0-4410-4303-9bea-3ad28e8c9c99\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.011581 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-combined-ca-bundle\") pod \"000a84e0-4410-4303-9bea-3ad28e8c9c99\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.011632 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-config-data\") pod \"000a84e0-4410-4303-9bea-3ad28e8c9c99\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.011701 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz5x9\" (UniqueName: \"kubernetes.io/projected/000a84e0-4410-4303-9bea-3ad28e8c9c99-kube-api-access-vz5x9\") pod \"000a84e0-4410-4303-9bea-3ad28e8c9c99\" (UID: \"000a84e0-4410-4303-9bea-3ad28e8c9c99\") " Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.062295 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-scripts" (OuterVolumeSpecName: "scripts") pod "000a84e0-4410-4303-9bea-3ad28e8c9c99" (UID: "000a84e0-4410-4303-9bea-3ad28e8c9c99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.062447 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000a84e0-4410-4303-9bea-3ad28e8c9c99-kube-api-access-vz5x9" (OuterVolumeSpecName: "kube-api-access-vz5x9") pod "000a84e0-4410-4303-9bea-3ad28e8c9c99" (UID: "000a84e0-4410-4303-9bea-3ad28e8c9c99"). InnerVolumeSpecName "kube-api-access-vz5x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.068001 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "000a84e0-4410-4303-9bea-3ad28e8c9c99" (UID: "000a84e0-4410-4303-9bea-3ad28e8c9c99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.097178 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-config-data" (OuterVolumeSpecName: "config-data") pod "000a84e0-4410-4303-9bea-3ad28e8c9c99" (UID: "000a84e0-4410-4303-9bea-3ad28e8c9c99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.114261 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz5x9\" (UniqueName: \"kubernetes.io/projected/000a84e0-4410-4303-9bea-3ad28e8c9c99-kube-api-access-vz5x9\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.114313 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.114327 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.114337 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a84e0-4410-4303-9bea-3ad28e8c9c99-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.469086 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.576691 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-26vcj" event={"ID":"000a84e0-4410-4303-9bea-3ad28e8c9c99","Type":"ContainerDied","Data":"9d59f4e2a2b6b4468db769aa1ff6562d9719b7e3d23c209ccbb498bca631f742"} Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.576736 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d59f4e2a2b6b4468db769aa1ff6562d9719b7e3d23c209ccbb498bca631f742" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.576736 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-26vcj" Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.578043 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69109a3f-f7d3-48db-a867-813bc5f6929d","Type":"ContainerStarted","Data":"cc24155d4be65771578682e48f633296813ab9f9e255709d5b978d4135ec28b9"} Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.784007 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.784352 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="14cb6ca6-352c-4817-9c39-86716e05a3ca" containerName="nova-scheduler-scheduler" containerID="cri-o://9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658" gracePeriod=30 Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.797967 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.798298 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-log" containerID="cri-o://9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3" gracePeriod=30 Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.798379 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-api" containerID="cri-o://ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487" gracePeriod=30 Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.808706 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.809123 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-log" containerID="cri-o://74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88" gracePeriod=30 Mar 18 17:12:22 crc kubenswrapper[4939]: I0318 17:12:22.809220 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-metadata" containerID="cri-o://48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050" gracePeriod=30 Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.426572 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.437023 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zddfd\" (UniqueName: \"kubernetes.io/projected/93ea7509-7241-4691-bb15-1a2d752a8733-kube-api-access-zddfd\") pod \"93ea7509-7241-4691-bb15-1a2d752a8733\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.437074 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ea7509-7241-4691-bb15-1a2d752a8733-logs\") pod \"93ea7509-7241-4691-bb15-1a2d752a8733\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.437114 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-config-data\") pod \"93ea7509-7241-4691-bb15-1a2d752a8733\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.440546 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.441609 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ea7509-7241-4691-bb15-1a2d752a8733-logs" (OuterVolumeSpecName: "logs") pod "93ea7509-7241-4691-bb15-1a2d752a8733" (UID: "93ea7509-7241-4691-bb15-1a2d752a8733"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.442025 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ea7509-7241-4691-bb15-1a2d752a8733-kube-api-access-zddfd" (OuterVolumeSpecName: "kube-api-access-zddfd") pod "93ea7509-7241-4691-bb15-1a2d752a8733" (UID: "93ea7509-7241-4691-bb15-1a2d752a8733"). InnerVolumeSpecName "kube-api-access-zddfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.480575 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-config-data" (OuterVolumeSpecName: "config-data") pod "93ea7509-7241-4691-bb15-1a2d752a8733" (UID: "93ea7509-7241-4691-bb15-1a2d752a8733"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.538099 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-combined-ca-bundle\") pod \"93ea7509-7241-4691-bb15-1a2d752a8733\" (UID: \"93ea7509-7241-4691-bb15-1a2d752a8733\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.538499 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zddfd\" (UniqueName: \"kubernetes.io/projected/93ea7509-7241-4691-bb15-1a2d752a8733-kube-api-access-zddfd\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.538527 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ea7509-7241-4691-bb15-1a2d752a8733-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.538537 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.559770 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93ea7509-7241-4691-bb15-1a2d752a8733" (UID: "93ea7509-7241-4691-bb15-1a2d752a8733"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.588875 4939 generic.go:334] "Generic (PLEG): container finished" podID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerID="ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487" exitCode=0 Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.588912 4939 generic.go:334] "Generic (PLEG): container finished" podID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerID="9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3" exitCode=143 Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.588927 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.588926 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66d1eb23-ba89-41ba-a26d-a821fb7cd21b","Type":"ContainerDied","Data":"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487"} Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.589045 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66d1eb23-ba89-41ba-a26d-a821fb7cd21b","Type":"ContainerDied","Data":"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3"} Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.589055 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66d1eb23-ba89-41ba-a26d-a821fb7cd21b","Type":"ContainerDied","Data":"143d661c5d2cd2b0a47876f20e3af96c73db6369c6ba86ee44acb293aabf3f1e"} Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.589070 4939 scope.go:117] "RemoveContainer" containerID="ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.590707 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69109a3f-f7d3-48db-a867-813bc5f6929d","Type":"ContainerStarted","Data":"31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3"} Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.590736 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.593514 4939 generic.go:334] "Generic (PLEG): container finished" podID="93ea7509-7241-4691-bb15-1a2d752a8733" containerID="48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050" exitCode=0 Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.593537 4939 generic.go:334] "Generic (PLEG): container finished" podID="93ea7509-7241-4691-bb15-1a2d752a8733" containerID="74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88" exitCode=143 Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.593549 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ea7509-7241-4691-bb15-1a2d752a8733","Type":"ContainerDied","Data":"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050"} Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.593584 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ea7509-7241-4691-bb15-1a2d752a8733","Type":"ContainerDied","Data":"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88"} Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.593599 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ea7509-7241-4691-bb15-1a2d752a8733","Type":"ContainerDied","Data":"635c9328e782b03dd196d2910c6a7292464c80763a673936fc22f472fa267dca"} Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.595764 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.609746 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.609729751 podStartE2EDuration="2.609729751s" podCreationTimestamp="2026-03-18 17:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:23.605838371 +0000 UTC m=+5708.205025982" watchObservedRunningTime="2026-03-18 17:12:23.609729751 +0000 UTC m=+5708.208917372" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.621759 4939 scope.go:117] "RemoveContainer" containerID="9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.639563 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-config-data\") pod \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.639625 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m8l4\" (UniqueName: \"kubernetes.io/projected/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-kube-api-access-2m8l4\") pod \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.639805 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-logs\") pod \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.640197 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-logs" (OuterVolumeSpecName: "logs") pod "66d1eb23-ba89-41ba-a26d-a821fb7cd21b" (UID: "66d1eb23-ba89-41ba-a26d-a821fb7cd21b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.640421 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-combined-ca-bundle\") pod \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\" (UID: \"66d1eb23-ba89-41ba-a26d-a821fb7cd21b\") " Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.640969 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ea7509-7241-4691-bb15-1a2d752a8733-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.640999 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.659673 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-kube-api-access-2m8l4" (OuterVolumeSpecName: "kube-api-access-2m8l4") pod "66d1eb23-ba89-41ba-a26d-a821fb7cd21b" (UID: "66d1eb23-ba89-41ba-a26d-a821fb7cd21b"). InnerVolumeSpecName "kube-api-access-2m8l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.664105 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.674012 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.685082 4939 scope.go:117] "RemoveContainer" containerID="ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487" Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.685756 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487\": container with ID starting with ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487 not found: ID does not exist" containerID="ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.685786 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487"} err="failed to get container status \"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487\": rpc error: code = NotFound desc = could not find container \"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487\": container with ID starting with ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487 not found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.685806 4939 scope.go:117] "RemoveContainer" containerID="9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.686725 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.686758 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.686772 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3\": container with ID starting with 9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3 not found: ID does not exist" containerID="9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.686804 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3"} err="failed to get container status \"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3\": rpc error: code = NotFound desc = could not find container \"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3\": container with ID starting with 9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3 not 
found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.686823 4939 scope.go:117] "RemoveContainer" containerID="ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.688832 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487"} err="failed to get container status \"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487\": rpc error: code = NotFound desc = could not find container \"ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487\": container with ID starting with ffec3fc573122e7766783239dc5435de042822121bf741534b3edb847728a487 not found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.688864 4939 scope.go:117] "RemoveContainer" containerID="9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.689072 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-config-data" (OuterVolumeSpecName: "config-data") pod "66d1eb23-ba89-41ba-a26d-a821fb7cd21b" (UID: "66d1eb23-ba89-41ba-a26d-a821fb7cd21b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.690795 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3"} err="failed to get container status \"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3\": rpc error: code = NotFound desc = could not find container \"9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3\": container with ID starting with 9e461a0acaa4ead5fec1e6d22f0d469b5b40b89ee07a885e7dfa2eb73b2408d3 not found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.690844 4939 scope.go:117] "RemoveContainer" containerID="48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.698091 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.698700 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000a84e0-4410-4303-9bea-3ad28e8c9c99" containerName="nova-manage" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.698728 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="000a84e0-4410-4303-9bea-3ad28e8c9c99" containerName="nova-manage" Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.698748 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-metadata" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.698757 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-metadata" Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.698771 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-log" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.698778 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-log" Mar 18 17:12:23 crc 
kubenswrapper[4939]: E0318 17:12:23.698802 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-api" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.698810 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-api" Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.698827 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-log" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.698836 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-log" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.699060 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-log" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.699082 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-api" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.699095 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="000a84e0-4410-4303-9bea-3ad28e8c9c99" containerName="nova-manage" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.699112 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" containerName="nova-metadata-metadata" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.699128 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" containerName="nova-api-log" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.700279 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.702544 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66d1eb23-ba89-41ba-a26d-a821fb7cd21b" (UID: "66d1eb23-ba89-41ba-a26d-a821fb7cd21b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.703470 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.709201 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.710816 4939 scope.go:117] "RemoveContainer" containerID="74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.733048 4939 scope.go:117] "RemoveContainer" containerID="48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050" Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.733474 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050\": container with ID starting with 48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050 not found: ID does not exist" containerID="48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.733518 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050"} err="failed to get container status \"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050\": rpc error: code = NotFound desc = could not find container \"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050\": container with ID starting with 48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050 not found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.733539 4939 scope.go:117] "RemoveContainer" containerID="74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88" Mar 18 17:12:23 crc kubenswrapper[4939]: E0318 17:12:23.733879 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88\": container with ID starting with 74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88 not found: ID does not exist" containerID="74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.733915 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88"} err="failed to get container status \"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88\": rpc error: code = NotFound desc = could not find container \"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88\": container with ID starting with 74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88 not found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.733929 4939 scope.go:117] "RemoveContainer" containerID="48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.734187 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050"} err="failed to get container status \"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050\": rpc error: 
code = NotFound desc = could not find container \"48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050\": container with ID starting with 48eeef735001345da539297b8c9446bdddc520dd4b27db8ff9adc5799a585050 not found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.734205 4939 scope.go:117] "RemoveContainer" containerID="74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.734467 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88"} err="failed to get container status \"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88\": rpc error: code = NotFound desc = could not find container \"74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88\": container with ID starting with 74e40373a0b842e5333a61064806e057dc0beb0f23a092709fc358a037bdeb88 not found: ID does not exist" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.742434 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-config-data\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.742550 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.742612 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ca19bf-693e-4435-b356-0d094d179f91-logs\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.742693 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4vqg\" (UniqueName: \"kubernetes.io/projected/53ca19bf-693e-4435-b356-0d094d179f91-kube-api-access-w4vqg\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.742839 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.742861 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.742872 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m8l4\" (UniqueName: \"kubernetes.io/projected/66d1eb23-ba89-41ba-a26d-a821fb7cd21b-kube-api-access-2m8l4\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.843954 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4vqg\" (UniqueName: 
\"kubernetes.io/projected/53ca19bf-693e-4435-b356-0d094d179f91-kube-api-access-w4vqg\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.844082 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-config-data\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.844136 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.844185 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ca19bf-693e-4435-b356-0d094d179f91-logs\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.844655 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ca19bf-693e-4435-b356-0d094d179f91-logs\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.847731 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.848354 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-config-data\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.862188 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4vqg\" (UniqueName: \"kubernetes.io/projected/53ca19bf-693e-4435-b356-0d094d179f91-kube-api-access-w4vqg\") pod \"nova-metadata-0\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " pod="openstack/nova-metadata-0" Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.984433 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:23 crc kubenswrapper[4939]: I0318 17:12:23.995278 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.010110 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.012593 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.018631 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.025344 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.028148 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.145476 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d1eb23-ba89-41ba-a26d-a821fb7cd21b" path="/var/lib/kubelet/pods/66d1eb23-ba89-41ba-a26d-a821fb7cd21b/volumes" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.146857 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ea7509-7241-4691-bb15-1a2d752a8733" path="/var/lib/kubelet/pods/93ea7509-7241-4691-bb15-1a2d752a8733/volumes" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.150196 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf92730-058c-4d87-a615-e6cd554f88ca-logs\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.150276 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5grt6\" (UniqueName: \"kubernetes.io/projected/7cf92730-058c-4d87-a615-e6cd554f88ca-kube-api-access-5grt6\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.150342 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-config-data\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.150365 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.252274 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf92730-058c-4d87-a615-e6cd554f88ca-logs\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.252332 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5grt6\" (UniqueName: \"kubernetes.io/projected/7cf92730-058c-4d87-a615-e6cd554f88ca-kube-api-access-5grt6\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.252398 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-config-data\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " 
pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.252427 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.252770 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf92730-058c-4d87-a615-e6cd554f88ca-logs\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.257480 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.258302 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-config-data\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.269242 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5grt6\" (UniqueName: \"kubernetes.io/projected/7cf92730-058c-4d87-a615-e6cd554f88ca-kube-api-access-5grt6\") pod \"nova-api-0\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.328741 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.451489 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:24 crc kubenswrapper[4939]: W0318 17:12:24.453714 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53ca19bf_693e_4435_b356_0d094d179f91.slice/crio-8e9385558e4324ac4eb3c30d45680b8713e17fe9c7db7c7173f9126facbe3913 WatchSource:0}: Error finding container 8e9385558e4324ac4eb3c30d45680b8713e17fe9c7db7c7173f9126facbe3913: Status 404 returned error can't find the container with id 8e9385558e4324ac4eb3c30d45680b8713e17fe9c7db7c7173f9126facbe3913 Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.605859 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53ca19bf-693e-4435-b356-0d094d179f91","Type":"ContainerStarted","Data":"8e9385558e4324ac4eb3c30d45680b8713e17fe9c7db7c7173f9126facbe3913"} Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.735660 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:24 crc kubenswrapper[4939]: W0318 17:12:24.739125 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cf92730_058c_4d87_a615_e6cd554f88ca.slice/crio-b68d5acfe1df63d9b96fa972868d0f54c2525fabee2848d583bbe7c6231a3f63 WatchSource:0}: Error finding container b68d5acfe1df63d9b96fa972868d0f54c2525fabee2848d583bbe7c6231a3f63: Status 404 returned error can't find the container with id b68d5acfe1df63d9b96fa972868d0f54c2525fabee2848d583bbe7c6231a3f63 Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.901329 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:24 crc kubenswrapper[4939]: I0318 17:12:24.913466 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.094445 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.148482 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-wlhm9"] Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.149049 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" podUID="da8231f7-455b-4a7f-98bb-2cde842a5332" containerName="dnsmasq-dns" containerID="cri-o://32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134" gracePeriod=10 Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.624615 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf92730-058c-4d87-a615-e6cd554f88ca","Type":"ContainerStarted","Data":"b9624f9d49d281d16dae1ce675043303b4fa0dd0773220e5d20a515eb36aaf9a"} Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.624674 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf92730-058c-4d87-a615-e6cd554f88ca","Type":"ContainerStarted","Data":"a715d137a31b817114edd3944436bdcbf6bf8adbd01e1e7b0816b22f4c4c0081"} Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.624700 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7cf92730-058c-4d87-a615-e6cd554f88ca","Type":"ContainerStarted","Data":"b68d5acfe1df63d9b96fa972868d0f54c2525fabee2848d583bbe7c6231a3f63"} Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.633663 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53ca19bf-693e-4435-b356-0d094d179f91","Type":"ContainerStarted","Data":"9e62d0fd4c5ea27e4b9cc2450588900212ef6a97320fbdd6e3a90b3c4ab43f1a"} Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.633718 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53ca19bf-693e-4435-b356-0d094d179f91","Type":"ContainerStarted","Data":"1baa94798381419bbad43b444e9c76882d3b6f9a89c5caa688c81513fec773e7"} Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.636569 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.639084 4939 generic.go:334] "Generic (PLEG): container finished" podID="da8231f7-455b-4a7f-98bb-2cde842a5332" containerID="32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134" exitCode=0 Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.639176 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" event={"ID":"da8231f7-455b-4a7f-98bb-2cde842a5332","Type":"ContainerDied","Data":"32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134"} Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.639223 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" event={"ID":"da8231f7-455b-4a7f-98bb-2cde842a5332","Type":"ContainerDied","Data":"abc8edacabd61c8aa5720df5426eb0c7da33e8307bd14d71f0f7e08cc410362b"} Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.639245 4939 scope.go:117] "RemoveContainer" containerID="32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.654354 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.659627 4939 scope.go:117] "RemoveContainer" containerID="309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.672750 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.672728155 podStartE2EDuration="2.672728155s" podCreationTimestamp="2026-03-18 17:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:25.652669386 +0000 UTC m=+5710.251857007" watchObservedRunningTime="2026-03-18 17:12:25.672728155 +0000 UTC m=+5710.271915776" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.693635 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.693616618 podStartE2EDuration="2.693616618s" podCreationTimestamp="2026-03-18 17:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:25.691101997 +0000 UTC m=+5710.290289618" watchObservedRunningTime="2026-03-18 17:12:25.693616618 +0000 UTC m=+5710.292804239" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.713667 4939 scope.go:117] 
"RemoveContainer" containerID="32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134" Mar 18 17:12:25 crc kubenswrapper[4939]: E0318 17:12:25.717783 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134\": container with ID starting with 32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134 not found: ID does not exist" containerID="32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.717849 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134"} err="failed to get container status \"32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134\": rpc error: code = NotFound desc = could not find container \"32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134\": container with ID starting with 32081b8684eb9bf414f32a6345364e3e9f49e45728d991401633760d64e26134 not found: ID does not exist" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.717881 4939 scope.go:117] "RemoveContainer" containerID="309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41" Mar 18 17:12:25 crc kubenswrapper[4939]: E0318 17:12:25.718258 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41\": container with ID starting with 309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41 not found: ID does not exist" containerID="309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.718290 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41"} err="failed to get container status \"309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41\": rpc error: code = NotFound desc = could not find container \"309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41\": container with ID starting with 309b042ce2497167dfb64ddd7ed5ce3040762a956f0005e817308b04c33ebb41 not found: ID does not exist" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.780811 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-nb\") pod \"da8231f7-455b-4a7f-98bb-2cde842a5332\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.780965 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-config\") pod \"da8231f7-455b-4a7f-98bb-2cde842a5332\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.781033 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-sb\") pod \"da8231f7-455b-4a7f-98bb-2cde842a5332\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.781150 4939 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-dns-svc\") pod \"da8231f7-455b-4a7f-98bb-2cde842a5332\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.781243 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7ndx\" (UniqueName: \"kubernetes.io/projected/da8231f7-455b-4a7f-98bb-2cde842a5332-kube-api-access-x7ndx\") pod \"da8231f7-455b-4a7f-98bb-2cde842a5332\" (UID: \"da8231f7-455b-4a7f-98bb-2cde842a5332\") " Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.789830 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8231f7-455b-4a7f-98bb-2cde842a5332-kube-api-access-x7ndx" (OuterVolumeSpecName: "kube-api-access-x7ndx") pod "da8231f7-455b-4a7f-98bb-2cde842a5332" (UID: "da8231f7-455b-4a7f-98bb-2cde842a5332"). InnerVolumeSpecName "kube-api-access-x7ndx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.832071 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da8231f7-455b-4a7f-98bb-2cde842a5332" (UID: "da8231f7-455b-4a7f-98bb-2cde842a5332"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.838230 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da8231f7-455b-4a7f-98bb-2cde842a5332" (UID: "da8231f7-455b-4a7f-98bb-2cde842a5332"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.845070 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-config" (OuterVolumeSpecName: "config") pod "da8231f7-455b-4a7f-98bb-2cde842a5332" (UID: "da8231f7-455b-4a7f-98bb-2cde842a5332"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.852884 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da8231f7-455b-4a7f-98bb-2cde842a5332" (UID: "da8231f7-455b-4a7f-98bb-2cde842a5332"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.883693 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7ndx\" (UniqueName: \"kubernetes.io/projected/da8231f7-455b-4a7f-98bb-2cde842a5332-kube-api-access-x7ndx\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.883736 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.883748 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.883761 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:25 crc kubenswrapper[4939]: I0318 17:12:25.883774 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da8231f7-455b-4a7f-98bb-2cde842a5332-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:26 crc kubenswrapper[4939]: E0318 17:12:26.217381 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda8231f7_455b_4a7f_98bb_2cde842a5332.slice/crio-abc8edacabd61c8aa5720df5426eb0c7da33e8307bd14d71f0f7e08cc410362b\": RecentStats: unable to find data in memory cache]" Mar 18 17:12:26 crc kubenswrapper[4939]: I0318 17:12:26.651574 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-wlhm9" Mar 18 17:12:26 crc kubenswrapper[4939]: I0318 17:12:26.686342 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-wlhm9"] Mar 18 17:12:26 crc kubenswrapper[4939]: I0318 17:12:26.701329 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-wlhm9"] Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.534937 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.661001 4939 generic.go:334] "Generic (PLEG): container finished" podID="14cb6ca6-352c-4817-9c39-86716e05a3ca" containerID="9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658" exitCode=0 Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.661060 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.661078 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14cb6ca6-352c-4817-9c39-86716e05a3ca","Type":"ContainerDied","Data":"9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658"} Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.661435 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14cb6ca6-352c-4817-9c39-86716e05a3ca","Type":"ContainerDied","Data":"8c72086b7c12dd5c8f1981f46b7b83f0994b0f2d8976e42c6ef38fe2526c6e69"} Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.661473 4939 scope.go:117] "RemoveContainer" containerID="9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.680323 4939 scope.go:117] "RemoveContainer" containerID="9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658" Mar 18 17:12:27 crc kubenswrapper[4939]: E0318 17:12:27.680757 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658\": container with ID starting with 9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658 not found: ID does not exist" containerID="9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.680808 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658"} err="failed to get container status \"9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658\": rpc error: code = NotFound desc = could not find container \"9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658\": container with ID starting with 9930a85cfe8d250b9bed592b0167eef2d392d644ef8afcd272b5dfbc20b2f658 not found: ID does not exist" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.717153 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-combined-ca-bundle\") pod \"14cb6ca6-352c-4817-9c39-86716e05a3ca\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.717263 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69brh\" (UniqueName: \"kubernetes.io/projected/14cb6ca6-352c-4817-9c39-86716e05a3ca-kube-api-access-69brh\") pod \"14cb6ca6-352c-4817-9c39-86716e05a3ca\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.717391 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-config-data\") pod \"14cb6ca6-352c-4817-9c39-86716e05a3ca\" (UID: \"14cb6ca6-352c-4817-9c39-86716e05a3ca\") " Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.730246 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cb6ca6-352c-4817-9c39-86716e05a3ca-kube-api-access-69brh" (OuterVolumeSpecName: "kube-api-access-69brh") pod "14cb6ca6-352c-4817-9c39-86716e05a3ca" (UID: "14cb6ca6-352c-4817-9c39-86716e05a3ca"). InnerVolumeSpecName "kube-api-access-69brh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.740795 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-config-data" (OuterVolumeSpecName: "config-data") pod "14cb6ca6-352c-4817-9c39-86716e05a3ca" (UID: "14cb6ca6-352c-4817-9c39-86716e05a3ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.742788 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14cb6ca6-352c-4817-9c39-86716e05a3ca" (UID: "14cb6ca6-352c-4817-9c39-86716e05a3ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.819978 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69brh\" (UniqueName: \"kubernetes.io/projected/14cb6ca6-352c-4817-9c39-86716e05a3ca-kube-api-access-69brh\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.820027 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:27 crc kubenswrapper[4939]: I0318 17:12:27.820039 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cb6ca6-352c-4817-9c39-86716e05a3ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.025275 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.037112 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.045640 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:28 crc kubenswrapper[4939]: E0318 17:12:28.046046 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb6ca6-352c-4817-9c39-86716e05a3ca" containerName="nova-scheduler-scheduler" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.046070 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb6ca6-352c-4817-9c39-86716e05a3ca" containerName="nova-scheduler-scheduler" Mar 18 17:12:28 crc kubenswrapper[4939]: E0318 17:12:28.046103 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8231f7-455b-4a7f-98bb-2cde842a5332" containerName="init" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.046109 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8231f7-455b-4a7f-98bb-2cde842a5332" containerName="init" Mar 18 17:12:28 crc kubenswrapper[4939]: E0318 17:12:28.046116 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8231f7-455b-4a7f-98bb-2cde842a5332" containerName="dnsmasq-dns" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.046123 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8231f7-455b-4a7f-98bb-2cde842a5332" containerName="dnsmasq-dns" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.046280 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="da8231f7-455b-4a7f-98bb-2cde842a5332" 
containerName="dnsmasq-dns" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.046297 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb6ca6-352c-4817-9c39-86716e05a3ca" containerName="nova-scheduler-scheduler" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.047012 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.049250 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.059741 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.130267 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6t7v\" (UniqueName: \"kubernetes.io/projected/255b96f6-1bfb-4bb2-a526-568fb06acde9-kube-api-access-l6t7v\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.130440 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-config-data\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.130483 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.144231 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14cb6ca6-352c-4817-9c39-86716e05a3ca" path="/var/lib/kubelet/pods/14cb6ca6-352c-4817-9c39-86716e05a3ca/volumes" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.145038 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8231f7-455b-4a7f-98bb-2cde842a5332" path="/var/lib/kubelet/pods/da8231f7-455b-4a7f-98bb-2cde842a5332/volumes" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.232194 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-config-data\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.232263 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.232385 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6t7v\" (UniqueName: \"kubernetes.io/projected/255b96f6-1bfb-4bb2-a526-568fb06acde9-kube-api-access-l6t7v\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 
17:12:28.238814 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.240408 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-config-data\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.252866 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6t7v\" (UniqueName: \"kubernetes.io/projected/255b96f6-1bfb-4bb2-a526-568fb06acde9-kube-api-access-l6t7v\") pod \"nova-scheduler-0\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.367710 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:28 crc kubenswrapper[4939]: W0318 17:12:28.822950 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255b96f6_1bfb_4bb2_a526_568fb06acde9.slice/crio-fe1555c67d46b3c4063eaddd9a3071dac98005d7395e0b8c385d62805a4c5d0f WatchSource:0}: Error finding container fe1555c67d46b3c4063eaddd9a3071dac98005d7395e0b8c385d62805a4c5d0f: Status 404 returned error can't find the container with id fe1555c67d46b3c4063eaddd9a3071dac98005d7395e0b8c385d62805a4c5d0f Mar 18 17:12:28 crc kubenswrapper[4939]: I0318 17:12:28.827803 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:29 crc kubenswrapper[4939]: I0318 17:12:29.687957 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"255b96f6-1bfb-4bb2-a526-568fb06acde9","Type":"ContainerStarted","Data":"f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2"} Mar 18 17:12:29 crc kubenswrapper[4939]: I0318 17:12:29.688205 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"255b96f6-1bfb-4bb2-a526-568fb06acde9","Type":"ContainerStarted","Data":"fe1555c67d46b3c4063eaddd9a3071dac98005d7395e0b8c385d62805a4c5d0f"} Mar 18 17:12:29 crc kubenswrapper[4939]: I0318 17:12:29.709801 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.709778258 podStartE2EDuration="1.709778258s" podCreationTimestamp="2026-03-18 17:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:29.700925757 +0000 UTC m=+5714.300113378" watchObservedRunningTime="2026-03-18 17:12:29.709778258 +0000 UTC m=+5714.308965879" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.016269 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.528593 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dx4x4"] Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.529724 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.537907 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.541099 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.548857 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx4x4"] Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.618986 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b62h\" (UniqueName: \"kubernetes.io/projected/2666dc14-55b4-4030-b812-73719a709183-kube-api-access-9b62h\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.619060 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-scripts\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.619142 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-config-data\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.619250 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.720755 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.720830 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b62h\" (UniqueName: \"kubernetes.io/projected/2666dc14-55b4-4030-b812-73719a709183-kube-api-access-9b62h\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.720855 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-scripts\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.720899 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-config-data\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.733472 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.743630 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-config-data\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.746289 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-scripts\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.748790 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b62h\" (UniqueName: \"kubernetes.io/projected/2666dc14-55b4-4030-b812-73719a709183-kube-api-access-9b62h\") pod \"nova-cell1-cell-mapping-dx4x4\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:32 crc kubenswrapper[4939]: I0318 17:12:32.858445 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:33 crc kubenswrapper[4939]: I0318 17:12:33.325826 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx4x4"] Mar 18 17:12:33 crc kubenswrapper[4939]: I0318 17:12:33.367961 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 17:12:33 crc kubenswrapper[4939]: I0318 17:12:33.733101 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx4x4" event={"ID":"2666dc14-55b4-4030-b812-73719a709183","Type":"ContainerStarted","Data":"27f7b58c8666127dc76b1b59e6c5d59fbd23b611afad63942f31d9cbb2390a75"} Mar 18 17:12:33 crc kubenswrapper[4939]: I0318 17:12:33.733157 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx4x4" event={"ID":"2666dc14-55b4-4030-b812-73719a709183","Type":"ContainerStarted","Data":"4b0f0e32d67d09f0d8904e4ff4b4c88182674a4e946dff886cafb46c464cd129"} Mar 18 17:12:34 crc kubenswrapper[4939]: I0318 17:12:34.028526 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 17:12:34 crc kubenswrapper[4939]: I0318 17:12:34.028836 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 17:12:34 crc kubenswrapper[4939]: I0318 17:12:34.329849 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 17:12:34 crc kubenswrapper[4939]: I0318 17:12:34.329928 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 17:12:35 crc kubenswrapper[4939]: I0318 17:12:35.110845 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.106:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:12:35 crc kubenswrapper[4939]: I0318 17:12:35.110854 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.106:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:12:35 crc kubenswrapper[4939]: I0318 17:12:35.413719 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.107:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:12:35 crc kubenswrapper[4939]: I0318 17:12:35.413719 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.107:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:12:38 crc kubenswrapper[4939]: I0318 17:12:38.367871 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 17:12:38 crc kubenswrapper[4939]: I0318 17:12:38.404852 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 17:12:38 crc 
kubenswrapper[4939]: I0318 17:12:38.425266 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dx4x4" podStartSLOduration=6.425249044 podStartE2EDuration="6.425249044s" podCreationTimestamp="2026-03-18 17:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:33.756250918 +0000 UTC m=+5718.355438629" watchObservedRunningTime="2026-03-18 17:12:38.425249044 +0000 UTC m=+5723.024436665" Mar 18 17:12:38 crc kubenswrapper[4939]: I0318 17:12:38.806043 4939 generic.go:334] "Generic (PLEG): container finished" podID="2666dc14-55b4-4030-b812-73719a709183" containerID="27f7b58c8666127dc76b1b59e6c5d59fbd23b611afad63942f31d9cbb2390a75" exitCode=0 Mar 18 17:12:38 crc kubenswrapper[4939]: I0318 17:12:38.806184 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx4x4" event={"ID":"2666dc14-55b4-4030-b812-73719a709183","Type":"ContainerDied","Data":"27f7b58c8666127dc76b1b59e6c5d59fbd23b611afad63942f31d9cbb2390a75"} Mar 18 17:12:38 crc kubenswrapper[4939]: I0318 17:12:38.833530 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.136713 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.289206 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-scripts\") pod \"2666dc14-55b4-4030-b812-73719a709183\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.290669 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-config-data\") pod \"2666dc14-55b4-4030-b812-73719a709183\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.291118 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-combined-ca-bundle\") pod \"2666dc14-55b4-4030-b812-73719a709183\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.291227 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b62h\" (UniqueName: \"kubernetes.io/projected/2666dc14-55b4-4030-b812-73719a709183-kube-api-access-9b62h\") pod \"2666dc14-55b4-4030-b812-73719a709183\" (UID: \"2666dc14-55b4-4030-b812-73719a709183\") " Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.295572 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-scripts" (OuterVolumeSpecName: "scripts") pod "2666dc14-55b4-4030-b812-73719a709183" (UID: "2666dc14-55b4-4030-b812-73719a709183"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.296535 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2666dc14-55b4-4030-b812-73719a709183-kube-api-access-9b62h" (OuterVolumeSpecName: "kube-api-access-9b62h") pod "2666dc14-55b4-4030-b812-73719a709183" (UID: "2666dc14-55b4-4030-b812-73719a709183"). InnerVolumeSpecName "kube-api-access-9b62h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.314638 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2666dc14-55b4-4030-b812-73719a709183" (UID: "2666dc14-55b4-4030-b812-73719a709183"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.321910 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-config-data" (OuterVolumeSpecName: "config-data") pod "2666dc14-55b4-4030-b812-73719a709183" (UID: "2666dc14-55b4-4030-b812-73719a709183"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.392286 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.392317 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b62h\" (UniqueName: \"kubernetes.io/projected/2666dc14-55b4-4030-b812-73719a709183-kube-api-access-9b62h\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.392327 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.392336 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2666dc14-55b4-4030-b812-73719a709183-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.826737 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dx4x4" event={"ID":"2666dc14-55b4-4030-b812-73719a709183","Type":"ContainerDied","Data":"4b0f0e32d67d09f0d8904e4ff4b4c88182674a4e946dff886cafb46c464cd129"} Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.826793 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b0f0e32d67d09f0d8904e4ff4b4c88182674a4e946dff886cafb46c464cd129" Mar 18 17:12:40 crc kubenswrapper[4939]: I0318 17:12:40.826808 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dx4x4" Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.125702 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.126056 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-log" containerID="cri-o://a715d137a31b817114edd3944436bdcbf6bf8adbd01e1e7b0816b22f4c4c0081" gracePeriod=30 Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.126350 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-api" containerID="cri-o://b9624f9d49d281d16dae1ce675043303b4fa0dd0773220e5d20a515eb36aaf9a" gracePeriod=30 Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.182833 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.184452 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="255b96f6-1bfb-4bb2-a526-568fb06acde9" containerName="nova-scheduler-scheduler" containerID="cri-o://f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" gracePeriod=30 Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.230259 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.230491 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-log" containerID="cri-o://9e62d0fd4c5ea27e4b9cc2450588900212ef6a97320fbdd6e3a90b3c4ab43f1a" gracePeriod=30 Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.230969 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-metadata" containerID="cri-o://1baa94798381419bbad43b444e9c76882d3b6f9a89c5caa688c81513fec773e7" gracePeriod=30 Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.836323 4939 generic.go:334] "Generic (PLEG): container finished" podID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerID="a715d137a31b817114edd3944436bdcbf6bf8adbd01e1e7b0816b22f4c4c0081" exitCode=143 Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.836404 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf92730-058c-4d87-a615-e6cd554f88ca","Type":"ContainerDied","Data":"a715d137a31b817114edd3944436bdcbf6bf8adbd01e1e7b0816b22f4c4c0081"} Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.839154 4939 generic.go:334] "Generic (PLEG): container finished" podID="53ca19bf-693e-4435-b356-0d094d179f91" containerID="9e62d0fd4c5ea27e4b9cc2450588900212ef6a97320fbdd6e3a90b3c4ab43f1a" exitCode=143 Mar 18 17:12:41 crc kubenswrapper[4939]: I0318 17:12:41.839184 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53ca19bf-693e-4435-b356-0d094d179f91","Type":"ContainerDied","Data":"9e62d0fd4c5ea27e4b9cc2450588900212ef6a97320fbdd6e3a90b3c4ab43f1a"} Mar 18 17:12:42 crc kubenswrapper[4939]: I0318 17:12:42.028792 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 
17:12:42 crc kubenswrapper[4939]: I0318 17:12:42.028848 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 17:12:42 crc kubenswrapper[4939]: I0318 17:12:42.329817 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 17:12:42 crc kubenswrapper[4939]: I0318 17:12:42.330064 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 17:12:43 crc kubenswrapper[4939]: E0318 17:12:43.370254 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 17:12:43 crc kubenswrapper[4939]: E0318 17:12:43.373106 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 17:12:43 crc kubenswrapper[4939]: E0318 17:12:43.374631 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 17:12:43 crc kubenswrapper[4939]: E0318 17:12:43.374810 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="255b96f6-1bfb-4bb2-a526-568fb06acde9" containerName="nova-scheduler-scheduler" Mar 18 17:12:44 crc kubenswrapper[4939]: I0318 17:12:44.870987 4939 generic.go:334] "Generic (PLEG): container finished" podID="53ca19bf-693e-4435-b356-0d094d179f91" containerID="1baa94798381419bbad43b444e9c76882d3b6f9a89c5caa688c81513fec773e7" exitCode=0 Mar 18 17:12:44 crc kubenswrapper[4939]: I0318 17:12:44.871064 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53ca19bf-693e-4435-b356-0d094d179f91","Type":"ContainerDied","Data":"1baa94798381419bbad43b444e9c76882d3b6f9a89c5caa688c81513fec773e7"} Mar 18 17:12:44 crc kubenswrapper[4939]: I0318 17:12:44.873220 4939 generic.go:334] "Generic (PLEG): container finished" podID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerID="b9624f9d49d281d16dae1ce675043303b4fa0dd0773220e5d20a515eb36aaf9a" exitCode=0 Mar 18 17:12:44 crc kubenswrapper[4939]: I0318 17:12:44.873246 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf92730-058c-4d87-a615-e6cd554f88ca","Type":"ContainerDied","Data":"b9624f9d49d281d16dae1ce675043303b4fa0dd0773220e5d20a515eb36aaf9a"} Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.111675 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.117460 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.286909 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-config-data\") pod \"53ca19bf-693e-4435-b356-0d094d179f91\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287087 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-combined-ca-bundle\") pod \"7cf92730-058c-4d87-a615-e6cd554f88ca\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287159 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5grt6\" (UniqueName: \"kubernetes.io/projected/7cf92730-058c-4d87-a615-e6cd554f88ca-kube-api-access-5grt6\") pod \"7cf92730-058c-4d87-a615-e6cd554f88ca\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287205 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ca19bf-693e-4435-b356-0d094d179f91-logs\") pod \"53ca19bf-693e-4435-b356-0d094d179f91\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287242 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-combined-ca-bundle\") pod \"53ca19bf-693e-4435-b356-0d094d179f91\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287283 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-config-data\") pod \"7cf92730-058c-4d87-a615-e6cd554f88ca\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287325 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4vqg\" (UniqueName: \"kubernetes.io/projected/53ca19bf-693e-4435-b356-0d094d179f91-kube-api-access-w4vqg\") pod \"53ca19bf-693e-4435-b356-0d094d179f91\" (UID: \"53ca19bf-693e-4435-b356-0d094d179f91\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287357 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf92730-058c-4d87-a615-e6cd554f88ca-logs\") pod \"7cf92730-058c-4d87-a615-e6cd554f88ca\" (UID: \"7cf92730-058c-4d87-a615-e6cd554f88ca\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287691 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ca19bf-693e-4435-b356-0d094d179f91-logs" (OuterVolumeSpecName: "logs") pod "53ca19bf-693e-4435-b356-0d094d179f91" (UID: "53ca19bf-693e-4435-b356-0d094d179f91"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.287837 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ca19bf-693e-4435-b356-0d094d179f91-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.288739 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf92730-058c-4d87-a615-e6cd554f88ca-logs" (OuterVolumeSpecName: "logs") pod "7cf92730-058c-4d87-a615-e6cd554f88ca" (UID: "7cf92730-058c-4d87-a615-e6cd554f88ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.293958 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ca19bf-693e-4435-b356-0d094d179f91-kube-api-access-w4vqg" (OuterVolumeSpecName: "kube-api-access-w4vqg") pod "53ca19bf-693e-4435-b356-0d094d179f91" (UID: "53ca19bf-693e-4435-b356-0d094d179f91"). InnerVolumeSpecName "kube-api-access-w4vqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.294670 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf92730-058c-4d87-a615-e6cd554f88ca-kube-api-access-5grt6" (OuterVolumeSpecName: "kube-api-access-5grt6") pod "7cf92730-058c-4d87-a615-e6cd554f88ca" (UID: "7cf92730-058c-4d87-a615-e6cd554f88ca"). InnerVolumeSpecName "kube-api-access-5grt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.324339 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53ca19bf-693e-4435-b356-0d094d179f91" (UID: "53ca19bf-693e-4435-b356-0d094d179f91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.330589 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-config-data" (OuterVolumeSpecName: "config-data") pod "7cf92730-058c-4d87-a615-e6cd554f88ca" (UID: "7cf92730-058c-4d87-a615-e6cd554f88ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.334723 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-config-data" (OuterVolumeSpecName: "config-data") pod "53ca19bf-693e-4435-b356-0d094d179f91" (UID: "53ca19bf-693e-4435-b356-0d094d179f91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.374679 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cf92730-058c-4d87-a615-e6cd554f88ca" (UID: "7cf92730-058c-4d87-a615-e6cd554f88ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.389323 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.389379 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.389395 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4vqg\" (UniqueName: \"kubernetes.io/projected/53ca19bf-693e-4435-b356-0d094d179f91-kube-api-access-w4vqg\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.389406 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf92730-058c-4d87-a615-e6cd554f88ca-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.389417 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ca19bf-693e-4435-b356-0d094d179f91-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.389428 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf92730-058c-4d87-a615-e6cd554f88ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.389438 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5grt6\" (UniqueName: \"kubernetes.io/projected/7cf92730-058c-4d87-a615-e6cd554f88ca-kube-api-access-5grt6\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.703117 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.882910 4939 generic.go:334] "Generic (PLEG): container finished" podID="255b96f6-1bfb-4bb2-a526-568fb06acde9" containerID="f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" exitCode=0 Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.882982 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"255b96f6-1bfb-4bb2-a526-568fb06acde9","Type":"ContainerDied","Data":"f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2"} Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.882985 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.883010 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"255b96f6-1bfb-4bb2-a526-568fb06acde9","Type":"ContainerDied","Data":"fe1555c67d46b3c4063eaddd9a3071dac98005d7395e0b8c385d62805a4c5d0f"} Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.883028 4939 scope.go:117] "RemoveContainer" containerID="f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.885093 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.886972 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf92730-058c-4d87-a615-e6cd554f88ca","Type":"ContainerDied","Data":"b68d5acfe1df63d9b96fa972868d0f54c2525fabee2848d583bbe7c6231a3f63"} Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.892734 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53ca19bf-693e-4435-b356-0d094d179f91","Type":"ContainerDied","Data":"8e9385558e4324ac4eb3c30d45680b8713e17fe9c7db7c7173f9126facbe3913"} Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.892778 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.899434 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-combined-ca-bundle\") pod \"255b96f6-1bfb-4bb2-a526-568fb06acde9\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.899625 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6t7v\" (UniqueName: \"kubernetes.io/projected/255b96f6-1bfb-4bb2-a526-568fb06acde9-kube-api-access-l6t7v\") pod \"255b96f6-1bfb-4bb2-a526-568fb06acde9\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.899648 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-config-data\") pod \"255b96f6-1bfb-4bb2-a526-568fb06acde9\" (UID: \"255b96f6-1bfb-4bb2-a526-568fb06acde9\") " Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.904270 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255b96f6-1bfb-4bb2-a526-568fb06acde9-kube-api-access-l6t7v" (OuterVolumeSpecName: "kube-api-access-l6t7v") pod "255b96f6-1bfb-4bb2-a526-568fb06acde9" (UID: "255b96f6-1bfb-4bb2-a526-568fb06acde9"). InnerVolumeSpecName "kube-api-access-l6t7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.905405 4939 scope.go:117] "RemoveContainer" containerID="f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" Mar 18 17:12:45 crc kubenswrapper[4939]: E0318 17:12:45.905821 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2\": container with ID starting with f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2 not found: ID does not exist" containerID="f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.905856 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2"} err="failed to get container status \"f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2\": rpc error: code = NotFound desc = could not find container \"f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2\": container with ID starting with f4e3159aa35e1faf517b49cd5f85ec432f836681c3cba31c839aa494a9268fb2 not found: ID does not exist" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.905883 4939 scope.go:117] "RemoveContainer" containerID="b9624f9d49d281d16dae1ce675043303b4fa0dd0773220e5d20a515eb36aaf9a" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.926116 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-config-data" (OuterVolumeSpecName: "config-data") pod "255b96f6-1bfb-4bb2-a526-568fb06acde9" (UID: "255b96f6-1bfb-4bb2-a526-568fb06acde9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:45 crc kubenswrapper[4939]: I0318 17:12:45.926774 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "255b96f6-1bfb-4bb2-a526-568fb06acde9" (UID: "255b96f6-1bfb-4bb2-a526-568fb06acde9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.002467 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6t7v\" (UniqueName: \"kubernetes.io/projected/255b96f6-1bfb-4bb2-a526-568fb06acde9-kube-api-access-l6t7v\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.002612 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.002634 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255b96f6-1bfb-4bb2-a526-568fb06acde9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.031910 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.036471 4939 scope.go:117] "RemoveContainer" containerID="a715d137a31b817114edd3944436bdcbf6bf8adbd01e1e7b0816b22f4c4c0081" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.051637 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.079361 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.095013 4939 scope.go:117] "RemoveContainer" containerID="1baa94798381419bbad43b444e9c76882d3b6f9a89c5caa688c81513fec773e7" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.123538 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.127035 4939 scope.go:117] "RemoveContainer" containerID="9e62d0fd4c5ea27e4b9cc2450588900212ef6a97320fbdd6e3a90b3c4ab43f1a" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.149710 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ca19bf-693e-4435-b356-0d094d179f91" path="/var/lib/kubelet/pods/53ca19bf-693e-4435-b356-0d094d179f91/volumes" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.150311 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" path="/var/lib/kubelet/pods/7cf92730-058c-4d87-a615-e6cd554f88ca/volumes" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151032 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: E0318 17:12:46.151296 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-log" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151313 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-log" Mar 18 17:12:46 crc kubenswrapper[4939]: E0318 17:12:46.151331 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-api" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151337 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-api" Mar 18 17:12:46 crc kubenswrapper[4939]: E0318 17:12:46.151347 4939 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="255b96f6-1bfb-4bb2-a526-568fb06acde9" containerName="nova-scheduler-scheduler" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151352 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="255b96f6-1bfb-4bb2-a526-568fb06acde9" containerName="nova-scheduler-scheduler" Mar 18 17:12:46 crc kubenswrapper[4939]: E0318 17:12:46.151364 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-log" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151370 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-log" Mar 18 17:12:46 crc kubenswrapper[4939]: E0318 17:12:46.151379 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2666dc14-55b4-4030-b812-73719a709183" containerName="nova-manage" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151384 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2666dc14-55b4-4030-b812-73719a709183" containerName="nova-manage" Mar 18 17:12:46 crc kubenswrapper[4939]: E0318 17:12:46.151407 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-metadata" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151413 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-metadata" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151585 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-metadata" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151603 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-api" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151610 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="255b96f6-1bfb-4bb2-a526-568fb06acde9" containerName="nova-scheduler-scheduler" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151618 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf92730-058c-4d87-a615-e6cd554f88ca" containerName="nova-api-log" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151630 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2666dc14-55b4-4030-b812-73719a709183" containerName="nova-manage" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.151643 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ca19bf-693e-4435-b356-0d094d179f91" containerName="nova-metadata-log" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.157824 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.157869 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.158071 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.159968 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.160063 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.160565 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.163524 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.228152 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.239626 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.249786 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.251338 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.258143 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.260780 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.307781 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d9258f-7237-41ad-bee3-dcdbf6056b35-logs\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.307846 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljzdt\" (UniqueName: \"kubernetes.io/projected/9518de39-97b8-419b-9ddf-f8c052c695d5-kube-api-access-ljzdt\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.307872 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.307895 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.308176 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9518de39-97b8-419b-9ddf-f8c052c695d5-logs\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.308227 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-config-data\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.308286 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-config-data\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.308438 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4qlg\" (UniqueName: \"kubernetes.io/projected/45d9258f-7237-41ad-bee3-dcdbf6056b35-kube-api-access-s4qlg\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410189 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxpw9\" (UniqueName: \"kubernetes.io/projected/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-kube-api-access-pxpw9\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410341 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d9258f-7237-41ad-bee3-dcdbf6056b35-logs\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410469 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljzdt\" (UniqueName: \"kubernetes.io/projected/9518de39-97b8-419b-9ddf-f8c052c695d5-kube-api-access-ljzdt\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410498 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410554 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410613 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410637 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-config-data\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 
17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410800 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9518de39-97b8-419b-9ddf-f8c052c695d5-logs\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410825 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d9258f-7237-41ad-bee3-dcdbf6056b35-logs\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410829 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-config-data\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410905 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-config-data\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.410953 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4qlg\" (UniqueName: \"kubernetes.io/projected/45d9258f-7237-41ad-bee3-dcdbf6056b35-kube-api-access-s4qlg\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.411593 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9518de39-97b8-419b-9ddf-f8c052c695d5-logs\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.413697 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.414155 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-config-data\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.415363 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-config-data\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.415840 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.426591 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ljzdt\" (UniqueName: \"kubernetes.io/projected/9518de39-97b8-419b-9ddf-f8c052c695d5-kube-api-access-ljzdt\") pod \"nova-metadata-0\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.428557 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4qlg\" (UniqueName: \"kubernetes.io/projected/45d9258f-7237-41ad-bee3-dcdbf6056b35-kube-api-access-s4qlg\") pod \"nova-api-0\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.488619 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.501851 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.512060 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxpw9\" (UniqueName: \"kubernetes.io/projected/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-kube-api-access-pxpw9\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.512149 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.512177 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-config-data\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.515493 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-config-data\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.516171 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.532811 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxpw9\" (UniqueName: \"kubernetes.io/projected/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-kube-api-access-pxpw9\") pod \"nova-scheduler-0\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") " pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.567029 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:12:46 crc kubenswrapper[4939]: I0318 17:12:46.977200 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:12:46 crc kubenswrapper[4939]: W0318 17:12:46.978040 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45d9258f_7237_41ad_bee3_dcdbf6056b35.slice/crio-6c6eece8c7f4ee12f5816fe37253cabcd012b61909e5d3231463a5b3f7282936 WatchSource:0}: Error finding container 6c6eece8c7f4ee12f5816fe37253cabcd012b61909e5d3231463a5b3f7282936: Status 404 returned error can't find the container with id 6c6eece8c7f4ee12f5816fe37253cabcd012b61909e5d3231463a5b3f7282936 Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.044022 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:12:47 crc kubenswrapper[4939]: W0318 17:12:47.056465 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9518de39_97b8_419b_9ddf_f8c052c695d5.slice/crio-41bd276f662d800c85e1e74a1e06b54940d4883828cbf7b6538446cc1dd0965b WatchSource:0}: Error finding container 41bd276f662d800c85e1e74a1e06b54940d4883828cbf7b6538446cc1dd0965b: Status 404 returned error can't find the container with id 41bd276f662d800c85e1e74a1e06b54940d4883828cbf7b6538446cc1dd0965b Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.131405 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:12:47 crc kubenswrapper[4939]: W0318 17:12:47.133148 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa3e625d_ea3f_422e_b4d1_3967a61bb0d6.slice/crio-c9b49715b920173953669f186f7f5be3175db31848dfa5cedf9e0c65ce627c93 WatchSource:0}: Error finding container c9b49715b920173953669f186f7f5be3175db31848dfa5cedf9e0c65ce627c93: Status 404 returned error can't find the container with id c9b49715b920173953669f186f7f5be3175db31848dfa5cedf9e0c65ce627c93 Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.920276 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9518de39-97b8-419b-9ddf-f8c052c695d5","Type":"ContainerStarted","Data":"5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.920642 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9518de39-97b8-419b-9ddf-f8c052c695d5","Type":"ContainerStarted","Data":"295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.920659 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9518de39-97b8-419b-9ddf-f8c052c695d5","Type":"ContainerStarted","Data":"41bd276f662d800c85e1e74a1e06b54940d4883828cbf7b6538446cc1dd0965b"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.921570 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6","Type":"ContainerStarted","Data":"19d5e890462ad4fc1ea3ff511c7e01c77cad68264d5221a83927a56e1e2fdd37"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.921602 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6","Type":"ContainerStarted","Data":"c9b49715b920173953669f186f7f5be3175db31848dfa5cedf9e0c65ce627c93"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.923092 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45d9258f-7237-41ad-bee3-dcdbf6056b35","Type":"ContainerStarted","Data":"7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.923236 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45d9258f-7237-41ad-bee3-dcdbf6056b35","Type":"ContainerStarted","Data":"ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.923335 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45d9258f-7237-41ad-bee3-dcdbf6056b35","Type":"ContainerStarted","Data":"6c6eece8c7f4ee12f5816fe37253cabcd012b61909e5d3231463a5b3f7282936"} Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.945246 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.945223002 podStartE2EDuration="1.945223002s" podCreationTimestamp="2026-03-18 17:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:47.937474562 +0000 UTC m=+5732.536662183" watchObservedRunningTime="2026-03-18 17:12:47.945223002 +0000 UTC m=+5732.544410623" Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.963019 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.963001445 podStartE2EDuration="1.963001445s" podCreationTimestamp="2026-03-18 17:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:47.959150206 +0000 UTC m=+5732.558337857" watchObservedRunningTime="2026-03-18 17:12:47.963001445 +0000 UTC m=+5732.562189056" Mar 18 17:12:47 crc kubenswrapper[4939]: I0318 17:12:47.976921 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9769033 podStartE2EDuration="1.9769033s" podCreationTimestamp="2026-03-18 17:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:47.974057709 +0000 UTC m=+5732.573245330" watchObservedRunningTime="2026-03-18 17:12:47.9769033 +0000 UTC m=+5732.576090921" Mar 18 17:12:48 crc kubenswrapper[4939]: I0318 17:12:48.145202 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255b96f6-1bfb-4bb2-a526-568fb06acde9" path="/var/lib/kubelet/pods/255b96f6-1bfb-4bb2-a526-568fb06acde9/volumes" Mar 18 17:12:51 crc kubenswrapper[4939]: I0318 17:12:51.568066 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.687154 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.687444 4939 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.687488 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.688177 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75eabee95412ed2c13066257546f720d013a6e1cbe6d5b5074d1fd4e3adf5aa1"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.688231 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://75eabee95412ed2c13066257546f720d013a6e1cbe6d5b5074d1fd4e3adf5aa1" gracePeriod=600 Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.991587 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="75eabee95412ed2c13066257546f720d013a6e1cbe6d5b5074d1fd4e3adf5aa1" exitCode=0 Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.991662 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"75eabee95412ed2c13066257546f720d013a6e1cbe6d5b5074d1fd4e3adf5aa1"} Mar 18 17:12:53 crc kubenswrapper[4939]: I0318 17:12:53.991905 4939 scope.go:117] "RemoveContainer" containerID="22a12f07d0a036db7c9a9025657dcb9a7f7c950d2062e5ee3c3f0f48d952085c" Mar 18 17:12:55 crc kubenswrapper[4939]: I0318 17:12:55.007599 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e"} Mar 18 17:12:56 crc kubenswrapper[4939]: I0318 17:12:56.489593 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 17:12:56 crc kubenswrapper[4939]: I0318 17:12:56.489938 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 17:12:56 crc kubenswrapper[4939]: I0318 17:12:56.502765 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 17:12:56 crc kubenswrapper[4939]: I0318 17:12:56.502842 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 17:12:56 crc kubenswrapper[4939]: I0318 17:12:56.568362 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 17:12:56 crc kubenswrapper[4939]: I0318 17:12:56.599224 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 17:12:57 crc kubenswrapper[4939]: I0318 17:12:57.049864 4939 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 17:12:57 crc kubenswrapper[4939]: I0318 17:12:57.653749 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.110:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:12:57 crc kubenswrapper[4939]: I0318 17:12:57.653809 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.110:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:12:57 crc kubenswrapper[4939]: I0318 17:12:57.653869 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.111:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:12:57 crc kubenswrapper[4939]: I0318 17:12:57.653766 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.111:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:13:01 crc kubenswrapper[4939]: I0318 17:13:01.439674 4939 scope.go:117] "RemoveContainer" containerID="0e5d1e643048bffb2d44c4de9d7b8849146978a3e1d3724c105188668e7c84af" Mar 18 17:13:01 crc kubenswrapper[4939]: I0318 17:13:01.478514 4939 scope.go:117] "RemoveContainer" containerID="fb119c65e6691f9abdfd0382948b6f871e3393574cf4c4365f4e1be111d4bf99" Mar 18 17:13:04 crc kubenswrapper[4939]: I0318 17:13:04.488774 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 17:13:04 crc kubenswrapper[4939]: I0318 17:13:04.489431 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 17:13:04 crc kubenswrapper[4939]: I0318 17:13:04.503048 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 17:13:04 crc kubenswrapper[4939]: I0318 17:13:04.503120 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 17:13:06 crc kubenswrapper[4939]: I0318 17:13:06.494633 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 17:13:06 crc kubenswrapper[4939]: I0318 17:13:06.495843 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 17:13:06 crc kubenswrapper[4939]: I0318 17:13:06.500587 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 17:13:06 crc kubenswrapper[4939]: I0318 17:13:06.505139 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 17:13:06 crc kubenswrapper[4939]: I0318 17:13:06.505278 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 17:13:06 crc kubenswrapper[4939]: I0318 17:13:06.507647 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 18 17:13:06 crc kubenswrapper[4939]: I0318 17:13:06.507866 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.154485 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.366254 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-pgz2z"] Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.367772 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.392107 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-pgz2z"] Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.565856 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-config\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.565930 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-dns-svc\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.566026 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-sb\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.566069 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49g5t\" (UniqueName: \"kubernetes.io/projected/20c77842-c27d-4a84-b24e-8cdc01f16920-kube-api-access-49g5t\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.566109 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-nb\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.668529 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-dns-svc\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.668653 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-sb\") pod 
\"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.668684 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49g5t\" (UniqueName: \"kubernetes.io/projected/20c77842-c27d-4a84-b24e-8cdc01f16920-kube-api-access-49g5t\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.668703 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-nb\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.668718 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-config\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.669475 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-config\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.669975 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-dns-svc\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.670463 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-sb\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.671220 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-nb\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.697299 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49g5t\" (UniqueName: \"kubernetes.io/projected/20c77842-c27d-4a84-b24e-8cdc01f16920-kube-api-access-49g5t\") pod \"dnsmasq-dns-57958c8f89-pgz2z\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") " pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:08 crc kubenswrapper[4939]: I0318 17:13:08.697709 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:09 crc kubenswrapper[4939]: W0318 17:13:09.228856 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20c77842_c27d_4a84_b24e_8cdc01f16920.slice/crio-2225897e71b12d4fc9c7f92fa50b9ace53b1d3571cfce0ff8516ae0e34626413 WatchSource:0}: Error finding container 2225897e71b12d4fc9c7f92fa50b9ace53b1d3571cfce0ff8516ae0e34626413: Status 404 returned error can't find the container with id 2225897e71b12d4fc9c7f92fa50b9ace53b1d3571cfce0ff8516ae0e34626413 Mar 18 17:13:09 crc kubenswrapper[4939]: I0318 17:13:09.230304 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-pgz2z"] Mar 18 17:13:10 crc kubenswrapper[4939]: I0318 17:13:10.169307 4939 generic.go:334] "Generic (PLEG): container finished" podID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerID="1f8c7ccc7babcae303b6fd4a46b65851b8735261842d6bf7a1e5bd1c894848b5" exitCode=0 Mar 18 17:13:10 crc kubenswrapper[4939]: I0318 17:13:10.169451 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" event={"ID":"20c77842-c27d-4a84-b24e-8cdc01f16920","Type":"ContainerDied","Data":"1f8c7ccc7babcae303b6fd4a46b65851b8735261842d6bf7a1e5bd1c894848b5"} Mar 18 17:13:10 crc kubenswrapper[4939]: I0318 17:13:10.169900 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" event={"ID":"20c77842-c27d-4a84-b24e-8cdc01f16920","Type":"ContainerStarted","Data":"2225897e71b12d4fc9c7f92fa50b9ace53b1d3571cfce0ff8516ae0e34626413"} Mar 18 17:13:11 crc kubenswrapper[4939]: I0318 17:13:11.182126 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" event={"ID":"20c77842-c27d-4a84-b24e-8cdc01f16920","Type":"ContainerStarted","Data":"0bb4c7b32a3669beebea3c30ccce699b979889dedd9fa2732b675c8b5932fe6e"} Mar 18 17:13:11 crc kubenswrapper[4939]: I0318 17:13:11.183689 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:11 crc kubenswrapper[4939]: I0318 17:13:11.205179 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" podStartSLOduration=3.205158709 podStartE2EDuration="3.205158709s" podCreationTimestamp="2026-03-18 17:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:11.203611135 +0000 UTC m=+5755.802798756" watchObservedRunningTime="2026-03-18 17:13:11.205158709 +0000 UTC m=+5755.804346330" Mar 18 17:13:18 crc kubenswrapper[4939]: I0318 17:13:18.699687 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" Mar 18 17:13:18 crc kubenswrapper[4939]: I0318 17:13:18.786782 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-zzdlk"] Mar 18 17:13:18 crc kubenswrapper[4939]: I0318 17:13:18.787104 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" podUID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerName="dnsmasq-dns" containerID="cri-o://d298f32db5b93b53a10bbc4f67e3b805a3468d18c812b237dc30b4d7d53ff443" gracePeriod=10 Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.268821 4939 generic.go:334] "Generic (PLEG): container finished" 
podID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerID="d298f32db5b93b53a10bbc4f67e3b805a3468d18c812b237dc30b4d7d53ff443" exitCode=0 Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.268936 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" event={"ID":"1472528d-7a5a-497d-9413-6692a3f01ccd","Type":"ContainerDied","Data":"d298f32db5b93b53a10bbc4f67e3b805a3468d18c812b237dc30b4d7d53ff443"} Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.269157 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" event={"ID":"1472528d-7a5a-497d-9413-6692a3f01ccd","Type":"ContainerDied","Data":"fc5c55c12606b96cb4d29bfc0f19860c06acbadd1ed5b811024ed9b176ac76cb"} Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.269172 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc5c55c12606b96cb4d29bfc0f19860c06acbadd1ed5b811024ed9b176ac76cb" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.269231 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.393736 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-config\") pod \"1472528d-7a5a-497d-9413-6692a3f01ccd\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.393804 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm88m\" (UniqueName: \"kubernetes.io/projected/1472528d-7a5a-497d-9413-6692a3f01ccd-kube-api-access-mm88m\") pod \"1472528d-7a5a-497d-9413-6692a3f01ccd\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.393870 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-sb\") pod \"1472528d-7a5a-497d-9413-6692a3f01ccd\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.393943 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-nb\") pod \"1472528d-7a5a-497d-9413-6692a3f01ccd\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.394031 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-dns-svc\") pod \"1472528d-7a5a-497d-9413-6692a3f01ccd\" (UID: \"1472528d-7a5a-497d-9413-6692a3f01ccd\") " Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.400095 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1472528d-7a5a-497d-9413-6692a3f01ccd-kube-api-access-mm88m" (OuterVolumeSpecName: "kube-api-access-mm88m") pod "1472528d-7a5a-497d-9413-6692a3f01ccd" (UID: "1472528d-7a5a-497d-9413-6692a3f01ccd"). InnerVolumeSpecName "kube-api-access-mm88m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.445792 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1472528d-7a5a-497d-9413-6692a3f01ccd" (UID: "1472528d-7a5a-497d-9413-6692a3f01ccd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.452423 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-config" (OuterVolumeSpecName: "config") pod "1472528d-7a5a-497d-9413-6692a3f01ccd" (UID: "1472528d-7a5a-497d-9413-6692a3f01ccd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.463724 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1472528d-7a5a-497d-9413-6692a3f01ccd" (UID: "1472528d-7a5a-497d-9413-6692a3f01ccd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.476057 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1472528d-7a5a-497d-9413-6692a3f01ccd" (UID: "1472528d-7a5a-497d-9413-6692a3f01ccd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.495939 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.495972 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.495987 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm88m\" (UniqueName: \"kubernetes.io/projected/1472528d-7a5a-497d-9413-6692a3f01ccd-kube-api-access-mm88m\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.496000 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:19 crc kubenswrapper[4939]: I0318 17:13:19.496011 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1472528d-7a5a-497d-9413-6692a3f01ccd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.278541 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64bb9495b5-zzdlk" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.305960 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-zzdlk"] Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.318761 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-zzdlk"] Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.573374 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xtc45"] Mar 18 17:13:20 crc kubenswrapper[4939]: E0318 17:13:20.573735 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerName="dnsmasq-dns" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.573747 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerName="dnsmasq-dns" Mar 18 17:13:20 crc kubenswrapper[4939]: E0318 17:13:20.573777 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerName="init" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.573784 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerName="init" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.573968 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1472528d-7a5a-497d-9413-6692a3f01ccd" containerName="dnsmasq-dns" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.574495 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.595440 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xtc45"] Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.677628 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3146-account-create-update-4r9mf"] Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.678904 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.681678 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.683600 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3146-account-create-update-4r9mf"] Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.718637 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-operator-scripts\") pod \"cinder-db-create-xtc45\" (UID: \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.718765 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8sb5\" (UniqueName: \"kubernetes.io/projected/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-kube-api-access-d8sb5\") pod \"cinder-db-create-xtc45\" (UID: \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.820612 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-operator-scripts\") pod \"cinder-db-create-xtc45\" (UID: \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.820850 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-operator-scripts\") pod \"cinder-3146-account-create-update-4r9mf\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.821015 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8sb5\" (UniqueName: \"kubernetes.io/projected/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-kube-api-access-d8sb5\") pod \"cinder-db-create-xtc45\" (UID: \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.821121 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtc5\" (UniqueName: \"kubernetes.io/projected/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-kube-api-access-pmtc5\") pod \"cinder-3146-account-create-update-4r9mf\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.821873 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-operator-scripts\") pod \"cinder-db-create-xtc45\" (UID: \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.841439 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8sb5\" (UniqueName: \"kubernetes.io/projected/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-kube-api-access-d8sb5\") pod \"cinder-db-create-xtc45\" (UID: 
\"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.892969 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.922944 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtc5\" (UniqueName: \"kubernetes.io/projected/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-kube-api-access-pmtc5\") pod \"cinder-3146-account-create-update-4r9mf\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.923326 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-operator-scripts\") pod \"cinder-3146-account-create-update-4r9mf\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.924076 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-operator-scripts\") pod \"cinder-3146-account-create-update-4r9mf\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.945423 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtc5\" (UniqueName: \"kubernetes.io/projected/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-kube-api-access-pmtc5\") pod \"cinder-3146-account-create-update-4r9mf\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:20 crc kubenswrapper[4939]: I0318 17:13:20.998424 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:21 crc kubenswrapper[4939]: I0318 17:13:21.349441 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xtc45"] Mar 18 17:13:21 crc kubenswrapper[4939]: I0318 17:13:21.512086 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3146-account-create-update-4r9mf"] Mar 18 17:13:22 crc kubenswrapper[4939]: I0318 17:13:22.158980 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1472528d-7a5a-497d-9413-6692a3f01ccd" path="/var/lib/kubelet/pods/1472528d-7a5a-497d-9413-6692a3f01ccd/volumes" Mar 18 17:13:22 crc kubenswrapper[4939]: I0318 17:13:22.312802 4939 generic.go:334] "Generic (PLEG): container finished" podID="ba7d0043-4f48-4076-a59b-eaaa2bf6006e" containerID="7923719ff9f60dcf746a168fb84140e39f7215ef92dc9c77fafc58aac43b54a9" exitCode=0 Mar 18 17:13:22 crc kubenswrapper[4939]: I0318 17:13:22.312901 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xtc45" event={"ID":"ba7d0043-4f48-4076-a59b-eaaa2bf6006e","Type":"ContainerDied","Data":"7923719ff9f60dcf746a168fb84140e39f7215ef92dc9c77fafc58aac43b54a9"} Mar 18 17:13:22 crc kubenswrapper[4939]: I0318 17:13:22.312939 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xtc45" event={"ID":"ba7d0043-4f48-4076-a59b-eaaa2bf6006e","Type":"ContainerStarted","Data":"b1150ca002646760c5f40945291f6d0c4ad83131505a92eb83954621b7c80639"} Mar 18 17:13:22 crc kubenswrapper[4939]: I0318 17:13:22.316600 4939 generic.go:334] "Generic (PLEG): container finished" podID="f68a3156-e5ae-4a0f-94b3-93c16bde7cca" containerID="1bfb17ad01275f9d0be2fbc08c33ce556e0d5c10b5c83ec2f38e1350cefc5665" exitCode=0 Mar 18 17:13:22 crc kubenswrapper[4939]: I0318 17:13:22.316657 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3146-account-create-update-4r9mf" event={"ID":"f68a3156-e5ae-4a0f-94b3-93c16bde7cca","Type":"ContainerDied","Data":"1bfb17ad01275f9d0be2fbc08c33ce556e0d5c10b5c83ec2f38e1350cefc5665"} Mar 18 17:13:22 crc kubenswrapper[4939]: I0318 17:13:22.316689 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3146-account-create-update-4r9mf" event={"ID":"f68a3156-e5ae-4a0f-94b3-93c16bde7cca","Type":"ContainerStarted","Data":"e35f77552a7630ad652c0f1c87af7d326fcf7df7c11eb09106ef5a5ec9302a1f"} Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.737667 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.784384 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.878237 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8sb5\" (UniqueName: \"kubernetes.io/projected/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-kube-api-access-d8sb5\") pod \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\" (UID: \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.878407 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-operator-scripts\") pod \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\" (UID: \"ba7d0043-4f48-4076-a59b-eaaa2bf6006e\") " Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.878903 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba7d0043-4f48-4076-a59b-eaaa2bf6006e" (UID: "ba7d0043-4f48-4076-a59b-eaaa2bf6006e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.883711 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-kube-api-access-d8sb5" (OuterVolumeSpecName: "kube-api-access-d8sb5") pod "ba7d0043-4f48-4076-a59b-eaaa2bf6006e" (UID: "ba7d0043-4f48-4076-a59b-eaaa2bf6006e"). InnerVolumeSpecName "kube-api-access-d8sb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.979681 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-operator-scripts\") pod \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.979938 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtc5\" (UniqueName: \"kubernetes.io/projected/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-kube-api-access-pmtc5\") pod \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\" (UID: \"f68a3156-e5ae-4a0f-94b3-93c16bde7cca\") " Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.980886 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.980936 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8sb5\" (UniqueName: \"kubernetes.io/projected/ba7d0043-4f48-4076-a59b-eaaa2bf6006e-kube-api-access-d8sb5\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.981336 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f68a3156-e5ae-4a0f-94b3-93c16bde7cca" (UID: "f68a3156-e5ae-4a0f-94b3-93c16bde7cca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:13:23 crc kubenswrapper[4939]: I0318 17:13:23.986093 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-kube-api-access-pmtc5" (OuterVolumeSpecName: "kube-api-access-pmtc5") pod "f68a3156-e5ae-4a0f-94b3-93c16bde7cca" (UID: "f68a3156-e5ae-4a0f-94b3-93c16bde7cca"). InnerVolumeSpecName "kube-api-access-pmtc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.082718 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtc5\" (UniqueName: \"kubernetes.io/projected/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-kube-api-access-pmtc5\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.082757 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68a3156-e5ae-4a0f-94b3-93c16bde7cca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.337613 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3146-account-create-update-4r9mf" event={"ID":"f68a3156-e5ae-4a0f-94b3-93c16bde7cca","Type":"ContainerDied","Data":"e35f77552a7630ad652c0f1c87af7d326fcf7df7c11eb09106ef5a5ec9302a1f"} Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.337687 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e35f77552a7630ad652c0f1c87af7d326fcf7df7c11eb09106ef5a5ec9302a1f" Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.337632 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3146-account-create-update-4r9mf" Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.339539 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xtc45" event={"ID":"ba7d0043-4f48-4076-a59b-eaaa2bf6006e","Type":"ContainerDied","Data":"b1150ca002646760c5f40945291f6d0c4ad83131505a92eb83954621b7c80639"} Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.339578 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1150ca002646760c5f40945291f6d0c4ad83131505a92eb83954621b7c80639" Mar 18 17:13:24 crc kubenswrapper[4939]: I0318 17:13:24.339727 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xtc45" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.978432 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-q6dxn"] Mar 18 17:13:25 crc kubenswrapper[4939]: E0318 17:13:25.979025 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7d0043-4f48-4076-a59b-eaaa2bf6006e" containerName="mariadb-database-create" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.979044 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7d0043-4f48-4076-a59b-eaaa2bf6006e" containerName="mariadb-database-create" Mar 18 17:13:25 crc kubenswrapper[4939]: E0318 17:13:25.979067 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68a3156-e5ae-4a0f-94b3-93c16bde7cca" containerName="mariadb-account-create-update" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.979076 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68a3156-e5ae-4a0f-94b3-93c16bde7cca" containerName="mariadb-account-create-update" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.979307 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7d0043-4f48-4076-a59b-eaaa2bf6006e" containerName="mariadb-database-create" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.979328 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68a3156-e5ae-4a0f-94b3-93c16bde7cca" containerName="mariadb-account-create-update" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.980150 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.982577 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x4x6g" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.983123 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.985556 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 17:13:25 crc kubenswrapper[4939]: I0318 17:13:25.988061 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-q6dxn"] Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.021484 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-combined-ca-bundle\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.021649 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f97cae-5340-48f6-9a5c-663d16f1746b-etc-machine-id\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.021714 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-db-sync-config-data\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 
17:13:26.021738 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-scripts\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.021774 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqtn\" (UniqueName: \"kubernetes.io/projected/e7f97cae-5340-48f6-9a5c-663d16f1746b-kube-api-access-lzqtn\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.022008 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-config-data\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.124217 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-scripts\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.124477 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-db-sync-config-data\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.124531 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzqtn\" (UniqueName: \"kubernetes.io/projected/e7f97cae-5340-48f6-9a5c-663d16f1746b-kube-api-access-lzqtn\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.124571 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-config-data\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.124598 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-combined-ca-bundle\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.124664 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f97cae-5340-48f6-9a5c-663d16f1746b-etc-machine-id\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.124750 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e7f97cae-5340-48f6-9a5c-663d16f1746b-etc-machine-id\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.128908 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-scripts\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.129401 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-config-data\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.129710 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-combined-ca-bundle\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.132155 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-db-sync-config-data\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.142673 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzqtn\" (UniqueName: \"kubernetes.io/projected/e7f97cae-5340-48f6-9a5c-663d16f1746b-kube-api-access-lzqtn\") pod \"cinder-db-sync-q6dxn\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") " pod="openstack/cinder-db-sync-q6dxn" Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.302991 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-q6dxn"
Mar 18 17:13:26 crc kubenswrapper[4939]: I0318 17:13:26.894814 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-q6dxn"]
Mar 18 17:13:27 crc kubenswrapper[4939]: I0318 17:13:27.369622 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6dxn" event={"ID":"e7f97cae-5340-48f6-9a5c-663d16f1746b","Type":"ContainerStarted","Data":"e4999345ac6fcad3a2dc1afb145f889560797ea554174537c3f85e50b6a21dda"}
Mar 18 17:13:28 crc kubenswrapper[4939]: I0318 17:13:28.379598 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6dxn" event={"ID":"e7f97cae-5340-48f6-9a5c-663d16f1746b","Type":"ContainerStarted","Data":"ae6ad071f58af9dd704f774ce0a135cfeeda0778afb73956f6d111665323604e"}
Mar 18 17:13:28 crc kubenswrapper[4939]: I0318 17:13:28.415382 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-q6dxn" podStartSLOduration=3.415353068 podStartE2EDuration="3.415353068s" podCreationTimestamp="2026-03-18 17:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:28.396192834 +0000 UTC m=+5772.995380465" watchObservedRunningTime="2026-03-18 17:13:28.415353068 +0000 UTC m=+5773.014540729"
Mar 18 17:13:30 crc kubenswrapper[4939]: I0318 17:13:30.396436 4939 generic.go:334] "Generic (PLEG): container finished" podID="e7f97cae-5340-48f6-9a5c-663d16f1746b" containerID="ae6ad071f58af9dd704f774ce0a135cfeeda0778afb73956f6d111665323604e" exitCode=0
Mar 18 17:13:30 crc kubenswrapper[4939]: I0318 17:13:30.396486 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6dxn" event={"ID":"e7f97cae-5340-48f6-9a5c-663d16f1746b","Type":"ContainerDied","Data":"ae6ad071f58af9dd704f774ce0a135cfeeda0778afb73956f6d111665323604e"}
Mar 18 17:13:31 crc kubenswrapper[4939]: I0318 17:13:31.867681 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q6dxn"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.031141 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-combined-ca-bundle\") pod \"e7f97cae-5340-48f6-9a5c-663d16f1746b\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") "
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.031200 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-config-data\") pod \"e7f97cae-5340-48f6-9a5c-663d16f1746b\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") "
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.031251 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzqtn\" (UniqueName: \"kubernetes.io/projected/e7f97cae-5340-48f6-9a5c-663d16f1746b-kube-api-access-lzqtn\") pod \"e7f97cae-5340-48f6-9a5c-663d16f1746b\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") "
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.031333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f97cae-5340-48f6-9a5c-663d16f1746b-etc-machine-id\") pod \"e7f97cae-5340-48f6-9a5c-663d16f1746b\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") "
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.031469 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-scripts\") pod \"e7f97cae-5340-48f6-9a5c-663d16f1746b\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") "
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.031500 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-db-sync-config-data\") pod \"e7f97cae-5340-48f6-9a5c-663d16f1746b\" (UID: \"e7f97cae-5340-48f6-9a5c-663d16f1746b\") "
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.031537 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f97cae-5340-48f6-9a5c-663d16f1746b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e7f97cae-5340-48f6-9a5c-663d16f1746b" (UID: "e7f97cae-5340-48f6-9a5c-663d16f1746b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.032497 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f97cae-5340-48f6-9a5c-663d16f1746b-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.036543 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e7f97cae-5340-48f6-9a5c-663d16f1746b" (UID: "e7f97cae-5340-48f6-9a5c-663d16f1746b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.038664 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-scripts" (OuterVolumeSpecName: "scripts") pod "e7f97cae-5340-48f6-9a5c-663d16f1746b" (UID: "e7f97cae-5340-48f6-9a5c-663d16f1746b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.040057 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f97cae-5340-48f6-9a5c-663d16f1746b-kube-api-access-lzqtn" (OuterVolumeSpecName: "kube-api-access-lzqtn") pod "e7f97cae-5340-48f6-9a5c-663d16f1746b" (UID: "e7f97cae-5340-48f6-9a5c-663d16f1746b"). InnerVolumeSpecName "kube-api-access-lzqtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.064144 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7f97cae-5340-48f6-9a5c-663d16f1746b" (UID: "e7f97cae-5340-48f6-9a5c-663d16f1746b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.076128 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-config-data" (OuterVolumeSpecName: "config-data") pod "e7f97cae-5340-48f6-9a5c-663d16f1746b" (UID: "e7f97cae-5340-48f6-9a5c-663d16f1746b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.143241 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.143277 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.143291 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzqtn\" (UniqueName: \"kubernetes.io/projected/e7f97cae-5340-48f6-9a5c-663d16f1746b-kube-api-access-lzqtn\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.143304 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.143319 4939 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e7f97cae-5340-48f6-9a5c-663d16f1746b-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.422455 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6dxn" event={"ID":"e7f97cae-5340-48f6-9a5c-663d16f1746b","Type":"ContainerDied","Data":"e4999345ac6fcad3a2dc1afb145f889560797ea554174537c3f85e50b6a21dda"}
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.422518 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4999345ac6fcad3a2dc1afb145f889560797ea554174537c3f85e50b6a21dda"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.422564 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q6dxn"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.818583 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"]
Mar 18 17:13:32 crc kubenswrapper[4939]: E0318 17:13:32.819143 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f97cae-5340-48f6-9a5c-663d16f1746b" containerName="cinder-db-sync"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.819209 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f97cae-5340-48f6-9a5c-663d16f1746b" containerName="cinder-db-sync"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.819430 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f97cae-5340-48f6-9a5c-663d16f1746b" containerName="cinder-db-sync"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.820391 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.841207 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"]
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.854162 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.859093 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ddfm\" (UniqueName: \"kubernetes.io/projected/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-kube-api-access-6ddfm\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.859403 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-dns-svc\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.859665 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.859751 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.961797 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.961856 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ddfm\" (UniqueName: \"kubernetes.io/projected/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-kube-api-access-6ddfm\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.961924 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-dns-svc\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.961989 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.962011 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.962974 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.963215 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.963547 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.964469 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-dns-svc\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:32 crc kubenswrapper[4939]: I0318 17:13:32.993974 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ddfm\" (UniqueName: \"kubernetes.io/projected/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-kube-api-access-6ddfm\") pod \"dnsmasq-dns-6fd99f9cb5-bgw4m\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.029786 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.035595 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.039369 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x4x6g"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.039577 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.039752 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.039918 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.041204 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.066232 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmhv\" (UniqueName: \"kubernetes.io/projected/7c90e5c3-261f-4fe8-906a-4389d754d057-kube-api-access-sxmhv\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.066316 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data-custom\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.066385 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c90e5c3-261f-4fe8-906a-4389d754d057-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.066406 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c90e5c3-261f-4fe8-906a-4389d754d057-logs\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.066485 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-scripts\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.066526 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.066567 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.152192 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.170797 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c90e5c3-261f-4fe8-906a-4389d754d057-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.170841 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c90e5c3-261f-4fe8-906a-4389d754d057-logs\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.170924 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-scripts\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.170944 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.170973 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.171009 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmhv\" (UniqueName: \"kubernetes.io/projected/7c90e5c3-261f-4fe8-906a-4389d754d057-kube-api-access-sxmhv\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.171027 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data-custom\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.171179 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c90e5c3-261f-4fe8-906a-4389d754d057-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.172583 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c90e5c3-261f-4fe8-906a-4389d754d057-logs\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.175021 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-scripts\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.175966 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.176268 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data-custom\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.184436 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.187923 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmhv\" (UniqueName: \"kubernetes.io/projected/7c90e5c3-261f-4fe8-906a-4389d754d057-kube-api-access-sxmhv\") pod \"cinder-api-0\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.362477 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.654425 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"]
Mar 18 17:13:33 crc kubenswrapper[4939]: I0318 17:13:33.920350 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 17:13:34 crc kubenswrapper[4939]: I0318 17:13:34.457557 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerID="e99f588261f57c808a19354a45cb0b37dc480d3b7d3686ca738acd776f7504a1" exitCode=0
Mar 18 17:13:34 crc kubenswrapper[4939]: I0318 17:13:34.457608 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" event={"ID":"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430","Type":"ContainerDied","Data":"e99f588261f57c808a19354a45cb0b37dc480d3b7d3686ca738acd776f7504a1"}
Mar 18 17:13:34 crc kubenswrapper[4939]: I0318 17:13:34.457890 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" event={"ID":"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430","Type":"ContainerStarted","Data":"94e6d1935c84bfda1a36aecca5de8566c5389a6029bc43206500c56d35e56f73"}
Mar 18 17:13:34 crc kubenswrapper[4939]: I0318 17:13:34.464067 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7c90e5c3-261f-4fe8-906a-4389d754d057","Type":"ContainerStarted","Data":"225a75057764e312d19d7d4ae9b51c682dc20ddf7ebfc1533ba2b8c0036ec5cc"}
Mar 18 17:13:35 crc kubenswrapper[4939]: I0318 17:13:35.473705 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" event={"ID":"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430","Type":"ContainerStarted","Data":"90970a2b79bba1907a80f9ebf7afc08ed943ddc14e5ac38ae20895156e686ea8"}
Mar 18 17:13:35 crc kubenswrapper[4939]: I0318 17:13:35.474671 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:35 crc kubenswrapper[4939]: I0318 17:13:35.477136 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7c90e5c3-261f-4fe8-906a-4389d754d057","Type":"ContainerStarted","Data":"15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79"}
Mar 18 17:13:35 crc kubenswrapper[4939]: I0318 17:13:35.477165 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7c90e5c3-261f-4fe8-906a-4389d754d057","Type":"ContainerStarted","Data":"3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9"}
Mar 18 17:13:35 crc kubenswrapper[4939]: I0318 17:13:35.477405 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 18 17:13:35 crc kubenswrapper[4939]: I0318 17:13:35.493489 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" podStartSLOduration=3.49346889 podStartE2EDuration="3.49346889s" podCreationTimestamp="2026-03-18 17:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:35.491627607 +0000 UTC m=+5780.090815228" watchObservedRunningTime="2026-03-18 17:13:35.49346889 +0000 UTC m=+5780.092656511"
Mar 18 17:13:35 crc kubenswrapper[4939]: I0318 17:13:35.512812 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.512791388 podStartE2EDuration="2.512791388s" podCreationTimestamp="2026-03-18 17:13:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:35.510905114 +0000 UTC m=+5780.110092735" watchObservedRunningTime="2026-03-18 17:13:35.512791388 +0000 UTC m=+5780.111979009"
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.153933 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.234531 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-pgz2z"]
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.234825 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerName="dnsmasq-dns" containerID="cri-o://0bb4c7b32a3669beebea3c30ccce699b979889dedd9fa2732b675c8b5932fe6e" gracePeriod=10
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.577518 4939 generic.go:334] "Generic (PLEG): container finished" podID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerID="0bb4c7b32a3669beebea3c30ccce699b979889dedd9fa2732b675c8b5932fe6e" exitCode=0
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.577791 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" event={"ID":"20c77842-c27d-4a84-b24e-8cdc01f16920","Type":"ContainerDied","Data":"0bb4c7b32a3669beebea3c30ccce699b979889dedd9fa2732b675c8b5932fe6e"}
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.719416 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z"
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.893331 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-sb\") pod \"20c77842-c27d-4a84-b24e-8cdc01f16920\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") "
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.893455 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-config\") pod \"20c77842-c27d-4a84-b24e-8cdc01f16920\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") "
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.893562 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-nb\") pod \"20c77842-c27d-4a84-b24e-8cdc01f16920\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") "
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.893612 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49g5t\" (UniqueName: \"kubernetes.io/projected/20c77842-c27d-4a84-b24e-8cdc01f16920-kube-api-access-49g5t\") pod \"20c77842-c27d-4a84-b24e-8cdc01f16920\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") "
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.893638 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-dns-svc\") pod \"20c77842-c27d-4a84-b24e-8cdc01f16920\" (UID: \"20c77842-c27d-4a84-b24e-8cdc01f16920\") "
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.905748 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c77842-c27d-4a84-b24e-8cdc01f16920-kube-api-access-49g5t" (OuterVolumeSpecName: "kube-api-access-49g5t") pod "20c77842-c27d-4a84-b24e-8cdc01f16920" (UID: "20c77842-c27d-4a84-b24e-8cdc01f16920"). InnerVolumeSpecName "kube-api-access-49g5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.936362 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "20c77842-c27d-4a84-b24e-8cdc01f16920" (UID: "20c77842-c27d-4a84-b24e-8cdc01f16920"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.948179 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "20c77842-c27d-4a84-b24e-8cdc01f16920" (UID: "20c77842-c27d-4a84-b24e-8cdc01f16920"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.949788 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "20c77842-c27d-4a84-b24e-8cdc01f16920" (UID: "20c77842-c27d-4a84-b24e-8cdc01f16920"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.971176 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-config" (OuterVolumeSpecName: "config") pod "20c77842-c27d-4a84-b24e-8cdc01f16920" (UID: "20c77842-c27d-4a84-b24e-8cdc01f16920"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.997191 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-config\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.997225 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.997239 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49g5t\" (UniqueName: \"kubernetes.io/projected/20c77842-c27d-4a84-b24e-8cdc01f16920-kube-api-access-49g5t\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.997257 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:43 crc kubenswrapper[4939]: I0318 17:13:43.997271 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c77842-c27d-4a84-b24e-8cdc01f16920-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.509086 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.509428 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-log" containerID="cri-o://ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397" gracePeriod=30
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.509446 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-api" containerID="cri-o://7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f" gracePeriod=30
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.517732 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.517954 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" containerName="nova-scheduler-scheduler" containerID="cri-o://19d5e890462ad4fc1ea3ff511c7e01c77cad68264d5221a83927a56e1e2fdd37" gracePeriod=30
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.608222 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.608811 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c9eee740-5028-41c2-b9ba-1c18218d131e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0" gracePeriod=30
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.609635 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" event={"ID":"20c77842-c27d-4a84-b24e-8cdc01f16920","Type":"ContainerDied","Data":"2225897e71b12d4fc9c7f92fa50b9ace53b1d3571cfce0ff8516ae0e34626413"}
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.609715 4939 scope.go:117] "RemoveContainer" containerID="0bb4c7b32a3669beebea3c30ccce699b979889dedd9fa2732b675c8b5932fe6e"
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.609850 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z"
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.622369 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.622937 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-log" containerID="cri-o://295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588" gracePeriod=30
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.623118 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-metadata" containerID="cri-o://5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21" gracePeriod=30
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.636589 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.636860 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="69109a3f-f7d3-48db-a867-813bc5f6929d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3" gracePeriod=30
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.644898 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-pgz2z"]
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.653121 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-pgz2z"]
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.682682 4939 scope.go:117] "RemoveContainer" containerID="1f8c7ccc7babcae303b6fd4a46b65851b8735261842d6bf7a1e5bd1c894848b5"
Mar 18 17:13:44 crc kubenswrapper[4939]: I0318 17:13:44.901242 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="c9eee740-5028-41c2-b9ba-1c18218d131e" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.101:6080/vnc_lite.html\": dial tcp 10.217.1.101:6080: connect: connection refused"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.299730 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.447935 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.550059 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzs62\" (UniqueName: \"kubernetes.io/projected/c9eee740-5028-41c2-b9ba-1c18218d131e-kube-api-access-mzs62\") pod \"c9eee740-5028-41c2-b9ba-1c18218d131e\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") "
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.550123 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle\") pod \"c9eee740-5028-41c2-b9ba-1c18218d131e\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") "
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.550234 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-config-data\") pod \"c9eee740-5028-41c2-b9ba-1c18218d131e\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") "
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.573968 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9eee740-5028-41c2-b9ba-1c18218d131e-kube-api-access-mzs62" (OuterVolumeSpecName: "kube-api-access-mzs62") pod "c9eee740-5028-41c2-b9ba-1c18218d131e" (UID: "c9eee740-5028-41c2-b9ba-1c18218d131e"). InnerVolumeSpecName "kube-api-access-mzs62". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.651785 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzs62\" (UniqueName: \"kubernetes.io/projected/c9eee740-5028-41c2-b9ba-1c18218d131e-kube-api-access-mzs62\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:45 crc kubenswrapper[4939]: E0318 17:13:45.693734 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle podName:c9eee740-5028-41c2-b9ba-1c18218d131e nodeName:}" failed. No retries permitted until 2026-03-18 17:13:46.19370801 +0000 UTC m=+5790.792895631 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle") pod "c9eee740-5028-41c2-b9ba-1c18218d131e" (UID: "c9eee740-5028-41c2-b9ba-1c18218d131e") : error deleting /var/lib/kubelet/pods/c9eee740-5028-41c2-b9ba-1c18218d131e/volume-subpaths: remove /var/lib/kubelet/pods/c9eee740-5028-41c2-b9ba-1c18218d131e/volume-subpaths: no such file or directory
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.708648 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-config-data" (OuterVolumeSpecName: "config-data") pod "c9eee740-5028-41c2-b9ba-1c18218d131e" (UID: "c9eee740-5028-41c2-b9ba-1c18218d131e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.740048 4939 generic.go:334] "Generic (PLEG): container finished" podID="c9eee740-5028-41c2-b9ba-1c18218d131e" containerID="e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0" exitCode=0
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.740147 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.740165 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c9eee740-5028-41c2-b9ba-1c18218d131e","Type":"ContainerDied","Data":"e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0"}
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.740199 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c9eee740-5028-41c2-b9ba-1c18218d131e","Type":"ContainerDied","Data":"e113c2fe754a54107b05513dae8c8ac8bcf397d92c09bce8f42ee7465b30500c"}
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.740217 4939 scope.go:117] "RemoveContainer" containerID="e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.755171 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.757861 4939 generic.go:334] "Generic (PLEG): container finished" podID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerID="295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588" exitCode=143
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.757953 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9518de39-97b8-419b-9ddf-f8c052c695d5","Type":"ContainerDied","Data":"295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588"}
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.767168 4939 generic.go:334] "Generic (PLEG): container finished" podID="fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" containerID="19d5e890462ad4fc1ea3ff511c7e01c77cad68264d5221a83927a56e1e2fdd37" exitCode=0
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.767267 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6","Type":"ContainerDied","Data":"19d5e890462ad4fc1ea3ff511c7e01c77cad68264d5221a83927a56e1e2fdd37"}
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.768536 4939 generic.go:334] "Generic (PLEG): container finished" podID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerID="ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397" exitCode=143
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.768568 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45d9258f-7237-41ad-bee3-dcdbf6056b35","Type":"ContainerDied","Data":"ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397"}
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.771541 4939 scope.go:117] "RemoveContainer" containerID="e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0"
Mar 18 17:13:45 crc kubenswrapper[4939]: E0318 17:13:45.771859 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0\": container with ID starting with e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0 not found: ID does not exist" containerID="e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.771899 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0"} err="failed to get container status \"e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0\": rpc error: code = NotFound desc = could not find container \"e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0\": container with ID starting with e1e87c24be21a934407ee5373829ceb0785cf4f700bdd19b791255fc08f0c0d0 not found: ID does not exist"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.798269 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.958420 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-combined-ca-bundle\") pod \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") "
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.958476 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxpw9\" (UniqueName: \"kubernetes.io/projected/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-kube-api-access-pxpw9\") pod \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") "
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.958701 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-config-data\") pod \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\" (UID: \"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6\") "
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.963140 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-kube-api-access-pxpw9" (OuterVolumeSpecName: "kube-api-access-pxpw9") pod "fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" (UID: "fa3e625d-ea3f-422e-b4d1-3967a61bb0d6"). InnerVolumeSpecName "kube-api-access-pxpw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.986235 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" (UID: "fa3e625d-ea3f-422e-b4d1-3967a61bb0d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:45 crc kubenswrapper[4939]: I0318 17:13:45.987536 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-config-data" (OuterVolumeSpecName: "config-data") pod "fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" (UID: "fa3e625d-ea3f-422e-b4d1-3967a61bb0d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.060708 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.060741 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxpw9\" (UniqueName: \"kubernetes.io/projected/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-kube-api-access-pxpw9\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.060752 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.146987 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" path="/var/lib/kubelet/pods/20c77842-c27d-4a84-b24e-8cdc01f16920/volumes"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.284952 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle\") pod \"c9eee740-5028-41c2-b9ba-1c18218d131e\" (UID: \"c9eee740-5028-41c2-b9ba-1c18218d131e\") "
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.292011 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9eee740-5028-41c2-b9ba-1c18218d131e" (UID: "c9eee740-5028-41c2-b9ba-1c18218d131e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.387717 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eee740-5028-41c2-b9ba-1c18218d131e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.402074 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.440562 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.449909 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 17:13:46 crc kubenswrapper[4939]: E0318 17:13:46.450310 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerName="init"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.450326 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerName="init"
Mar 18 17:13:46 crc kubenswrapper[4939]: E0318 17:13:46.450340 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9eee740-5028-41c2-b9ba-1c18218d131e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.450346 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9eee740-5028-41c2-b9ba-1c18218d131e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 17:13:46 crc kubenswrapper[4939]: E0318 17:13:46.450366 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" containerName="nova-scheduler-scheduler"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.450373 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" containerName="nova-scheduler-scheduler"
Mar 18 17:13:46 crc kubenswrapper[4939]: E0318 17:13:46.450391 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerName="dnsmasq-dns"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.450397 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerName="dnsmasq-dns"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.450578 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerName="dnsmasq-dns"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.450599 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" containerName="nova-scheduler-scheduler"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.450612 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9eee740-5028-41c2-b9ba-1c18218d131e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.451344 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.454176 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.460066 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.479992 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.590397 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-config-data\") pod \"69109a3f-f7d3-48db-a867-813bc5f6929d\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") "
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.590551 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wd8\" (UniqueName: \"kubernetes.io/projected/69109a3f-f7d3-48db-a867-813bc5f6929d-kube-api-access-72wd8\") pod \"69109a3f-f7d3-48db-a867-813bc5f6929d\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") "
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.590633 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-combined-ca-bundle\") pod \"69109a3f-f7d3-48db-a867-813bc5f6929d\" (UID: \"69109a3f-f7d3-48db-a867-813bc5f6929d\") "
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.590915 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e6d4eb-edae-4e88-915c-3e83db439676-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.591470 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e6d4eb-edae-4e88-915c-3e83db439676-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.591668 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rpxm\" (UniqueName: \"kubernetes.io/projected/09e6d4eb-edae-4e88-915c-3e83db439676-kube-api-access-9rpxm\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.595076 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69109a3f-f7d3-48db-a867-813bc5f6929d-kube-api-access-72wd8" (OuterVolumeSpecName: "kube-api-access-72wd8") pod "69109a3f-f7d3-48db-a867-813bc5f6929d" (UID: "69109a3f-f7d3-48db-a867-813bc5f6929d"). InnerVolumeSpecName "kube-api-access-72wd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.620694 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-config-data" (OuterVolumeSpecName: "config-data") pod "69109a3f-f7d3-48db-a867-813bc5f6929d" (UID: "69109a3f-f7d3-48db-a867-813bc5f6929d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.637120 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69109a3f-f7d3-48db-a867-813bc5f6929d" (UID: "69109a3f-f7d3-48db-a867-813bc5f6929d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.694277 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e6d4eb-edae-4e88-915c-3e83db439676-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.694351 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rpxm\" (UniqueName: \"kubernetes.io/projected/09e6d4eb-edae-4e88-915c-3e83db439676-kube-api-access-9rpxm\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.694412 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e6d4eb-edae-4e88-915c-3e83db439676-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.694576 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.694593 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wd8\" (UniqueName: \"kubernetes.io/projected/69109a3f-f7d3-48db-a867-813bc5f6929d-kube-api-access-72wd8\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.694952 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69109a3f-f7d3-48db-a867-813bc5f6929d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.697959 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e6d4eb-edae-4e88-915c-3e83db439676-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.699221 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e6d4eb-edae-4e88-915c-3e83db439676-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.722342 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rpxm\" (UniqueName: \"kubernetes.io/projected/09e6d4eb-edae-4e88-915c-3e83db439676-kube-api-access-9rpxm\") pod \"nova-cell1-novncproxy-0\" (UID: \"09e6d4eb-edae-4e88-915c-3e83db439676\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.779283 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa3e625d-ea3f-422e-b4d1-3967a61bb0d6","Type":"ContainerDied","Data":"c9b49715b920173953669f186f7f5be3175db31848dfa5cedf9e0c65ce627c93"}
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.779327 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.779338 4939 scope.go:117] "RemoveContainer" containerID="19d5e890462ad4fc1ea3ff511c7e01c77cad68264d5221a83927a56e1e2fdd37"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.780838 4939 generic.go:334] "Generic (PLEG): container finished" podID="69109a3f-f7d3-48db-a867-813bc5f6929d" containerID="31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3" exitCode=0
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.780937 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.780949 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69109a3f-f7d3-48db-a867-813bc5f6929d","Type":"ContainerDied","Data":"31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3"}
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.780970 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69109a3f-f7d3-48db-a867-813bc5f6929d","Type":"ContainerDied","Data":"cc24155d4be65771578682e48f633296813ab9f9e255709d5b978d4135ec28b9"}
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.803536 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.807072 4939 scope.go:117] "RemoveContainer" containerID="31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.823820 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.842659 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.854691 4939 scope.go:117] "RemoveContainer" containerID="31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3"
Mar 18 17:13:46 crc kubenswrapper[4939]: E0318 17:13:46.855064 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3\": container with ID starting with 31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3 not found: ID does not exist" containerID="31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.855099 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3"} err="failed to get container status \"31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3\": rpc error: code = NotFound desc = could not find container \"31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3\": container with ID starting with 31bc9e758f784b0de43ecbd1e73eda4eb4392e76067d07adc11d1e57009a65f3 not found: ID does not exist"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.865775 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 17:13:46 crc kubenswrapper[4939]: E0318 17:13:46.866229 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69109a3f-f7d3-48db-a867-813bc5f6929d" containerName="nova-cell1-conductor-conductor"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.866243 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="69109a3f-f7d3-48db-a867-813bc5f6929d" containerName="nova-cell1-conductor-conductor"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.866436 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="69109a3f-f7d3-48db-a867-813bc5f6929d" containerName="nova-cell1-conductor-conductor"
Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.867176 4939 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.876842 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.885580 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.896573 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.906300 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.935608 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.937117 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.940818 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 17:13:46 crc kubenswrapper[4939]: I0318 17:13:46.966650 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.015558 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.015643 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8nq4\" (UniqueName: \"kubernetes.io/projected/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-kube-api-access-z8nq4\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.015803 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-config-data\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.117796 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.117857 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.118231 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8nq4\" (UniqueName: \"kubernetes.io/projected/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-kube-api-access-z8nq4\") 
pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.118278 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.118369 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-config-data\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.118400 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9rvf\" (UniqueName: \"kubernetes.io/projected/ae755085-923a-4216-8b62-9ec03762749e-kube-api-access-n9rvf\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.126393 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-config-data\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.131624 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.146250 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8nq4\" (UniqueName: \"kubernetes.io/projected/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-kube-api-access-z8nq4\") pod \"nova-scheduler-0\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.198182 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.222892 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.223075 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9rvf\" (UniqueName: \"kubernetes.io/projected/ae755085-923a-4216-8b62-9ec03762749e-kube-api-access-n9rvf\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.223223 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.227626 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.227691 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.241756 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9rvf\" (UniqueName: \"kubernetes.io/projected/ae755085-923a-4216-8b62-9ec03762749e-kube-api-access-n9rvf\") pod \"nova-cell1-conductor-0\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.273281 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.327749 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.640335 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.755447 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 17:13:47 crc kubenswrapper[4939]: W0318 17:13:47.755873 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae755085_923a_4216_8b62_9ec03762749e.slice/crio-91459c8f432d64649b3f3ed3fcf805371610d6fbf1600ea8c404fb1ead35ac92 WatchSource:0}: Error finding container 91459c8f432d64649b3f3ed3fcf805371610d6fbf1600ea8c404fb1ead35ac92: Status 404 returned error can't find the container with id 91459c8f432d64649b3f3ed3fcf805371610d6fbf1600ea8c404fb1ead35ac92 Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.797969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ae755085-923a-4216-8b62-9ec03762749e","Type":"ContainerStarted","Data":"91459c8f432d64649b3f3ed3fcf805371610d6fbf1600ea8c404fb1ead35ac92"} Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.799224 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9","Type":"ContainerStarted","Data":"80fb9f982fc60231dd237397c67c03f2edcfdcde88c70823a38db0d25e2c0716"} Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.801775 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"09e6d4eb-edae-4e88-915c-3e83db439676","Type":"ContainerStarted","Data":"0dc2639ce8c9ee12a92fe268b135bf3641e1361a3fe6816d51f392e41efb9b80"} Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.801821 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"09e6d4eb-edae-4e88-915c-3e83db439676","Type":"ContainerStarted","Data":"cc9fa131252ab0633a0d677c349d0d8e4c9de68430bd57e2ef2b67b2ce1dc325"} Mar 18 17:13:47 crc kubenswrapper[4939]: I0318 17:13:47.824318 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.824298662 podStartE2EDuration="1.824298662s" podCreationTimestamp="2026-03-18 17:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:47.819310571 +0000 UTC m=+5792.418498192" watchObservedRunningTime="2026-03-18 17:13:47.824298662 +0000 UTC m=+5792.423486283" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.018261 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.018843 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="750d2365-1615-47b7-a95d-178356744f89" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" gracePeriod=30 Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.152116 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="69109a3f-f7d3-48db-a867-813bc5f6929d" path="/var/lib/kubelet/pods/69109a3f-f7d3-48db-a867-813bc5f6929d/volumes" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.152734 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9eee740-5028-41c2-b9ba-1c18218d131e" path="/var/lib/kubelet/pods/c9eee740-5028-41c2-b9ba-1c18218d131e/volumes" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.153346 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3e625d-ea3f-422e-b4d1-3967a61bb0d6" path="/var/lib/kubelet/pods/fa3e625d-ea3f-422e-b4d1-3967a61bb0d6/volumes" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.185606 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.360695 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljzdt\" (UniqueName: \"kubernetes.io/projected/9518de39-97b8-419b-9ddf-f8c052c695d5-kube-api-access-ljzdt\") pod \"9518de39-97b8-419b-9ddf-f8c052c695d5\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.361109 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9518de39-97b8-419b-9ddf-f8c052c695d5-logs\") pod \"9518de39-97b8-419b-9ddf-f8c052c695d5\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.361233 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-combined-ca-bundle\") pod \"9518de39-97b8-419b-9ddf-f8c052c695d5\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.361310 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-config-data\") pod \"9518de39-97b8-419b-9ddf-f8c052c695d5\" (UID: \"9518de39-97b8-419b-9ddf-f8c052c695d5\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.363733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9518de39-97b8-419b-9ddf-f8c052c695d5-logs" (OuterVolumeSpecName: "logs") pod "9518de39-97b8-419b-9ddf-f8c052c695d5" (UID: "9518de39-97b8-419b-9ddf-f8c052c695d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.365558 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9518de39-97b8-419b-9ddf-f8c052c695d5-kube-api-access-ljzdt" (OuterVolumeSpecName: "kube-api-access-ljzdt") pod "9518de39-97b8-419b-9ddf-f8c052c695d5" (UID: "9518de39-97b8-419b-9ddf-f8c052c695d5"). InnerVolumeSpecName "kube-api-access-ljzdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.406740 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9518de39-97b8-419b-9ddf-f8c052c695d5" (UID: "9518de39-97b8-419b-9ddf-f8c052c695d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.414871 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-config-data" (OuterVolumeSpecName: "config-data") pod "9518de39-97b8-419b-9ddf-f8c052c695d5" (UID: "9518de39-97b8-419b-9ddf-f8c052c695d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.472781 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljzdt\" (UniqueName: \"kubernetes.io/projected/9518de39-97b8-419b-9ddf-f8c052c695d5-kube-api-access-ljzdt\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.472828 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9518de39-97b8-419b-9ddf-f8c052c695d5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.472843 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.472855 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9518de39-97b8-419b-9ddf-f8c052c695d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.484060 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.574211 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d9258f-7237-41ad-bee3-dcdbf6056b35-logs\") pod \"45d9258f-7237-41ad-bee3-dcdbf6056b35\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.574313 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-config-data\") pod \"45d9258f-7237-41ad-bee3-dcdbf6056b35\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.574409 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4qlg\" (UniqueName: \"kubernetes.io/projected/45d9258f-7237-41ad-bee3-dcdbf6056b35-kube-api-access-s4qlg\") pod \"45d9258f-7237-41ad-bee3-dcdbf6056b35\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.574522 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-combined-ca-bundle\") pod \"45d9258f-7237-41ad-bee3-dcdbf6056b35\" (UID: \"45d9258f-7237-41ad-bee3-dcdbf6056b35\") " Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.574686 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d9258f-7237-41ad-bee3-dcdbf6056b35-logs" (OuterVolumeSpecName: "logs") pod "45d9258f-7237-41ad-bee3-dcdbf6056b35" (UID: "45d9258f-7237-41ad-bee3-dcdbf6056b35"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.574919 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45d9258f-7237-41ad-bee3-dcdbf6056b35-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.580707 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d9258f-7237-41ad-bee3-dcdbf6056b35-kube-api-access-s4qlg" (OuterVolumeSpecName: "kube-api-access-s4qlg") pod "45d9258f-7237-41ad-bee3-dcdbf6056b35" (UID: "45d9258f-7237-41ad-bee3-dcdbf6056b35"). InnerVolumeSpecName "kube-api-access-s4qlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.610615 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45d9258f-7237-41ad-bee3-dcdbf6056b35" (UID: "45d9258f-7237-41ad-bee3-dcdbf6056b35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.610862 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-config-data" (OuterVolumeSpecName: "config-data") pod "45d9258f-7237-41ad-bee3-dcdbf6056b35" (UID: "45d9258f-7237-41ad-bee3-dcdbf6056b35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.650485 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.653801 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.666450 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.666545 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="750d2365-1615-47b7-a95d-178356744f89" containerName="nova-cell0-conductor-conductor" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.676492 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.676544 4939 
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.676555 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d9258f-7237-41ad-bee3-dcdbf6056b35-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.699633 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57958c8f89-pgz2z" podUID="20c77842-c27d-4a84-b24e-8cdc01f16920" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.113:5353: i/o timeout"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.820578 4939 generic.go:334] "Generic (PLEG): container finished" podID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerID="5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21" exitCode=0
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.820682 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9518de39-97b8-419b-9ddf-f8c052c695d5","Type":"ContainerDied","Data":"5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21"}
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.820713 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9518de39-97b8-419b-9ddf-f8c052c695d5","Type":"ContainerDied","Data":"41bd276f662d800c85e1e74a1e06b54940d4883828cbf7b6538446cc1dd0965b"}
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.820733 4939 scope.go:117] "RemoveContainer" containerID="5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.820731 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.827219 4939 generic.go:334] "Generic (PLEG): container finished" podID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerID="7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f" exitCode=0
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.827289 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45d9258f-7237-41ad-bee3-dcdbf6056b35","Type":"ContainerDied","Data":"7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f"}
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.827326 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45d9258f-7237-41ad-bee3-dcdbf6056b35","Type":"ContainerDied","Data":"6c6eece8c7f4ee12f5816fe37253cabcd012b61909e5d3231463a5b3f7282936"}
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.827474 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.837337 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ae755085-923a-4216-8b62-9ec03762749e","Type":"ContainerStarted","Data":"4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99"}
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.839852 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9","Type":"ContainerStarted","Data":"d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74"}
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.858048 4939 scope.go:117] "RemoveContainer" containerID="295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.877972 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.877949712 podStartE2EDuration="2.877949712s" podCreationTimestamp="2026-03-18 17:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:48.859673243 +0000 UTC m=+5793.458860864" watchObservedRunningTime="2026-03-18 17:13:48.877949712 +0000 UTC m=+5793.477137333"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.889540 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.900587 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.915582 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.924319 4939 scope.go:117] "RemoveContainer" containerID="5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.924464 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.925165 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21\": container with ID starting with 5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21 not found: ID does not exist" containerID="5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.925194 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21"} err="failed to get container status \"5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21\": rpc error: code = NotFound desc = could not find container \"5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21\": container with ID starting with 5767a8df403662f43a3208a0c44efa074f94d60bbfe69365abf52fc1ee19ff21 not found: ID does not exist"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.925217 4939 scope.go:117] "RemoveContainer" containerID="295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588"
Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.925658 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588\": container with ID starting with 295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588 not found: ID does not exist" containerID="295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.925691 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588"} err="failed to get container status \"295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588\": rpc error: code = NotFound desc = could not find container \"295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588\": container with ID starting with 295ec99863630299936f968f4b560390a1b000ececca150bebd4be9a0bada588 not found: ID does not exist"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.925708 4939 scope.go:117] "RemoveContainer" containerID="7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.936334 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.937144 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-log"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937174 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-log"
Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.937200 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-api"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937208 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-api"
Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.937243 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-metadata"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937252 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-metadata"
Mar 18 17:13:48 crc kubenswrapper[4939]: E0318 17:13:48.937280 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-log"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937291 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-log"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937614 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-api"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937634 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-metadata"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937649 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" containerName="nova-metadata-log"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.937664 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" containerName="nova-api-log"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.939258 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.945600 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.956466 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.961264 4939 scope.go:117] "RemoveContainer" containerID="ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.962715 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.967097 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.981970 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.982946 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-config-data\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.982987 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.983031 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjh4h\" (UniqueName: \"kubernetes.io/projected/7adbcc7d-84f9-47ff-85a2-ec56be833188-kube-api-access-pjh4h\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.983086 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.983121 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq2vk\" (UniqueName: \"kubernetes.io/projected/1541c8af-014c-447e-9fcc-ba6bfc0e8597-kube-api-access-cq2vk\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.983141 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-config-data\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.983161 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1541c8af-014c-447e-9fcc-ba6bfc0e8597-logs\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.983185 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7adbcc7d-84f9-47ff-85a2-ec56be833188-logs\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0"
Mar 18 17:13:48 crc kubenswrapper[4939]: I0318 17:13:48.988295 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.988270893 podStartE2EDuration="2.988270893s" podCreationTimestamp="2026-03-18 17:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:48.944756678 +0000 UTC m=+5793.543944319" watchObservedRunningTime="2026-03-18 17:13:48.988270893 +0000 UTC m=+5793.587458514"
Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.015848 4939 scope.go:117] "RemoveContainer" containerID="7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f"
Mar 18 17:13:49 crc kubenswrapper[4939]: E0318 17:13:49.019414 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f\": container with ID starting with 7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f not found: ID does not exist" containerID="7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f"
Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.019456 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f"} err="failed to get container status \"7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f\": rpc error: code = NotFound desc = could not find container \"7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f\": container with ID starting with 7e251c87cfedc49d3e387f7dbc303a7d368bd4b5a58b5cbfbc1d5025878baa4f not found: ID does not exist"
Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.019483 4939 scope.go:117] "RemoveContainer" containerID="ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397"
Mar 18 17:13:49 crc kubenswrapper[4939]: E0318 17:13:49.025811 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397\": container with ID starting with ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397 not found: ID does not exist" containerID="ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397"
Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.025848 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397"} err="failed to get container status \"ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397\": rpc error: code = NotFound desc = could not find container \"ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397\": container with ID starting with ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397 not found: ID does not exist"
\"ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397\": container with ID starting with ff6cec37776249c7b05779aa3d3dedb28621f73f882e92f66737c68721b51397 not found: ID does not exist" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.025887 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.086801 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjh4h\" (UniqueName: \"kubernetes.io/projected/7adbcc7d-84f9-47ff-85a2-ec56be833188-kube-api-access-pjh4h\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.086962 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.087055 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq2vk\" (UniqueName: \"kubernetes.io/projected/1541c8af-014c-447e-9fcc-ba6bfc0e8597-kube-api-access-cq2vk\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.087105 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-config-data\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.087128 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1541c8af-014c-447e-9fcc-ba6bfc0e8597-logs\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.087186 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7adbcc7d-84f9-47ff-85a2-ec56be833188-logs\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.087573 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-config-data\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.087611 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.088936 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7adbcc7d-84f9-47ff-85a2-ec56be833188-logs\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc 
kubenswrapper[4939]: I0318 17:13:49.089265 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1541c8af-014c-447e-9fcc-ba6bfc0e8597-logs\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.095297 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-config-data\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.098523 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-config-data\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.099111 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.107370 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.108903 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjh4h\" (UniqueName: \"kubernetes.io/projected/7adbcc7d-84f9-47ff-85a2-ec56be833188-kube-api-access-pjh4h\") pod \"nova-metadata-0\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.116292 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq2vk\" (UniqueName: \"kubernetes.io/projected/1541c8af-014c-447e-9fcc-ba6bfc0e8597-kube-api-access-cq2vk\") pod \"nova-api-0\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.297951 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.298662 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.819335 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.832210 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 17:13:49 crc kubenswrapper[4939]: W0318 17:13:49.840228 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7adbcc7d_84f9_47ff_85a2_ec56be833188.slice/crio-dc91d16ed7b1142ce2dddf3168f1155e379a527c5bd95b2903a83017b9843dd4 WatchSource:0}: Error finding container dc91d16ed7b1142ce2dddf3168f1155e379a527c5bd95b2903a83017b9843dd4: Status 404 returned error can't find the container with id dc91d16ed7b1142ce2dddf3168f1155e379a527c5bd95b2903a83017b9843dd4 Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.873049 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1541c8af-014c-447e-9fcc-ba6bfc0e8597","Type":"ContainerStarted","Data":"460894799a4ee49e0ab7856f7975a2136fe9ce175e90e2589093705a1270e906"} Mar 18 17:13:49 crc kubenswrapper[4939]: I0318 17:13:49.874762 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.143993 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d9258f-7237-41ad-bee3-dcdbf6056b35" path="/var/lib/kubelet/pods/45d9258f-7237-41ad-bee3-dcdbf6056b35/volumes" Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.145017 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9518de39-97b8-419b-9ddf-f8c052c695d5" path="/var/lib/kubelet/pods/9518de39-97b8-419b-9ddf-f8c052c695d5/volumes" Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.880770 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1541c8af-014c-447e-9fcc-ba6bfc0e8597","Type":"ContainerStarted","Data":"3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451"} Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.881085 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1541c8af-014c-447e-9fcc-ba6bfc0e8597","Type":"ContainerStarted","Data":"4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad"} Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.883719 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7adbcc7d-84f9-47ff-85a2-ec56be833188","Type":"ContainerStarted","Data":"d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397"} Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.883767 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7adbcc7d-84f9-47ff-85a2-ec56be833188","Type":"ContainerStarted","Data":"f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e"} Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.883778 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7adbcc7d-84f9-47ff-85a2-ec56be833188","Type":"ContainerStarted","Data":"dc91d16ed7b1142ce2dddf3168f1155e379a527c5bd95b2903a83017b9843dd4"} Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.908537 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.908493385 
podStartE2EDuration="2.908493385s" podCreationTimestamp="2026-03-18 17:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:50.897664868 +0000 UTC m=+5795.496852489" watchObservedRunningTime="2026-03-18 17:13:50.908493385 +0000 UTC m=+5795.507681006" Mar 18 17:13:50 crc kubenswrapper[4939]: I0318 17:13:50.932551 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.932529347 podStartE2EDuration="2.932529347s" podCreationTimestamp="2026-03-18 17:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:50.924394756 +0000 UTC m=+5795.523582377" watchObservedRunningTime="2026-03-18 17:13:50.932529347 +0000 UTC m=+5795.531716968" Mar 18 17:13:51 crc kubenswrapper[4939]: I0318 17:13:51.805119 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.198602 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.309292 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.813025 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.853593 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-combined-ca-bundle\") pod \"750d2365-1615-47b7-a95d-178356744f89\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.853678 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-config-data\") pod \"750d2365-1615-47b7-a95d-178356744f89\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.853779 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrxdh\" (UniqueName: \"kubernetes.io/projected/750d2365-1615-47b7-a95d-178356744f89-kube-api-access-vrxdh\") pod \"750d2365-1615-47b7-a95d-178356744f89\" (UID: \"750d2365-1615-47b7-a95d-178356744f89\") " Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.866733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750d2365-1615-47b7-a95d-178356744f89-kube-api-access-vrxdh" (OuterVolumeSpecName: "kube-api-access-vrxdh") pod "750d2365-1615-47b7-a95d-178356744f89" (UID: "750d2365-1615-47b7-a95d-178356744f89"). InnerVolumeSpecName "kube-api-access-vrxdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.882365 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "750d2365-1615-47b7-a95d-178356744f89" (UID: "750d2365-1615-47b7-a95d-178356744f89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.882947 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-config-data" (OuterVolumeSpecName: "config-data") pod "750d2365-1615-47b7-a95d-178356744f89" (UID: "750d2365-1615-47b7-a95d-178356744f89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.898790 4939 generic.go:334] "Generic (PLEG): container finished" podID="750d2365-1615-47b7-a95d-178356744f89" containerID="3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" exitCode=0 Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.898828 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"750d2365-1615-47b7-a95d-178356744f89","Type":"ContainerDied","Data":"3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77"} Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.898853 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"750d2365-1615-47b7-a95d-178356744f89","Type":"ContainerDied","Data":"8fc21185668beb436ce40b90f59d0ad4d27f16ff52a8b9dd03aca6dbb46e6f07"} Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.898870 4939 scope.go:117] "RemoveContainer" containerID="3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.898977 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.952024 4939 scope.go:117] "RemoveContainer" containerID="3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" Mar 18 17:13:52 crc kubenswrapper[4939]: E0318 17:13:52.952786 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77\": container with ID starting with 3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77 not found: ID does not exist" containerID="3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.952853 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77"} err="failed to get container status \"3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77\": rpc error: code = NotFound desc = could not find container \"3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77\": container with ID starting with 3ac54d29f36d20f34e6b87d5eb1b0db1aabfc4a85c09f696c33aa6b713788a77 not found: ID does not exist" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.954901 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrxdh\" (UniqueName: \"kubernetes.io/projected/750d2365-1615-47b7-a95d-178356744f89-kube-api-access-vrxdh\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.954935 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 
17:13:52.954947 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750d2365-1615-47b7-a95d-178356744f89-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.971795 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:13:52 crc kubenswrapper[4939]: I0318 17:13:52.984464 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.000447 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:13:53 crc kubenswrapper[4939]: E0318 17:13:53.000873 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750d2365-1615-47b7-a95d-178356744f89" containerName="nova-cell0-conductor-conductor" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.000886 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="750d2365-1615-47b7-a95d-178356744f89" containerName="nova-cell0-conductor-conductor" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.001101 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="750d2365-1615-47b7-a95d-178356744f89" containerName="nova-cell0-conductor-conductor" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.001700 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.001778 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.023585 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.056461 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.056528 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.056619 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xwz\" (UniqueName: \"kubernetes.io/projected/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-kube-api-access-l6xwz\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.157817 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.158016 4939 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.158131 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xwz\" (UniqueName: \"kubernetes.io/projected/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-kube-api-access-l6xwz\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.161996 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.165061 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.174379 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xwz\" (UniqueName: \"kubernetes.io/projected/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-kube-api-access-l6xwz\") pod \"nova-cell0-conductor-0\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.349103 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.835293 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 17:13:53 crc kubenswrapper[4939]: W0318 17:13:53.848075 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a5c5d50_0a4d_4b21_a02e_753b33c8a592.slice/crio-2140b85215d74bc94726ac6a50693bc975e66570f9ab47c001670f5eae33f9e5 WatchSource:0}: Error finding container 2140b85215d74bc94726ac6a50693bc975e66570f9ab47c001670f5eae33f9e5: Status 404 returned error can't find the container with id 2140b85215d74bc94726ac6a50693bc975e66570f9ab47c001670f5eae33f9e5 Mar 18 17:13:53 crc kubenswrapper[4939]: I0318 17:13:53.913803 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a5c5d50-0a4d-4b21-a02e-753b33c8a592","Type":"ContainerStarted","Data":"2140b85215d74bc94726ac6a50693bc975e66570f9ab47c001670f5eae33f9e5"} Mar 18 17:13:54 crc kubenswrapper[4939]: I0318 17:13:54.143578 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750d2365-1615-47b7-a95d-178356744f89" path="/var/lib/kubelet/pods/750d2365-1615-47b7-a95d-178356744f89/volumes" Mar 18 17:13:54 crc kubenswrapper[4939]: I0318 17:13:54.922659 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a5c5d50-0a4d-4b21-a02e-753b33c8a592","Type":"ContainerStarted","Data":"60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf"} Mar 18 17:13:54 crc kubenswrapper[4939]: I0318 17:13:54.923014 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 17:13:54 crc kubenswrapper[4939]: I0318 17:13:54.944927 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.94490953 podStartE2EDuration="2.94490953s" podCreationTimestamp="2026-03-18 17:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:54.936333147 +0000 UTC m=+5799.535520768" watchObservedRunningTime="2026-03-18 17:13:54.94490953 +0000 UTC m=+5799.544097151" Mar 18 17:13:56 crc kubenswrapper[4939]: I0318 17:13:56.804554 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:13:56 crc kubenswrapper[4939]: I0318 17:13:56.817316 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:13:56 crc kubenswrapper[4939]: I0318 17:13:56.951446 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 17:13:57 crc kubenswrapper[4939]: I0318 17:13:57.198744 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 17:13:57 crc kubenswrapper[4939]: I0318 17:13:57.223906 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 17:13:58 crc kubenswrapper[4939]: I0318 17:13:58.012852 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 17:13:59 crc kubenswrapper[4939]: I0318 17:13:59.298595 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Mar 18 17:13:59 crc kubenswrapper[4939]: I0318 17:13:59.298935 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 17:13:59 crc kubenswrapper[4939]: I0318 17:13:59.298950 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 17:13:59 crc kubenswrapper[4939]: I0318 17:13:59.298962 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.143789 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564234-zs8gk"] Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.144813 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-zs8gk"] Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.144913 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-zs8gk" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.148054 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.149884 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.150123 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.207761 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw8qf\" (UniqueName: \"kubernetes.io/projected/80da288e-4fb5-4729-a51d-4f15c5a3b25e-kube-api-access-nw8qf\") pod \"auto-csr-approver-29564234-zs8gk\" (UID: \"80da288e-4fb5-4729-a51d-4f15c5a3b25e\") " pod="openshift-infra/auto-csr-approver-29564234-zs8gk" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.309476 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw8qf\" (UniqueName: \"kubernetes.io/projected/80da288e-4fb5-4729-a51d-4f15c5a3b25e-kube-api-access-nw8qf\") pod \"auto-csr-approver-29564234-zs8gk\" (UID: \"80da288e-4fb5-4729-a51d-4f15c5a3b25e\") " pod="openshift-infra/auto-csr-approver-29564234-zs8gk" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.386430 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw8qf\" (UniqueName: \"kubernetes.io/projected/80da288e-4fb5-4729-a51d-4f15c5a3b25e-kube-api-access-nw8qf\") pod \"auto-csr-approver-29564234-zs8gk\" (UID: \"80da288e-4fb5-4729-a51d-4f15c5a3b25e\") " pod="openshift-infra/auto-csr-approver-29564234-zs8gk" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.466738 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.123:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.466867 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.122:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.466949 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.123:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.466981 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.122:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 17:14:00 crc kubenswrapper[4939]: I0318 17:14:00.482012 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-zs8gk" Mar 18 17:14:01 crc kubenswrapper[4939]: W0318 17:14:01.046207 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80da288e_4fb5_4729_a51d_4f15c5a3b25e.slice/crio-15fc9cc2e1b1d8822b4e46a2c78fe5d7f2688e3e6f72d9b956e06a0f253d2d42 WatchSource:0}: Error finding container 15fc9cc2e1b1d8822b4e46a2c78fe5d7f2688e3e6f72d9b956e06a0f253d2d42: Status 404 returned error can't find the container with id 15fc9cc2e1b1d8822b4e46a2c78fe5d7f2688e3e6f72d9b956e06a0f253d2d42 Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.073306 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-zs8gk"] Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.924645 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.926405 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.928370 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.934471 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.934582 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1fc2044-1277-4825-bdfe-97b95f15dac5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.934607 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.934649 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vplk\" (UniqueName: \"kubernetes.io/projected/e1fc2044-1277-4825-bdfe-97b95f15dac5-kube-api-access-5vplk\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.934775 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.934844 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:01 crc kubenswrapper[4939]: I0318 17:14:01.944247 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.005381 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564234-zs8gk" event={"ID":"80da288e-4fb5-4729-a51d-4f15c5a3b25e","Type":"ContainerStarted","Data":"15fc9cc2e1b1d8822b4e46a2c78fe5d7f2688e3e6f72d9b956e06a0f253d2d42"} Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.036908 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.036996 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.037054 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1fc2044-1277-4825-bdfe-97b95f15dac5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.037075 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.037114 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vplk\" (UniqueName: \"kubernetes.io/projected/e1fc2044-1277-4825-bdfe-97b95f15dac5-kube-api-access-5vplk\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.037172 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.037183 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1fc2044-1277-4825-bdfe-97b95f15dac5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.042919 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.044191 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.053122 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.053766 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vplk\" (UniqueName: \"kubernetes.io/projected/e1fc2044-1277-4825-bdfe-97b95f15dac5-kube-api-access-5vplk\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc 
kubenswrapper[4939]: I0318 17:14:02.055078 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.259466 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 17:14:02 crc kubenswrapper[4939]: I0318 17:14:02.882867 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:02 crc kubenswrapper[4939]: W0318 17:14:02.886566 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1fc2044_1277_4825_bdfe_97b95f15dac5.slice/crio-c07625ef4b78a048b405a3bceccb2880d3e3ab23168a70cf7664098bb8a4eeda WatchSource:0}: Error finding container c07625ef4b78a048b405a3bceccb2880d3e3ab23168a70cf7664098bb8a4eeda: Status 404 returned error can't find the container with id c07625ef4b78a048b405a3bceccb2880d3e3ab23168a70cf7664098bb8a4eeda Mar 18 17:14:03 crc kubenswrapper[4939]: I0318 17:14:03.018901 4939 generic.go:334] "Generic (PLEG): container finished" podID="80da288e-4fb5-4729-a51d-4f15c5a3b25e" containerID="b10d742b00b991fc7c5fda16c8488cd4feeaf285bd5bf5551895256cc4ce7394" exitCode=0 Mar 18 17:14:03 crc kubenswrapper[4939]: I0318 17:14:03.018956 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564234-zs8gk" event={"ID":"80da288e-4fb5-4729-a51d-4f15c5a3b25e","Type":"ContainerDied","Data":"b10d742b00b991fc7c5fda16c8488cd4feeaf285bd5bf5551895256cc4ce7394"} Mar 18 17:14:03 crc kubenswrapper[4939]: I0318 17:14:03.021618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1fc2044-1277-4825-bdfe-97b95f15dac5","Type":"ContainerStarted","Data":"c07625ef4b78a048b405a3bceccb2880d3e3ab23168a70cf7664098bb8a4eeda"} Mar 18 17:14:03 crc kubenswrapper[4939]: I0318 17:14:03.323016 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 17:14:03 crc kubenswrapper[4939]: I0318 17:14:03.323454 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api-log" containerID="cri-o://3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9" gracePeriod=30 Mar 18 17:14:03 crc kubenswrapper[4939]: I0318 17:14:03.323974 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api" containerID="cri-o://15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79" gracePeriod=30 Mar 18 17:14:03 crc kubenswrapper[4939]: I0318 17:14:03.379153 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.032608 4939 generic.go:334] "Generic (PLEG): container finished" podID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerID="3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9" exitCode=143 Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.032906 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"7c90e5c3-261f-4fe8-906a-4389d754d057","Type":"ContainerDied","Data":"3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9"} Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.037177 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1fc2044-1277-4825-bdfe-97b95f15dac5","Type":"ContainerStarted","Data":"c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23"} Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.129826 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.132199 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.136848 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.159902 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.234820 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235179 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235245 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cdb31af-1ef0-499e-9400-ce05d06c693f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235296 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235329 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235357 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235385 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235411 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235436 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235461 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.235497 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.236317 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.236489 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.236649 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.236848 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.236940 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvldc\" (UniqueName: \"kubernetes.io/projected/9cdb31af-1ef0-499e-9400-ce05d06c693f-kube-api-access-pvldc\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.338321 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.338411 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.338442 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvldc\" (UniqueName: \"kubernetes.io/projected/9cdb31af-1ef0-499e-9400-ce05d06c693f-kube-api-access-pvldc\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.338563 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.338596 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.338628 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.338728 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cdb31af-1ef0-499e-9400-ce05d06c693f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339006 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339042 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339116 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339167 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339242 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339263 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339311 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339336 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339433 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339554 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.339669 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.340232 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.340266 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.340394 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.340705 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.341121 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.341144 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.341211 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.341455 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9cdb31af-1ef0-499e-9400-ce05d06c693f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.344120 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.347632 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") 
" pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.347757 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9cdb31af-1ef0-499e-9400-ce05d06c693f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.348203 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.349558 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cdb31af-1ef0-499e-9400-ce05d06c693f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.355142 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvldc\" (UniqueName: \"kubernetes.io/projected/9cdb31af-1ef0-499e-9400-ce05d06c693f-kube-api-access-pvldc\") pod \"cinder-volume-volume1-0\" (UID: \"9cdb31af-1ef0-499e-9400-ce05d06c693f\") " pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.445117 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-zs8gk" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.457948 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.544192 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw8qf\" (UniqueName: \"kubernetes.io/projected/80da288e-4fb5-4729-a51d-4f15c5a3b25e-kube-api-access-nw8qf\") pod \"80da288e-4fb5-4729-a51d-4f15c5a3b25e\" (UID: \"80da288e-4fb5-4729-a51d-4f15c5a3b25e\") " Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.549095 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80da288e-4fb5-4729-a51d-4f15c5a3b25e-kube-api-access-nw8qf" (OuterVolumeSpecName: "kube-api-access-nw8qf") pod "80da288e-4fb5-4729-a51d-4f15c5a3b25e" (UID: "80da288e-4fb5-4729-a51d-4f15c5a3b25e"). InnerVolumeSpecName "kube-api-access-nw8qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.647272 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw8qf\" (UniqueName: \"kubernetes.io/projected/80da288e-4fb5-4729-a51d-4f15c5a3b25e-kube-api-access-nw8qf\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.755358 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 18 17:14:04 crc kubenswrapper[4939]: E0318 17:14:04.756211 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80da288e-4fb5-4729-a51d-4f15c5a3b25e" containerName="oc" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.756229 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="80da288e-4fb5-4729-a51d-4f15c5a3b25e" containerName="oc" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.759913 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="80da288e-4fb5-4729-a51d-4f15c5a3b25e" containerName="oc" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.762551 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.766670 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.787021 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.955932 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-config-data\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.955989 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-dev\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956009 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-nvme\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956034 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956058 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956077 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956134 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956159 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-sys\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956174 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956210 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/430abd0b-b430-4929-87c2-76529cddc1be-ceph\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956227 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfx2\" (UniqueName: \"kubernetes.io/projected/430abd0b-b430-4929-87c2-76529cddc1be-kube-api-access-9hfx2\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956264 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-scripts\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956283 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-config-data-custom\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956307 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956332 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-lib-modules\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:04 crc kubenswrapper[4939]: I0318 17:14:04.956351 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-run\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.047172 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1fc2044-1277-4825-bdfe-97b95f15dac5","Type":"ContainerStarted","Data":"8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef"} Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.048916 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564234-zs8gk" event={"ID":"80da288e-4fb5-4729-a51d-4f15c5a3b25e","Type":"ContainerDied","Data":"15fc9cc2e1b1d8822b4e46a2c78fe5d7f2688e3e6f72d9b956e06a0f253d2d42"} Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.048951 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15fc9cc2e1b1d8822b4e46a2c78fe5d7f2688e3e6f72d9b956e06a0f253d2d42" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.049056 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-zs8gk" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058055 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-dev\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058098 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-nvme\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058138 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058162 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-dev\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058179 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058246 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058268 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058309 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058382 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058425 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058458 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-sys\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058489 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058478 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-nvme\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058589 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/430abd0b-b430-4929-87c2-76529cddc1be-ceph\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058640 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058535 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-sys\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058678 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfx2\" (UniqueName: \"kubernetes.io/projected/430abd0b-b430-4929-87c2-76529cddc1be-kube-api-access-9hfx2\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058924 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-scripts\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.058978 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-config-data-custom\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.059039 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.059107 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-lib-modules\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: W0318 17:14:05.059129 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdb31af_1ef0_499e_9400_ce05d06c693f.slice/crio-7bbc763d4c2266cd8ad6a5d77ddae31fba77f00fa4748e8273112fbd811eecc9 WatchSource:0}: Error finding container 7bbc763d4c2266cd8ad6a5d77ddae31fba77f00fa4748e8273112fbd811eecc9: Status 404 returned error can't find the container with id 7bbc763d4c2266cd8ad6a5d77ddae31fba77f00fa4748e8273112fbd811eecc9 Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.059146 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-run\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.059194 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-run\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.059212 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-config-data\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " 
pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.059625 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-lib-modules\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.059719 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/430abd0b-b430-4929-87c2-76529cddc1be-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.065230 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-config-data-custom\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.065924 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-config-data\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.066270 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/430abd0b-b430-4929-87c2-76529cddc1be-ceph\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.066908 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-scripts\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.068161 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430abd0b-b430-4929-87c2-76529cddc1be-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.078326 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.080180 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.080145386 podStartE2EDuration="4.080145386s" podCreationTimestamp="2026-03-18 17:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:14:05.072098098 +0000 UTC m=+5809.671285739" watchObservedRunningTime="2026-03-18 17:14:05.080145386 +0000 UTC m=+5809.679333007" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.081432 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfx2\" (UniqueName: \"kubernetes.io/projected/430abd0b-b430-4929-87c2-76529cddc1be-kube-api-access-9hfx2\") pod \"cinder-backup-0\" (UID: \"430abd0b-b430-4929-87c2-76529cddc1be\") " 
pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.097082 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.510808 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-mf8v6"] Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.520288 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-mf8v6"] Mar 18 17:14:05 crc kubenswrapper[4939]: I0318 17:14:05.638960 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 18 17:14:05 crc kubenswrapper[4939]: W0318 17:14:05.647576 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod430abd0b_b430_4929_87c2_76529cddc1be.slice/crio-f1d3cae8876514919305526a3621a9411b4e96d37a68560def2fdb661a076231 WatchSource:0}: Error finding container f1d3cae8876514919305526a3621a9411b4e96d37a68560def2fdb661a076231: Status 404 returned error can't find the container with id f1d3cae8876514919305526a3621a9411b4e96d37a68560def2fdb661a076231 Mar 18 17:14:06 crc kubenswrapper[4939]: I0318 17:14:06.061862 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"430abd0b-b430-4929-87c2-76529cddc1be","Type":"ContainerStarted","Data":"f1d3cae8876514919305526a3621a9411b4e96d37a68560def2fdb661a076231"} Mar 18 17:14:06 crc kubenswrapper[4939]: I0318 17:14:06.063064 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9cdb31af-1ef0-499e-9400-ce05d06c693f","Type":"ContainerStarted","Data":"7bbc763d4c2266cd8ad6a5d77ddae31fba77f00fa4748e8273112fbd811eecc9"} Mar 18 17:14:06 crc kubenswrapper[4939]: I0318 17:14:06.146386 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f275e3e-f0bb-429f-beb3-193350cd1916" path="/var/lib/kubelet/pods/7f275e3e-f0bb-429f-beb3-193350cd1916/volumes" Mar 18 17:14:06 crc kubenswrapper[4939]: I0318 17:14:06.478788 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.118:8776/healthcheck\": read tcp 10.217.0.2:52404->10.217.1.118:8776: read: connection reset by peer" Mar 18 17:14:06 crc kubenswrapper[4939]: I0318 17:14:06.832045 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.011404 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxmhv\" (UniqueName: \"kubernetes.io/projected/7c90e5c3-261f-4fe8-906a-4389d754d057-kube-api-access-sxmhv\") pod \"7c90e5c3-261f-4fe8-906a-4389d754d057\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.011691 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c90e5c3-261f-4fe8-906a-4389d754d057-logs\") pod \"7c90e5c3-261f-4fe8-906a-4389d754d057\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.011729 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-combined-ca-bundle\") pod \"7c90e5c3-261f-4fe8-906a-4389d754d057\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.011810 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data\") pod \"7c90e5c3-261f-4fe8-906a-4389d754d057\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.012369 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data-custom\") pod \"7c90e5c3-261f-4fe8-906a-4389d754d057\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.012429 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-scripts\") pod \"7c90e5c3-261f-4fe8-906a-4389d754d057\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.012509 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c90e5c3-261f-4fe8-906a-4389d754d057-etc-machine-id\") pod \"7c90e5c3-261f-4fe8-906a-4389d754d057\" (UID: \"7c90e5c3-261f-4fe8-906a-4389d754d057\") " Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.012493 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c90e5c3-261f-4fe8-906a-4389d754d057-logs" (OuterVolumeSpecName: "logs") pod "7c90e5c3-261f-4fe8-906a-4389d754d057" (UID: "7c90e5c3-261f-4fe8-906a-4389d754d057"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.012672 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c90e5c3-261f-4fe8-906a-4389d754d057-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7c90e5c3-261f-4fe8-906a-4389d754d057" (UID: "7c90e5c3-261f-4fe8-906a-4389d754d057"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.013655 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c90e5c3-261f-4fe8-906a-4389d754d057-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.013695 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c90e5c3-261f-4fe8-906a-4389d754d057-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.019914 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-scripts" (OuterVolumeSpecName: "scripts") pod "7c90e5c3-261f-4fe8-906a-4389d754d057" (UID: "7c90e5c3-261f-4fe8-906a-4389d754d057"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.020095 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c90e5c3-261f-4fe8-906a-4389d754d057-kube-api-access-sxmhv" (OuterVolumeSpecName: "kube-api-access-sxmhv") pod "7c90e5c3-261f-4fe8-906a-4389d754d057" (UID: "7c90e5c3-261f-4fe8-906a-4389d754d057"). InnerVolumeSpecName "kube-api-access-sxmhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.023161 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7c90e5c3-261f-4fe8-906a-4389d754d057" (UID: "7c90e5c3-261f-4fe8-906a-4389d754d057"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.038420 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c90e5c3-261f-4fe8-906a-4389d754d057" (UID: "7c90e5c3-261f-4fe8-906a-4389d754d057"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.073656 4939 generic.go:334] "Generic (PLEG): container finished" podID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerID="15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79" exitCode=0 Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.073703 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.074001 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7c90e5c3-261f-4fe8-906a-4389d754d057","Type":"ContainerDied","Data":"15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79"} Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.074052 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7c90e5c3-261f-4fe8-906a-4389d754d057","Type":"ContainerDied","Data":"225a75057764e312d19d7d4ae9b51c682dc20ddf7ebfc1533ba2b8c0036ec5cc"} Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.074070 4939 scope.go:117] "RemoveContainer" containerID="15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.075731 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"430abd0b-b430-4929-87c2-76529cddc1be","Type":"ContainerStarted","Data":"f39c4f154ea3d98f1b5b2d5cdda6416eea12eee4d360187c4b822d2dabcfa926"} Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.077857 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9cdb31af-1ef0-499e-9400-ce05d06c693f","Type":"ContainerStarted","Data":"9f05118a368bfe0198e57b5c92824e8554bd264aef1c1f3202df0ac2e3605294"} Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.077919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9cdb31af-1ef0-499e-9400-ce05d06c693f","Type":"ContainerStarted","Data":"12e206515b450bf2aa999417df4432adb32d26d469415329f817e3d3fc1f4916"} Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.086265 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data" (OuterVolumeSpecName: "config-data") pod "7c90e5c3-261f-4fe8-906a-4389d754d057" (UID: "7c90e5c3-261f-4fe8-906a-4389d754d057"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.106073 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=1.986968229 podStartE2EDuration="3.106054657s" podCreationTimestamp="2026-03-18 17:14:04 +0000 UTC" firstStartedPulling="2026-03-18 17:14:05.061997271 +0000 UTC m=+5809.661184902" lastFinishedPulling="2026-03-18 17:14:06.181083709 +0000 UTC m=+5810.780271330" observedRunningTime="2026-03-18 17:14:07.103586927 +0000 UTC m=+5811.702774548" watchObservedRunningTime="2026-03-18 17:14:07.106054657 +0000 UTC m=+5811.705242278" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.115113 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxmhv\" (UniqueName: \"kubernetes.io/projected/7c90e5c3-261f-4fe8-906a-4389d754d057-kube-api-access-sxmhv\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.115154 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.115166 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.115176 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.115187 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c90e5c3-261f-4fe8-906a-4389d754d057-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.121879 4939 scope.go:117] "RemoveContainer" containerID="3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.139014 4939 scope.go:117] "RemoveContainer" containerID="15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79" Mar 18 17:14:07 crc kubenswrapper[4939]: E0318 17:14:07.139385 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79\": container with ID starting with 15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79 not found: ID does not exist" containerID="15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.139452 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79"} err="failed to get container status \"15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79\": rpc error: code = NotFound desc = could not find container \"15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79\": container with ID starting with 15abf0b67e3bb87ab19b7debcd2d4374e684c4b4431fe49e5a0a203fd0204b79 not found: ID does not exist" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.139490 4939 scope.go:117] "RemoveContainer" 
containerID="3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9" Mar 18 17:14:07 crc kubenswrapper[4939]: E0318 17:14:07.139825 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9\": container with ID starting with 3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9 not found: ID does not exist" containerID="3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.139847 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9"} err="failed to get container status \"3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9\": rpc error: code = NotFound desc = could not find container \"3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9\": container with ID starting with 3303c1723c5de1a02d9af33d33c47e2c2918cd489aad8968bad69ae5b44110d9 not found: ID does not exist" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.260433 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.298839 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.298965 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.299186 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.299369 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.416083 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.455851 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.485935 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 17:14:07 crc kubenswrapper[4939]: E0318 17:14:07.489001 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.489035 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api" Mar 18 17:14:07 crc kubenswrapper[4939]: E0318 17:14:07.489059 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api-log" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.489066 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api-log" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.489494 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" containerName="cinder-api-log" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.494726 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" 
containerName="cinder-api" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.497220 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.499774 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.502368 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.625810 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.625855 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-config-data\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.625954 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-scripts\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.625970 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a687e985-0518-48b9-84d0-94522b686c53-logs\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.625989 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a687e985-0518-48b9-84d0-94522b686c53-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.626008 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-config-data-custom\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.626024 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjwnn\" (UniqueName: \"kubernetes.io/projected/a687e985-0518-48b9-84d0-94522b686c53-kube-api-access-mjwnn\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.727183 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.727238 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-config-data\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.727332 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-scripts\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.727348 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a687e985-0518-48b9-84d0-94522b686c53-logs\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.727369 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a687e985-0518-48b9-84d0-94522b686c53-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.727425 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-config-data-custom\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.727444 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjwnn\" (UniqueName: \"kubernetes.io/projected/a687e985-0518-48b9-84d0-94522b686c53-kube-api-access-mjwnn\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.728447 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a687e985-0518-48b9-84d0-94522b686c53-logs\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.728525 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a687e985-0518-48b9-84d0-94522b686c53-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.731948 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-scripts\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.732998 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-config-data\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.733115 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.734090 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a687e985-0518-48b9-84d0-94522b686c53-config-data-custom\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.746407 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjwnn\" (UniqueName: \"kubernetes.io/projected/a687e985-0518-48b9-84d0-94522b686c53-kube-api-access-mjwnn\") pod \"cinder-api-0\" (UID: \"a687e985-0518-48b9-84d0-94522b686c53\") " pod="openstack/cinder-api-0" Mar 18 17:14:07 crc kubenswrapper[4939]: I0318 17:14:07.878648 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 17:14:08 crc kubenswrapper[4939]: I0318 17:14:08.107793 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"430abd0b-b430-4929-87c2-76529cddc1be","Type":"ContainerStarted","Data":"f0e3baf676604ed182f215d1a918fe5b0a4150b9c29dce80c739c26c74fbadc9"} Mar 18 17:14:08 crc kubenswrapper[4939]: I0318 17:14:08.145818 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.958508441 podStartE2EDuration="4.145783433s" podCreationTimestamp="2026-03-18 17:14:04 +0000 UTC" firstStartedPulling="2026-03-18 17:14:05.649481673 +0000 UTC m=+5810.248669294" lastFinishedPulling="2026-03-18 17:14:06.836756665 +0000 UTC m=+5811.435944286" observedRunningTime="2026-03-18 17:14:08.139487354 +0000 UTC m=+5812.738674985" watchObservedRunningTime="2026-03-18 17:14:08.145783433 +0000 UTC m=+5812.744971064" Mar 18 17:14:08 crc kubenswrapper[4939]: I0318 17:14:08.150942 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c90e5c3-261f-4fe8-906a-4389d754d057" path="/var/lib/kubelet/pods/7c90e5c3-261f-4fe8-906a-4389d754d057/volumes" Mar 18 17:14:08 crc kubenswrapper[4939]: I0318 17:14:08.389207 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 17:14:08 crc kubenswrapper[4939]: W0318 17:14:08.395631 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda687e985_0518_48b9_84d0_94522b686c53.slice/crio-688ccdff38d16111123429c1c21a8e8aa678238bed440881d6d0514a19d3126a WatchSource:0}: Error finding container 688ccdff38d16111123429c1c21a8e8aa678238bed440881d6d0514a19d3126a: Status 404 returned error can't find the container with id 688ccdff38d16111123429c1c21a8e8aa678238bed440881d6d0514a19d3126a Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.124150 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a687e985-0518-48b9-84d0-94522b686c53","Type":"ContainerStarted","Data":"a9fe15749db17c0a01c89fb6ad412e6e19d7e33c07c9cb16f4954f28ef03ee53"} Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.124699 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a687e985-0518-48b9-84d0-94522b686c53","Type":"ContainerStarted","Data":"688ccdff38d16111123429c1c21a8e8aa678238bed440881d6d0514a19d3126a"} 
Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.300720 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.302026 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.303103 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.303557 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.306582 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.310627 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.311914 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 17:14:09 crc kubenswrapper[4939]: I0318 17:14:09.459118 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:10 crc kubenswrapper[4939]: I0318 17:14:10.097402 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 18 17:14:10 crc kubenswrapper[4939]: I0318 17:14:10.175586 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 17:14:10 crc kubenswrapper[4939]: I0318 17:14:10.175666 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a687e985-0518-48b9-84d0-94522b686c53","Type":"ContainerStarted","Data":"1d8e368701a0668ed31bb3ef1dd5524cdbc284851b0a565bb26938f6eb46811a"} Mar 18 17:14:10 crc kubenswrapper[4939]: I0318 17:14:10.180967 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 17:14:10 crc kubenswrapper[4939]: I0318 17:14:10.192554 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.192525945 podStartE2EDuration="3.192525945s" podCreationTimestamp="2026-03-18 17:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:14:10.177705005 +0000 UTC m=+5814.776892646" watchObservedRunningTime="2026-03-18 17:14:10.192525945 +0000 UTC m=+5814.791713576" Mar 18 17:14:12 crc kubenswrapper[4939]: I0318 17:14:12.456603 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 17:14:12 crc kubenswrapper[4939]: I0318 17:14:12.513327 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:13 crc kubenswrapper[4939]: I0318 17:14:13.171957 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="cinder-scheduler" containerID="cri-o://c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23" gracePeriod=30 Mar 18 17:14:13 crc kubenswrapper[4939]: I0318 17:14:13.172049 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="probe" containerID="cri-o://8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef" gracePeriod=30 Mar 18 17:14:14 crc kubenswrapper[4939]: I0318 17:14:14.182324 4939 generic.go:334] "Generic (PLEG): container finished" podID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerID="8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef" exitCode=0 Mar 18 17:14:14 crc kubenswrapper[4939]: I0318 17:14:14.182371 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1fc2044-1277-4825-bdfe-97b95f15dac5","Type":"ContainerDied","Data":"8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef"} Mar 18 17:14:14 crc kubenswrapper[4939]: I0318 17:14:14.713234 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 18 17:14:15 crc kubenswrapper[4939]: I0318 17:14:15.327308 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 18 17:14:15 crc kubenswrapper[4939]: I0318 17:14:15.843690 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018180 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-scripts\") pod \"e1fc2044-1277-4825-bdfe-97b95f15dac5\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018236 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-combined-ca-bundle\") pod \"e1fc2044-1277-4825-bdfe-97b95f15dac5\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018283 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data-custom\") pod \"e1fc2044-1277-4825-bdfe-97b95f15dac5\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018342 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1fc2044-1277-4825-bdfe-97b95f15dac5-etc-machine-id\") pod \"e1fc2044-1277-4825-bdfe-97b95f15dac5\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018391 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data\") pod \"e1fc2044-1277-4825-bdfe-97b95f15dac5\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018425 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vplk\" (UniqueName: \"kubernetes.io/projected/e1fc2044-1277-4825-bdfe-97b95f15dac5-kube-api-access-5vplk\") pod \"e1fc2044-1277-4825-bdfe-97b95f15dac5\" (UID: \"e1fc2044-1277-4825-bdfe-97b95f15dac5\") " Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018464 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/e1fc2044-1277-4825-bdfe-97b95f15dac5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e1fc2044-1277-4825-bdfe-97b95f15dac5" (UID: "e1fc2044-1277-4825-bdfe-97b95f15dac5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.018951 4939 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1fc2044-1277-4825-bdfe-97b95f15dac5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.024547 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-scripts" (OuterVolumeSpecName: "scripts") pod "e1fc2044-1277-4825-bdfe-97b95f15dac5" (UID: "e1fc2044-1277-4825-bdfe-97b95f15dac5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.025061 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1fc2044-1277-4825-bdfe-97b95f15dac5" (UID: "e1fc2044-1277-4825-bdfe-97b95f15dac5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.028710 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1fc2044-1277-4825-bdfe-97b95f15dac5-kube-api-access-5vplk" (OuterVolumeSpecName: "kube-api-access-5vplk") pod "e1fc2044-1277-4825-bdfe-97b95f15dac5" (UID: "e1fc2044-1277-4825-bdfe-97b95f15dac5"). InnerVolumeSpecName "kube-api-access-5vplk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.069457 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1fc2044-1277-4825-bdfe-97b95f15dac5" (UID: "e1fc2044-1277-4825-bdfe-97b95f15dac5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.107925 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data" (OuterVolumeSpecName: "config-data") pod "e1fc2044-1277-4825-bdfe-97b95f15dac5" (UID: "e1fc2044-1277-4825-bdfe-97b95f15dac5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.120644 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.120677 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.120690 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.120699 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fc2044-1277-4825-bdfe-97b95f15dac5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.120708 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vplk\" (UniqueName: \"kubernetes.io/projected/e1fc2044-1277-4825-bdfe-97b95f15dac5-kube-api-access-5vplk\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.202603 4939 generic.go:334] "Generic (PLEG): container finished" podID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerID="c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23" exitCode=0 Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.202650 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1fc2044-1277-4825-bdfe-97b95f15dac5","Type":"ContainerDied","Data":"c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23"} Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.202668 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.202698 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1fc2044-1277-4825-bdfe-97b95f15dac5","Type":"ContainerDied","Data":"c07625ef4b78a048b405a3bceccb2880d3e3ab23168a70cf7664098bb8a4eeda"} Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.202717 4939 scope.go:117] "RemoveContainer" containerID="8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.230105 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.243241 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.253555 4939 scope.go:117] "RemoveContainer" containerID="c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.255011 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:16 crc kubenswrapper[4939]: E0318 17:14:16.255478 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="cinder-scheduler" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.255532 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="cinder-scheduler" Mar 18 17:14:16 crc kubenswrapper[4939]: E0318 17:14:16.255550 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="probe" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.255558 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="probe" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.255743 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="cinder-scheduler" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.255758 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" containerName="probe" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.256706 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.259651 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.274728 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.285758 4939 scope.go:117] "RemoveContainer" containerID="8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef" Mar 18 17:14:16 crc kubenswrapper[4939]: E0318 17:14:16.286720 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef\": container with ID starting with 8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef not found: ID does not exist" containerID="8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.286776 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef"} err="failed to get container status \"8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef\": rpc error: code = NotFound desc = could not find container \"8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef\": container with ID starting with 8c6a9019e49fac841cf2bdcf8e13a8bcd6b23c5f865e94e0d704d0d06b5d8bef not found: ID does not exist" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.286808 4939 scope.go:117] "RemoveContainer" containerID="c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23" Mar 18 17:14:16 crc kubenswrapper[4939]: E0318 17:14:16.287171 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23\": container with ID starting with c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23 not found: ID does not exist" containerID="c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.287202 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23"} err="failed to get container status \"c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23\": rpc error: code = NotFound desc = could not find container \"c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23\": container with ID starting with c612a05d5912d149953739c62c38eb8fc3c92837eac992e4978cb6afcd67da23 not found: ID does not exist" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.425664 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.426011 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.426086 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbc8p\" (UniqueName: \"kubernetes.io/projected/954c59a4-5386-4b38-b56f-835fbc868b42-kube-api-access-vbc8p\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.426130 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.426593 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/954c59a4-5386-4b38-b56f-835fbc868b42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.426692 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-scripts\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.528632 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.528835 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/954c59a4-5386-4b38-b56f-835fbc868b42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.528888 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-scripts\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.528935 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/954c59a4-5386-4b38-b56f-835fbc868b42-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.529087 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.529181 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-config-data\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.529555 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbc8p\" (UniqueName: \"kubernetes.io/projected/954c59a4-5386-4b38-b56f-835fbc868b42-kube-api-access-vbc8p\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.532362 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.532385 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-scripts\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.534392 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.536051 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954c59a4-5386-4b38-b56f-835fbc868b42-config-data\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.545970 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbc8p\" (UniqueName: \"kubernetes.io/projected/954c59a4-5386-4b38-b56f-835fbc868b42-kube-api-access-vbc8p\") pod \"cinder-scheduler-0\" (UID: \"954c59a4-5386-4b38-b56f-835fbc868b42\") " pod="openstack/cinder-scheduler-0" Mar 18 17:14:16 crc kubenswrapper[4939]: I0318 17:14:16.581135 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 17:14:17 crc kubenswrapper[4939]: I0318 17:14:17.043687 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 17:14:17 crc kubenswrapper[4939]: W0318 17:14:17.053603 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod954c59a4_5386_4b38_b56f_835fbc868b42.slice/crio-f3b897c16bfbd2cdf64e99b2a4be84d817c8f8d45d73770003a3834860a46ac7 WatchSource:0}: Error finding container f3b897c16bfbd2cdf64e99b2a4be84d817c8f8d45d73770003a3834860a46ac7: Status 404 returned error can't find the container with id f3b897c16bfbd2cdf64e99b2a4be84d817c8f8d45d73770003a3834860a46ac7 Mar 18 17:14:17 crc kubenswrapper[4939]: I0318 17:14:17.215707 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"954c59a4-5386-4b38-b56f-835fbc868b42","Type":"ContainerStarted","Data":"f3b897c16bfbd2cdf64e99b2a4be84d817c8f8d45d73770003a3834860a46ac7"} Mar 18 17:14:18 crc kubenswrapper[4939]: I0318 17:14:18.144394 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1fc2044-1277-4825-bdfe-97b95f15dac5" path="/var/lib/kubelet/pods/e1fc2044-1277-4825-bdfe-97b95f15dac5/volumes" Mar 18 17:14:18 crc kubenswrapper[4939]: I0318 17:14:18.226218 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"954c59a4-5386-4b38-b56f-835fbc868b42","Type":"ContainerStarted","Data":"c78ee807111ab215bdf4d8eec63e13870605d530a25581d82bc5234fd612bf74"} Mar 18 17:14:18 crc kubenswrapper[4939]: I0318 17:14:18.226276 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"954c59a4-5386-4b38-b56f-835fbc868b42","Type":"ContainerStarted","Data":"009557f0d69bfc9955fb64ec0434d9f7951fd556a2dc8a7e26fc14d74b202f7f"} Mar 18 17:14:18 crc kubenswrapper[4939]: I0318 17:14:18.250046 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.250026569 podStartE2EDuration="2.250026569s" podCreationTimestamp="2026-03-18 17:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:14:18.248263789 +0000 UTC m=+5822.847451410" watchObservedRunningTime="2026-03-18 17:14:18.250026569 +0000 UTC m=+5822.849214200" Mar 18 17:14:19 crc kubenswrapper[4939]: I0318 17:14:19.954182 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 17:14:21 crc kubenswrapper[4939]: I0318 17:14:21.582373 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 17:14:26 crc kubenswrapper[4939]: I0318 17:14:26.779824 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 17:14:53 crc kubenswrapper[4939]: I0318 17:14:53.687201 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:14:53 crc kubenswrapper[4939]: I0318 17:14:53.688705 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" 
podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.178807 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2"] Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.182412 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.186735 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.187052 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.192697 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2"] Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.322395 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-secret-volume\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.322646 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wbgm\" (UniqueName: \"kubernetes.io/projected/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-kube-api-access-2wbgm\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.322783 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-config-volume\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.424432 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-config-volume\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.424599 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-secret-volume\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.424651 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wbgm\" 
(UniqueName: \"kubernetes.io/projected/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-kube-api-access-2wbgm\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.425945 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-config-volume\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.433755 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-secret-volume\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.458638 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wbgm\" (UniqueName: \"kubernetes.io/projected/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-kube-api-access-2wbgm\") pod \"collect-profiles-29564235-qckv2\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:00 crc kubenswrapper[4939]: I0318 17:15:00.516909 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:01 crc kubenswrapper[4939]: I0318 17:15:01.004475 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2"] Mar 18 17:15:01 crc kubenswrapper[4939]: I0318 17:15:01.695194 4939 generic.go:334] "Generic (PLEG): container finished" podID="42d087fd-bcf9-4bac-96c2-5f6d77200b3d" containerID="99c26977dde7e049562192a6d91a202ea8c0a9efa4cd6a9e30c7393fd6c8e997" exitCode=0 Mar 18 17:15:01 crc kubenswrapper[4939]: I0318 17:15:01.695278 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" event={"ID":"42d087fd-bcf9-4bac-96c2-5f6d77200b3d","Type":"ContainerDied","Data":"99c26977dde7e049562192a6d91a202ea8c0a9efa4cd6a9e30c7393fd6c8e997"} Mar 18 17:15:01 crc kubenswrapper[4939]: I0318 17:15:01.695472 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" event={"ID":"42d087fd-bcf9-4bac-96c2-5f6d77200b3d","Type":"ContainerStarted","Data":"a8a94403d3c48c193ee77005c2764e52ba0a211370b796d02e281cf305602a55"} Mar 18 17:15:01 crc kubenswrapper[4939]: I0318 17:15:01.796361 4939 scope.go:117] "RemoveContainer" containerID="261ccdb1cdd00a4044764132044a0d1f6ce6af5db010351c9ec5f8b14a5b527a" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.086369 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.177642 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wbgm\" (UniqueName: \"kubernetes.io/projected/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-kube-api-access-2wbgm\") pod \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.177904 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-secret-volume\") pod \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.177944 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-config-volume\") pod \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\" (UID: \"42d087fd-bcf9-4bac-96c2-5f6d77200b3d\") " Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.178866 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-config-volume" (OuterVolumeSpecName: "config-volume") pod "42d087fd-bcf9-4bac-96c2-5f6d77200b3d" (UID: "42d087fd-bcf9-4bac-96c2-5f6d77200b3d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.186804 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-kube-api-access-2wbgm" (OuterVolumeSpecName: "kube-api-access-2wbgm") pod "42d087fd-bcf9-4bac-96c2-5f6d77200b3d" (UID: "42d087fd-bcf9-4bac-96c2-5f6d77200b3d"). InnerVolumeSpecName "kube-api-access-2wbgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.187317 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42d087fd-bcf9-4bac-96c2-5f6d77200b3d" (UID: "42d087fd-bcf9-4bac-96c2-5f6d77200b3d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.280660 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.280705 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.280719 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wbgm\" (UniqueName: \"kubernetes.io/projected/42d087fd-bcf9-4bac-96c2-5f6d77200b3d-kube-api-access-2wbgm\") on node \"crc\" DevicePath \"\"" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.722590 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" event={"ID":"42d087fd-bcf9-4bac-96c2-5f6d77200b3d","Type":"ContainerDied","Data":"a8a94403d3c48c193ee77005c2764e52ba0a211370b796d02e281cf305602a55"} Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.722627 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a94403d3c48c193ee77005c2764e52ba0a211370b796d02e281cf305602a55" Mar 18 17:15:03 crc kubenswrapper[4939]: I0318 17:15:03.722684 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2" Mar 18 17:15:04 crc kubenswrapper[4939]: I0318 17:15:04.198495 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"] Mar 18 17:15:04 crc kubenswrapper[4939]: I0318 17:15:04.208695 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-whgxx"] Mar 18 17:15:06 crc kubenswrapper[4939]: I0318 17:15:06.155121 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f30d02-82a4-4eb5-8a25-ead90d3f76a8" path="/var/lib/kubelet/pods/88f30d02-82a4-4eb5-8a25-ead90d3f76a8/volumes" Mar 18 17:15:23 crc kubenswrapper[4939]: I0318 17:15:23.687769 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:15:23 crc kubenswrapper[4939]: I0318 17:15:23.688346 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.260108 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d8wq2"] Mar 18 17:15:50 crc kubenswrapper[4939]: E0318 17:15:50.260975 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d087fd-bcf9-4bac-96c2-5f6d77200b3d" containerName="collect-profiles" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.260989 4939 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="42d087fd-bcf9-4bac-96c2-5f6d77200b3d" containerName="collect-profiles" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.261207 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d087fd-bcf9-4bac-96c2-5f6d77200b3d" containerName="collect-profiles" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.262552 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.272540 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8wq2"] Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.418530 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-utilities\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.418581 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-catalog-content\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.419167 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9hr\" (UniqueName: \"kubernetes.io/projected/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-kube-api-access-pn9hr\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.520798 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9hr\" (UniqueName: \"kubernetes.io/projected/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-kube-api-access-pn9hr\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.520927 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-utilities\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.520959 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-catalog-content\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.521596 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-catalog-content\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.522174 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-utilities\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.584486 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9hr\" (UniqueName: \"kubernetes.io/projected/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-kube-api-access-pn9hr\") pod \"redhat-marketplace-d8wq2\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") " pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:50 crc kubenswrapper[4939]: I0318 17:15:50.602644 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:15:51 crc kubenswrapper[4939]: I0318 17:15:51.235154 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8wq2"] Mar 18 17:15:51 crc kubenswrapper[4939]: W0318 17:15:51.237489 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b325fa_c33d_4e7f_aafb_2f380eb10cfd.slice/crio-3e35a211e58a0c1d4f2dd2cb11704c47b60077379e55e5d015ded9b3c32a3fa7 WatchSource:0}: Error finding container 3e35a211e58a0c1d4f2dd2cb11704c47b60077379e55e5d015ded9b3c32a3fa7: Status 404 returned error can't find the container with id 3e35a211e58a0c1d4f2dd2cb11704c47b60077379e55e5d015ded9b3c32a3fa7 Mar 18 17:15:52 crc kubenswrapper[4939]: I0318 17:15:52.206927 4939 generic.go:334] "Generic (PLEG): container finished" podID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerID="7597ad4e16fd444472e621578e22c2b3227f7c07947f63fcdbe0dd3adc8fe5ff" exitCode=0 Mar 18 17:15:52 crc kubenswrapper[4939]: I0318 17:15:52.206974 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8wq2" event={"ID":"92b325fa-c33d-4e7f-aafb-2f380eb10cfd","Type":"ContainerDied","Data":"7597ad4e16fd444472e621578e22c2b3227f7c07947f63fcdbe0dd3adc8fe5ff"} Mar 18 17:15:52 crc kubenswrapper[4939]: I0318 17:15:52.207173 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8wq2" event={"ID":"92b325fa-c33d-4e7f-aafb-2f380eb10cfd","Type":"ContainerStarted","Data":"3e35a211e58a0c1d4f2dd2cb11704c47b60077379e55e5d015ded9b3c32a3fa7"} Mar 18 17:15:53 crc kubenswrapper[4939]: I0318 17:15:53.687048 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:15:53 crc kubenswrapper[4939]: I0318 17:15:53.687822 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:15:53 crc kubenswrapper[4939]: I0318 17:15:53.687888 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:15:53 crc kubenswrapper[4939]: I0318 17:15:53.688637 4939 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:15:53 crc kubenswrapper[4939]: I0318 17:15:53.688716 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" gracePeriod=600 Mar 18 17:15:53 crc kubenswrapper[4939]: E0318 17:15:53.835875 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:15:54 crc kubenswrapper[4939]: I0318 17:15:54.224395 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" exitCode=0 Mar 18 17:15:54 crc kubenswrapper[4939]: I0318 17:15:54.224524 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e"} Mar 18 17:15:54 crc kubenswrapper[4939]: I0318 17:15:54.224577 4939 scope.go:117] "RemoveContainer" containerID="75eabee95412ed2c13066257546f720d013a6e1cbe6d5b5074d1fd4e3adf5aa1" Mar 18 17:15:54 crc kubenswrapper[4939]: I0318 17:15:54.225470 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:15:54 crc kubenswrapper[4939]: E0318 17:15:54.226056 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:15:54 crc kubenswrapper[4939]: I0318 17:15:54.226817 4939 generic.go:334] "Generic (PLEG): container finished" podID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerID="4503c5a1471221a5e244091d593d64b8515a3152f5e9b91ac007378483019dbd" exitCode=0 Mar 18 17:15:54 crc kubenswrapper[4939]: I0318 17:15:54.226849 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8wq2" event={"ID":"92b325fa-c33d-4e7f-aafb-2f380eb10cfd","Type":"ContainerDied","Data":"4503c5a1471221a5e244091d593d64b8515a3152f5e9b91ac007378483019dbd"} Mar 18 17:15:55 crc kubenswrapper[4939]: I0318 17:15:55.247826 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8wq2" 
event={"ID":"92b325fa-c33d-4e7f-aafb-2f380eb10cfd","Type":"ContainerStarted","Data":"bbd6fd84796d06228ca44124b1a35d2ffb5ab8c625b2b818b38904939bef3a16"} Mar 18 17:15:55 crc kubenswrapper[4939]: I0318 17:15:55.276158 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d8wq2" podStartSLOduration=2.814728086 podStartE2EDuration="5.276137865s" podCreationTimestamp="2026-03-18 17:15:50 +0000 UTC" firstStartedPulling="2026-03-18 17:15:52.210750416 +0000 UTC m=+5916.809938037" lastFinishedPulling="2026-03-18 17:15:54.672160155 +0000 UTC m=+5919.271347816" observedRunningTime="2026-03-18 17:15:55.271639637 +0000 UTC m=+5919.870827258" watchObservedRunningTime="2026-03-18 17:15:55.276137865 +0000 UTC m=+5919.875325486" Mar 18 17:15:59 crc kubenswrapper[4939]: I0318 17:15:59.052132 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-d4wlf"] Mar 18 17:15:59 crc kubenswrapper[4939]: I0318 17:15:59.062071 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-d4wlf"] Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.028875 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0a8b-account-create-update-rqz88"] Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.036923 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0a8b-account-create-update-rqz88"] Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.146217 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b01e44d-3e47-4d5a-9372-cda997282916" path="/var/lib/kubelet/pods/4b01e44d-3e47-4d5a-9372-cda997282916/volumes" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.147039 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb72637-02c0-44b4-919a-8691529c611a" path="/var/lib/kubelet/pods/9bb72637-02c0-44b4-919a-8691529c611a/volumes" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.147637 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564236-6zhpw"] Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.149150 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-6zhpw" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.154161 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.154169 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-6zhpw"] Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.154388 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.154795 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.333897 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf5j5\" (UniqueName: \"kubernetes.io/projected/60071088-a0a4-4fda-8f28-7b0f894191fe-kube-api-access-gf5j5\") pod \"auto-csr-approver-29564236-6zhpw\" (UID: \"60071088-a0a4-4fda-8f28-7b0f894191fe\") " pod="openshift-infra/auto-csr-approver-29564236-6zhpw" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.436844 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf5j5\" (UniqueName: \"kubernetes.io/projected/60071088-a0a4-4fda-8f28-7b0f894191fe-kube-api-access-gf5j5\") pod \"auto-csr-approver-29564236-6zhpw\" (UID: \"60071088-a0a4-4fda-8f28-7b0f894191fe\") " pod="openshift-infra/auto-csr-approver-29564236-6zhpw" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.461377 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf5j5\" (UniqueName: \"kubernetes.io/projected/60071088-a0a4-4fda-8f28-7b0f894191fe-kube-api-access-gf5j5\") pod \"auto-csr-approver-29564236-6zhpw\" (UID: \"60071088-a0a4-4fda-8f28-7b0f894191fe\") " pod="openshift-infra/auto-csr-approver-29564236-6zhpw" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.516303 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-6zhpw" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.604191 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.604321 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:16:00 crc kubenswrapper[4939]: I0318 17:16:00.667854 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:16:01 crc kubenswrapper[4939]: I0318 17:16:01.014392 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-6zhpw"] Mar 18 17:16:01 crc kubenswrapper[4939]: I0318 17:16:01.304564 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564236-6zhpw" event={"ID":"60071088-a0a4-4fda-8f28-7b0f894191fe","Type":"ContainerStarted","Data":"d6072bc00a0d77ac7eb43c73b1cea367e482999ac74fa49991c2ab67032d9daa"} Mar 18 17:16:01 crc kubenswrapper[4939]: I0318 17:16:01.358885 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d8wq2" Mar 18 17:16:01 crc kubenswrapper[4939]: I0318 17:16:01.931469 4939 scope.go:117] "RemoveContainer" containerID="6a9189fc2d9c8821f21f0cabc434f14ea20488426c3e224067621372529dbabf" Mar 18 17:16:01 crc kubenswrapper[4939]: I0318 17:16:01.951363 4939 scope.go:117] "RemoveContainer" containerID="d4ebffa938d8b676cc2e352530bc3f2bd24bd79a295ac265382cc747f41d6033" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.008093 4939 scope.go:117] "RemoveContainer" containerID="5e57776cb9aec52c6dfd9f5bfb920ed1a3fc5796247636b09c2901087810fa59" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.847976 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-26h2t"] Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.850078 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26h2t" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.855322 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.855932 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-62s22" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.875960 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vnbc8"] Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.883035 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.914637 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26h2t"] Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.929388 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vnbc8"] Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997263 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd7jn\" (UniqueName: \"kubernetes.io/projected/d4639e38-2435-415e-9cd8-d252c596ad22-kube-api-access-zd7jn\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997358 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3acd6a2-80fc-4c56-b460-187b80f55cfb-scripts\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997409 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc5r4\" (UniqueName: \"kubernetes.io/projected/c3acd6a2-80fc-4c56-b460-187b80f55cfb-kube-api-access-pc5r4\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997462 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-run\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997598 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-log\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997714 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-etc-ovs\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997790 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-log-ovn\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997856 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-run-ovn\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 
17:16:02.997918 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-lib\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.997949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4639e38-2435-415e-9cd8-d252c596ad22-scripts\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:02 crc kubenswrapper[4939]: I0318 17:16:02.998179 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-run\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100329 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-run\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100418 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd7jn\" (UniqueName: \"kubernetes.io/projected/d4639e38-2435-415e-9cd8-d252c596ad22-kube-api-access-zd7jn\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100458 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3acd6a2-80fc-4c56-b460-187b80f55cfb-scripts\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100530 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc5r4\" (UniqueName: \"kubernetes.io/projected/c3acd6a2-80fc-4c56-b460-187b80f55cfb-kube-api-access-pc5r4\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100591 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-run\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100681 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-log\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100745 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-etc-ovs\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100771 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-run\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100800 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-log-ovn\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100824 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-run\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100843 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-run-ovn\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100908 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-lib\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100915 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-etc-ovs\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100982 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-lib\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.100990 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-run-ovn\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.101010 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d4639e38-2435-415e-9cd8-d252c596ad22-var-log\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.101016 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4639e38-2435-415e-9cd8-d252c596ad22-scripts\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.101045 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3acd6a2-80fc-4c56-b460-187b80f55cfb-var-log-ovn\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.102572 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3acd6a2-80fc-4c56-b460-187b80f55cfb-scripts\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.104046 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4639e38-2435-415e-9cd8-d252c596ad22-scripts\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.121283 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc5r4\" (UniqueName: \"kubernetes.io/projected/c3acd6a2-80fc-4c56-b460-187b80f55cfb-kube-api-access-pc5r4\") pod \"ovn-controller-26h2t\" (UID: \"c3acd6a2-80fc-4c56-b460-187b80f55cfb\") " pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.129215 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd7jn\" (UniqueName: \"kubernetes.io/projected/d4639e38-2435-415e-9cd8-d252c596ad22-kube-api-access-zd7jn\") pod \"ovn-controller-ovs-vnbc8\" (UID: \"d4639e38-2435-415e-9cd8-d252c596ad22\") " pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.181254 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26h2t" Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.234567 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-vnbc8"
Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.329824 4939 generic.go:334] "Generic (PLEG): container finished" podID="60071088-a0a4-4fda-8f28-7b0f894191fe" containerID="7e726726a3a493bf666f889f1c9adb50b600d62ed11907cef0fc2078254c172d" exitCode=0
Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.330021 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564236-6zhpw" event={"ID":"60071088-a0a4-4fda-8f28-7b0f894191fe","Type":"ContainerDied","Data":"7e726726a3a493bf666f889f1c9adb50b600d62ed11907cef0fc2078254c172d"}
Mar 18 17:16:03 crc kubenswrapper[4939]: I0318 17:16:03.554844 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26h2t"]
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.043266 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8wq2"]
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.043828 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d8wq2" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="registry-server" containerID="cri-o://bbd6fd84796d06228ca44124b1a35d2ffb5ab8c625b2b818b38904939bef3a16" gracePeriod=2
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.177559 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vnbc8"]
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.357250 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vnbc8" event={"ID":"d4639e38-2435-415e-9cd8-d252c596ad22","Type":"ContainerStarted","Data":"5da12e515b01d4e61ce0ba9335d8722d68d1231650b90fa30a14256416316892"}
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.360114 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t" event={"ID":"c3acd6a2-80fc-4c56-b460-187b80f55cfb","Type":"ContainerStarted","Data":"7348577f85abaae6f4878113022094d89f585014208220a0228e0f731e5b3166"}
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.360166 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t" event={"ID":"c3acd6a2-80fc-4c56-b460-187b80f55cfb","Type":"ContainerStarted","Data":"f41c887d4aec2c735141675fca6431187d5da3c4b2812cfa928339321c19a14c"}
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.360212 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-26h2t"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.371256 4939 generic.go:334] "Generic (PLEG): container finished" podID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerID="bbd6fd84796d06228ca44124b1a35d2ffb5ab8c625b2b818b38904939bef3a16" exitCode=0
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.371339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8wq2" event={"ID":"92b325fa-c33d-4e7f-aafb-2f380eb10cfd","Type":"ContainerDied","Data":"bbd6fd84796d06228ca44124b1a35d2ffb5ab8c625b2b818b38904939bef3a16"}
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.450787 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-26h2t" podStartSLOduration=2.4507650610000002 podStartE2EDuration="2.450765061s" podCreationTimestamp="2026-03-18 17:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:04.414538303 +0000 UTC m=+5929.013725934" watchObservedRunningTime="2026-03-18 17:16:04.450765061 +0000 UTC m=+5929.049952682"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.455572 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-l6vq2"]
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.457137 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.468686 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.485459 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l6vq2"]
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.565843 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23757438-8e93-4d99-8df0-ca7e9f63f115-config\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.565909 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/23757438-8e93-4d99-8df0-ca7e9f63f115-ovs-rundir\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.567438 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/23757438-8e93-4d99-8df0-ca7e9f63f115-ovn-rundir\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.567572 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb6s6\" (UniqueName: \"kubernetes.io/projected/23757438-8e93-4d99-8df0-ca7e9f63f115-kube-api-access-rb6s6\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.668858 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23757438-8e93-4d99-8df0-ca7e9f63f115-config\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.669176 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/23757438-8e93-4d99-8df0-ca7e9f63f115-ovs-rundir\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.669261 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/23757438-8e93-4d99-8df0-ca7e9f63f115-ovn-rundir\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.669345 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb6s6\" (UniqueName: \"kubernetes.io/projected/23757438-8e93-4d99-8df0-ca7e9f63f115-kube-api-access-rb6s6\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.669579 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/23757438-8e93-4d99-8df0-ca7e9f63f115-ovn-rundir\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.669585 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/23757438-8e93-4d99-8df0-ca7e9f63f115-ovs-rundir\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.669850 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23757438-8e93-4d99-8df0-ca7e9f63f115-config\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.698781 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb6s6\" (UniqueName: \"kubernetes.io/projected/23757438-8e93-4d99-8df0-ca7e9f63f115-kube-api-access-rb6s6\") pod \"ovn-controller-metrics-l6vq2\" (UID: \"23757438-8e93-4d99-8df0-ca7e9f63f115\") " pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.788328 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l6vq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.818255 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8wq2"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.880774 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-6zhpw"
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.977911 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf5j5\" (UniqueName: \"kubernetes.io/projected/60071088-a0a4-4fda-8f28-7b0f894191fe-kube-api-access-gf5j5\") pod \"60071088-a0a4-4fda-8f28-7b0f894191fe\" (UID: \"60071088-a0a4-4fda-8f28-7b0f894191fe\") "
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.978240 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-utilities\") pod \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") "
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.978282 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-catalog-content\") pod \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") "
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.978316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn9hr\" (UniqueName: \"kubernetes.io/projected/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-kube-api-access-pn9hr\") pod \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\" (UID: \"92b325fa-c33d-4e7f-aafb-2f380eb10cfd\") "
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.979341 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-utilities" (OuterVolumeSpecName: "utilities") pod "92b325fa-c33d-4e7f-aafb-2f380eb10cfd" (UID: "92b325fa-c33d-4e7f-aafb-2f380eb10cfd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.984938 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60071088-a0a4-4fda-8f28-7b0f894191fe-kube-api-access-gf5j5" (OuterVolumeSpecName: "kube-api-access-gf5j5") pod "60071088-a0a4-4fda-8f28-7b0f894191fe" (UID: "60071088-a0a4-4fda-8f28-7b0f894191fe"). InnerVolumeSpecName "kube-api-access-gf5j5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:16:04 crc kubenswrapper[4939]: I0318 17:16:04.996373 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-kube-api-access-pn9hr" (OuterVolumeSpecName: "kube-api-access-pn9hr") pod "92b325fa-c33d-4e7f-aafb-2f380eb10cfd" (UID: "92b325fa-c33d-4e7f-aafb-2f380eb10cfd"). InnerVolumeSpecName "kube-api-access-pn9hr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.013685 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92b325fa-c33d-4e7f-aafb-2f380eb10cfd" (UID: "92b325fa-c33d-4e7f-aafb-2f380eb10cfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.080469 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf5j5\" (UniqueName: \"kubernetes.io/projected/60071088-a0a4-4fda-8f28-7b0f894191fe-kube-api-access-gf5j5\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.080577 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.080588 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.080597 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn9hr\" (UniqueName: \"kubernetes.io/projected/92b325fa-c33d-4e7f-aafb-2f380eb10cfd-kube-api-access-pn9hr\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.307946 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l6vq2"]
Mar 18 17:16:05 crc kubenswrapper[4939]: W0318 17:16:05.332111 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23757438_8e93_4d99_8df0_ca7e9f63f115.slice/crio-1a3ee83812d235e53155a79d6eb0ab0ead40fb07d5b03cc31cba7fac8004eb07 WatchSource:0}: Error finding container 1a3ee83812d235e53155a79d6eb0ab0ead40fb07d5b03cc31cba7fac8004eb07: Status 404 returned error can't find the container with id 1a3ee83812d235e53155a79d6eb0ab0ead40fb07d5b03cc31cba7fac8004eb07
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.398369 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564236-6zhpw" event={"ID":"60071088-a0a4-4fda-8f28-7b0f894191fe","Type":"ContainerDied","Data":"d6072bc00a0d77ac7eb43c73b1cea367e482999ac74fa49991c2ab67032d9daa"}
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.398420 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6072bc00a0d77ac7eb43c73b1cea367e482999ac74fa49991c2ab67032d9daa"
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.398530 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-6zhpw"
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.402883 4939 generic.go:334] "Generic (PLEG): container finished" podID="d4639e38-2435-415e-9cd8-d252c596ad22" containerID="e20aa337d8626a3db15bcefa02b7db7aa6ebf368ddf1aac3e249a59f2a6121fb" exitCode=0
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.402938 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vnbc8" event={"ID":"d4639e38-2435-415e-9cd8-d252c596ad22","Type":"ContainerDied","Data":"e20aa337d8626a3db15bcefa02b7db7aa6ebf368ddf1aac3e249a59f2a6121fb"}
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.421019 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l6vq2" event={"ID":"23757438-8e93-4d99-8df0-ca7e9f63f115","Type":"ContainerStarted","Data":"1a3ee83812d235e53155a79d6eb0ab0ead40fb07d5b03cc31cba7fac8004eb07"}
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.431695 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8wq2" event={"ID":"92b325fa-c33d-4e7f-aafb-2f380eb10cfd","Type":"ContainerDied","Data":"3e35a211e58a0c1d4f2dd2cb11704c47b60077379e55e5d015ded9b3c32a3fa7"}
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.431778 4939 scope.go:117] "RemoveContainer" containerID="bbd6fd84796d06228ca44124b1a35d2ffb5ab8c625b2b818b38904939bef3a16"
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.431799 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8wq2"
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.477585 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8wq2"]
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.489006 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8wq2"]
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.567668 4939 scope.go:117] "RemoveContainer" containerID="4503c5a1471221a5e244091d593d64b8515a3152f5e9b91ac007378483019dbd"
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.608373 4939 scope.go:117] "RemoveContainer" containerID="7597ad4e16fd444472e621578e22c2b3227f7c07947f63fcdbe0dd3adc8fe5ff"
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.937757 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-d559w"]
Mar 18 17:16:05 crc kubenswrapper[4939]: I0318 17:16:05.946202 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-d559w"]
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.028244 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-g97xg"]
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.039516 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-g97xg"]
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.145655 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d05450-1c85-4e2a-980e-ae6abbc2fd42" path="/var/lib/kubelet/pods/29d05450-1c85-4e2a-980e-ae6abbc2fd42/volumes"
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.146531 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc59a41-1322-4779-b5db-07b3d58253ff" path="/var/lib/kubelet/pods/6dc59a41-1322-4779-b5db-07b3d58253ff/volumes"
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.147222 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" path="/var/lib/kubelet/pods/92b325fa-c33d-4e7f-aafb-2f380eb10cfd/volumes"
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.442929 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vnbc8" event={"ID":"d4639e38-2435-415e-9cd8-d252c596ad22","Type":"ContainerStarted","Data":"e9ba202d60186f44e8cb3b6e570dec58bcfc0c549193bc4e1d27f1c572a17904"}
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.442979 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vnbc8" event={"ID":"d4639e38-2435-415e-9cd8-d252c596ad22","Type":"ContainerStarted","Data":"4006813c133fba240aa4d355b423a7e5de08951d508022e251ca453e6d87616c"}
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.443160 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vnbc8"
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.444908 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l6vq2" event={"ID":"23757438-8e93-4d99-8df0-ca7e9f63f115","Type":"ContainerStarted","Data":"a960847fdaae835b1cbbd0e73f8d9534cf2ac41581854bf771b247c68a92d6d7"}
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.465811 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vnbc8" podStartSLOduration=4.465791573 podStartE2EDuration="4.465791573s" podCreationTimestamp="2026-03-18 17:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:06.458474226 +0000 UTC m=+5931.057661857" watchObservedRunningTime="2026-03-18 17:16:06.465791573 +0000 UTC m=+5931.064979194"
Mar 18 17:16:06 crc kubenswrapper[4939]: I0318 17:16:06.483964 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-l6vq2" podStartSLOduration=2.483941748 podStartE2EDuration="2.483941748s" podCreationTimestamp="2026-03-18 17:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:06.474437939 +0000 UTC m=+5931.073625560" watchObservedRunningTime="2026-03-18 17:16:06.483941748 +0000 UTC m=+5931.083129369"
Mar 18 17:16:07 crc kubenswrapper[4939]: I0318 17:16:07.455867 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vnbc8"
Mar 18 17:16:08 crc kubenswrapper[4939]: I0318 17:16:08.133970 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e"
Mar 18 17:16:08 crc kubenswrapper[4939]: E0318 17:16:08.134201 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.204592 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-5qrb9"]
Mar 18 17:16:13 crc kubenswrapper[4939]: E0318 17:16:13.205525 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60071088-a0a4-4fda-8f28-7b0f894191fe" containerName="oc"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.205540 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="60071088-a0a4-4fda-8f28-7b0f894191fe" containerName="oc"
Mar 18 17:16:13 crc kubenswrapper[4939]: E0318 17:16:13.205555 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="registry-server"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.205563 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="registry-server"
Mar 18 17:16:13 crc kubenswrapper[4939]: E0318 17:16:13.205579 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="extract-utilities"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.205586 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="extract-utilities"
Mar 18 17:16:13 crc kubenswrapper[4939]: E0318 17:16:13.205623 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="extract-content"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.205630 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="extract-content"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.205870 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="60071088-a0a4-4fda-8f28-7b0f894191fe" containerName="oc"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.205884 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b325fa-c33d-4e7f-aafb-2f380eb10cfd" containerName="registry-server"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.206677 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.213354 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-5qrb9"]
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.261142 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192b635-1876-423f-83a9-68bc9ab9ba98-operator-scripts\") pod \"octavia-db-create-5qrb9\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") " pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.261273 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbf2m\" (UniqueName: \"kubernetes.io/projected/d192b635-1876-423f-83a9-68bc9ab9ba98-kube-api-access-kbf2m\") pod \"octavia-db-create-5qrb9\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") " pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.362741 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbf2m\" (UniqueName: \"kubernetes.io/projected/d192b635-1876-423f-83a9-68bc9ab9ba98-kube-api-access-kbf2m\") pod \"octavia-db-create-5qrb9\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") " pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.362881 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192b635-1876-423f-83a9-68bc9ab9ba98-operator-scripts\") pod \"octavia-db-create-5qrb9\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") " pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.363734 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192b635-1876-423f-83a9-68bc9ab9ba98-operator-scripts\") pod \"octavia-db-create-5qrb9\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") " pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.389599 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbf2m\" (UniqueName: \"kubernetes.io/projected/d192b635-1876-423f-83a9-68bc9ab9ba98-kube-api-access-kbf2m\") pod \"octavia-db-create-5qrb9\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") " pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:13 crc kubenswrapper[4939]: I0318 17:16:13.529072 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.020975 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-5qrb9"]
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.527681 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-b5fd-account-create-update-wr4hv"]
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.529415 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.531681 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.539839 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-b5fd-account-create-update-wr4hv"]
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.540405 4939 generic.go:334] "Generic (PLEG): container finished" podID="d192b635-1876-423f-83a9-68bc9ab9ba98" containerID="ea4b045645681e62c293bfefbb576073e2fa07c52f079bca371880f1947f65d5" exitCode=0
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.540447 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5qrb9" event={"ID":"d192b635-1876-423f-83a9-68bc9ab9ba98","Type":"ContainerDied","Data":"ea4b045645681e62c293bfefbb576073e2fa07c52f079bca371880f1947f65d5"}
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.540473 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5qrb9" event={"ID":"d192b635-1876-423f-83a9-68bc9ab9ba98","Type":"ContainerStarted","Data":"fa25bc43486827f5e364893b23292a9fcad43d5e039f074de0ffce732e6e3430"}
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.597685 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-operator-scripts\") pod \"octavia-b5fd-account-create-update-wr4hv\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") " pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.597908 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzc9m\" (UniqueName: \"kubernetes.io/projected/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-kube-api-access-wzc9m\") pod \"octavia-b5fd-account-create-update-wr4hv\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") " pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.699496 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzc9m\" (UniqueName: \"kubernetes.io/projected/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-kube-api-access-wzc9m\") pod \"octavia-b5fd-account-create-update-wr4hv\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") " pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.699589 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-operator-scripts\") pod \"octavia-b5fd-account-create-update-wr4hv\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") " pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.700315 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-operator-scripts\") pod \"octavia-b5fd-account-create-update-wr4hv\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") " pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.719583 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzc9m\" (UniqueName: \"kubernetes.io/projected/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-kube-api-access-wzc9m\") pod \"octavia-b5fd-account-create-update-wr4hv\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") " pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:14 crc kubenswrapper[4939]: I0318 17:16:14.851487 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:15 crc kubenswrapper[4939]: W0318 17:16:15.307756 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd97a0adc_34fd_4fbc_aa41_56d51e7eb170.slice/crio-a0501d7d5a1af037e779e14227c1d6a0f01d749820acbaefb590d6ef0f0ba5c3 WatchSource:0}: Error finding container a0501d7d5a1af037e779e14227c1d6a0f01d749820acbaefb590d6ef0f0ba5c3: Status 404 returned error can't find the container with id a0501d7d5a1af037e779e14227c1d6a0f01d749820acbaefb590d6ef0f0ba5c3
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.309458 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-b5fd-account-create-update-wr4hv"]
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.552272 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b5fd-account-create-update-wr4hv" event={"ID":"d97a0adc-34fd-4fbc-aa41-56d51e7eb170","Type":"ContainerStarted","Data":"5aae94ef3b4d56d980be83958fc352314dcd46ca846bf408e174a5b834da9b25"}
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.552353 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b5fd-account-create-update-wr4hv" event={"ID":"d97a0adc-34fd-4fbc-aa41-56d51e7eb170","Type":"ContainerStarted","Data":"a0501d7d5a1af037e779e14227c1d6a0f01d749820acbaefb590d6ef0f0ba5c3"}
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.895286 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.913554 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-b5fd-account-create-update-wr4hv" podStartSLOduration=1.91353711 podStartE2EDuration="1.91353711s" podCreationTimestamp="2026-03-18 17:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:15.570492455 +0000 UTC m=+5940.169680096" watchObservedRunningTime="2026-03-18 17:16:15.91353711 +0000 UTC m=+5940.512724731"
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.919487 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192b635-1876-423f-83a9-68bc9ab9ba98-operator-scripts\") pod \"d192b635-1876-423f-83a9-68bc9ab9ba98\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") "
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.919848 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbf2m\" (UniqueName: \"kubernetes.io/projected/d192b635-1876-423f-83a9-68bc9ab9ba98-kube-api-access-kbf2m\") pod \"d192b635-1876-423f-83a9-68bc9ab9ba98\" (UID: \"d192b635-1876-423f-83a9-68bc9ab9ba98\") "
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.920887 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d192b635-1876-423f-83a9-68bc9ab9ba98-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d192b635-1876-423f-83a9-68bc9ab9ba98" (UID: "d192b635-1876-423f-83a9-68bc9ab9ba98"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:16:15 crc kubenswrapper[4939]: I0318 17:16:15.929958 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d192b635-1876-423f-83a9-68bc9ab9ba98-kube-api-access-kbf2m" (OuterVolumeSpecName: "kube-api-access-kbf2m") pod "d192b635-1876-423f-83a9-68bc9ab9ba98" (UID: "d192b635-1876-423f-83a9-68bc9ab9ba98"). InnerVolumeSpecName "kube-api-access-kbf2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:16:16 crc kubenswrapper[4939]: I0318 17:16:16.022289 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbf2m\" (UniqueName: \"kubernetes.io/projected/d192b635-1876-423f-83a9-68bc9ab9ba98-kube-api-access-kbf2m\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:16 crc kubenswrapper[4939]: I0318 17:16:16.022350 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d192b635-1876-423f-83a9-68bc9ab9ba98-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:16 crc kubenswrapper[4939]: I0318 17:16:16.564156 4939 generic.go:334] "Generic (PLEG): container finished" podID="d97a0adc-34fd-4fbc-aa41-56d51e7eb170" containerID="5aae94ef3b4d56d980be83958fc352314dcd46ca846bf408e174a5b834da9b25" exitCode=0
Mar 18 17:16:16 crc kubenswrapper[4939]: I0318 17:16:16.564244 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b5fd-account-create-update-wr4hv" event={"ID":"d97a0adc-34fd-4fbc-aa41-56d51e7eb170","Type":"ContainerDied","Data":"5aae94ef3b4d56d980be83958fc352314dcd46ca846bf408e174a5b834da9b25"}
Mar 18 17:16:16 crc kubenswrapper[4939]: I0318 17:16:16.566398 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5qrb9" event={"ID":"d192b635-1876-423f-83a9-68bc9ab9ba98","Type":"ContainerDied","Data":"fa25bc43486827f5e364893b23292a9fcad43d5e039f074de0ffce732e6e3430"}
Mar 18 17:16:16 crc kubenswrapper[4939]: I0318 17:16:16.566444 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa25bc43486827f5e364893b23292a9fcad43d5e039f074de0ffce732e6e3430"
Mar 18 17:16:16 crc kubenswrapper[4939]: I0318 17:16:16.566500 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5qrb9"
Mar 18 17:16:17 crc kubenswrapper[4939]: I0318 17:16:17.986345 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.074031 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzc9m\" (UniqueName: \"kubernetes.io/projected/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-kube-api-access-wzc9m\") pod \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") "
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.074132 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-operator-scripts\") pod \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\" (UID: \"d97a0adc-34fd-4fbc-aa41-56d51e7eb170\") "
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.074549 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d97a0adc-34fd-4fbc-aa41-56d51e7eb170" (UID: "d97a0adc-34fd-4fbc-aa41-56d51e7eb170"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.079767 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-kube-api-access-wzc9m" (OuterVolumeSpecName: "kube-api-access-wzc9m") pod "d97a0adc-34fd-4fbc-aa41-56d51e7eb170" (UID: "d97a0adc-34fd-4fbc-aa41-56d51e7eb170"). InnerVolumeSpecName "kube-api-access-wzc9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.177189 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzc9m\" (UniqueName: \"kubernetes.io/projected/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-kube-api-access-wzc9m\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.177230 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97a0adc-34fd-4fbc-aa41-56d51e7eb170-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.595054 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b5fd-account-create-update-wr4hv" event={"ID":"d97a0adc-34fd-4fbc-aa41-56d51e7eb170","Type":"ContainerDied","Data":"a0501d7d5a1af037e779e14227c1d6a0f01d749820acbaefb590d6ef0f0ba5c3"}
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.595080 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-b5fd-account-create-update-wr4hv"
Mar 18 17:16:18 crc kubenswrapper[4939]: I0318 17:16:18.595090 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0501d7d5a1af037e779e14227c1d6a0f01d749820acbaefb590d6ef0f0ba5c3"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.026062 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xwd4g"]
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.034406 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xwd4g"]
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.149726 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fec22beb-3747-46a2-8eec-a2cf8ac77a0f" path="/var/lib/kubelet/pods/fec22beb-3747-46a2-8eec-a2cf8ac77a0f/volumes"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.244135 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-dw9cc"]
Mar 18 17:16:20 crc kubenswrapper[4939]: E0318 17:16:20.244504 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d192b635-1876-423f-83a9-68bc9ab9ba98" containerName="mariadb-database-create"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.244584 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d192b635-1876-423f-83a9-68bc9ab9ba98" containerName="mariadb-database-create"
Mar 18 17:16:20 crc kubenswrapper[4939]: E0318 17:16:20.244606 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97a0adc-34fd-4fbc-aa41-56d51e7eb170" containerName="mariadb-account-create-update"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.244613 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97a0adc-34fd-4fbc-aa41-56d51e7eb170" containerName="mariadb-account-create-update"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.244784 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d192b635-1876-423f-83a9-68bc9ab9ba98" containerName="mariadb-database-create"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.244813 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97a0adc-34fd-4fbc-aa41-56d51e7eb170" containerName="mariadb-account-create-update"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.245407 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.259721 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-dw9cc"]
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.412246 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4gqt\" (UniqueName: \"kubernetes.io/projected/8c056622-aeea-4e82-87d3-e147e22f1fc5-kube-api-access-n4gqt\") pod \"octavia-persistence-db-create-dw9cc\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") " pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.412439 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c056622-aeea-4e82-87d3-e147e22f1fc5-operator-scripts\") pod \"octavia-persistence-db-create-dw9cc\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") " pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.514499 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4gqt\" (UniqueName: \"kubernetes.io/projected/8c056622-aeea-4e82-87d3-e147e22f1fc5-kube-api-access-n4gqt\") pod \"octavia-persistence-db-create-dw9cc\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") " pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.514616 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c056622-aeea-4e82-87d3-e147e22f1fc5-operator-scripts\") pod \"octavia-persistence-db-create-dw9cc\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") " pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.515372 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c056622-aeea-4e82-87d3-e147e22f1fc5-operator-scripts\") pod \"octavia-persistence-db-create-dw9cc\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") " pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.537911 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4gqt\" (UniqueName: \"kubernetes.io/projected/8c056622-aeea-4e82-87d3-e147e22f1fc5-kube-api-access-n4gqt\") pod \"octavia-persistence-db-create-dw9cc\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") " pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.566402 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.787606 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-41f8-account-create-update-7btrr"]
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.796739 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.829158 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-41f8-account-create-update-7btrr"]
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.835885 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.926115 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96s9f\" (UniqueName: \"kubernetes.io/projected/62b4007c-f7d3-47ad-9e5d-9b12886416f2-kube-api-access-96s9f\") pod \"octavia-41f8-account-create-update-7btrr\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") " pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:20 crc kubenswrapper[4939]: I0318 17:16:20.926480 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b4007c-f7d3-47ad-9e5d-9b12886416f2-operator-scripts\") pod \"octavia-41f8-account-create-update-7btrr\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") " pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.028321 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b4007c-f7d3-47ad-9e5d-9b12886416f2-operator-scripts\") pod \"octavia-41f8-account-create-update-7btrr\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") " pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.028537 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96s9f\" (UniqueName: \"kubernetes.io/projected/62b4007c-f7d3-47ad-9e5d-9b12886416f2-kube-api-access-96s9f\") pod \"octavia-41f8-account-create-update-7btrr\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") " pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.029381 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b4007c-f7d3-47ad-9e5d-9b12886416f2-operator-scripts\") pod \"octavia-41f8-account-create-update-7btrr\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") " pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.047221 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96s9f\" (UniqueName: \"kubernetes.io/projected/62b4007c-f7d3-47ad-9e5d-9b12886416f2-kube-api-access-96s9f\") pod \"octavia-41f8-account-create-update-7btrr\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") " pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.087483 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-dw9cc"]
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.125924 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.559472 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-41f8-account-create-update-7btrr"]
Mar 18 17:16:21 crc kubenswrapper[4939]: W0318 17:16:21.560705 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b4007c_f7d3_47ad_9e5d_9b12886416f2.slice/crio-de4f50055d9da6fd2bb43e37d3f7164bbe9db8a7a9f2a2dab3631d50abd1897c WatchSource:0}: Error finding container de4f50055d9da6fd2bb43e37d3f7164bbe9db8a7a9f2a2dab3631d50abd1897c: Status 404 returned error can't find the container with id de4f50055d9da6fd2bb43e37d3f7164bbe9db8a7a9f2a2dab3631d50abd1897c
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.630528 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-41f8-account-create-update-7btrr" event={"ID":"62b4007c-f7d3-47ad-9e5d-9b12886416f2","Type":"ContainerStarted","Data":"de4f50055d9da6fd2bb43e37d3f7164bbe9db8a7a9f2a2dab3631d50abd1897c"}
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.632567 4939 generic.go:334] "Generic (PLEG): container finished" podID="8c056622-aeea-4e82-87d3-e147e22f1fc5" containerID="8aa4b82fdca6e2f6d0abddeb69aff932ca7c56fdac289db4ed399ace38cfd256" exitCode=0
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.632626 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-dw9cc" event={"ID":"8c056622-aeea-4e82-87d3-e147e22f1fc5","Type":"ContainerDied","Data":"8aa4b82fdca6e2f6d0abddeb69aff932ca7c56fdac289db4ed399ace38cfd256"}
Mar 18 17:16:21 crc kubenswrapper[4939]: I0318 17:16:21.632660 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-dw9cc" event={"ID":"8c056622-aeea-4e82-87d3-e147e22f1fc5","Type":"ContainerStarted","Data":"02df53d377ba9f399539ca446c0fb72aab6e07d15ca66707b62859c83ed279ff"}
Mar 18 17:16:22 crc kubenswrapper[4939]: E0318 17:16:22.041628 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b4007c_f7d3_47ad_9e5d_9b12886416f2.slice/crio-dcb7372718b4fe6090fcec1bb36cdf12aacbbf559cd22f635c2c37194aec8189.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b4007c_f7d3_47ad_9e5d_9b12886416f2.slice/crio-conmon-dcb7372718b4fe6090fcec1bb36cdf12aacbbf559cd22f635c2c37194aec8189.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 17:16:22 crc kubenswrapper[4939]: I0318 17:16:22.647681 4939 generic.go:334] "Generic (PLEG): container finished" podID="62b4007c-f7d3-47ad-9e5d-9b12886416f2" containerID="dcb7372718b4fe6090fcec1bb36cdf12aacbbf559cd22f635c2c37194aec8189" exitCode=0
Mar 18 17:16:22 crc kubenswrapper[4939]: I0318 17:16:22.647825 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-41f8-account-create-update-7btrr" event={"ID":"62b4007c-f7d3-47ad-9e5d-9b12886416f2","Type":"ContainerDied","Data":"dcb7372718b4fe6090fcec1bb36cdf12aacbbf559cd22f635c2c37194aec8189"}
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.017731 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.133426 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e"
Mar 18 17:16:23 crc kubenswrapper[4939]: E0318 17:16:23.133834 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.170009 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4gqt\" (UniqueName: \"kubernetes.io/projected/8c056622-aeea-4e82-87d3-e147e22f1fc5-kube-api-access-n4gqt\") pod \"8c056622-aeea-4e82-87d3-e147e22f1fc5\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") "
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.170303 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c056622-aeea-4e82-87d3-e147e22f1fc5-operator-scripts\") pod \"8c056622-aeea-4e82-87d3-e147e22f1fc5\" (UID: \"8c056622-aeea-4e82-87d3-e147e22f1fc5\") "
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.171697 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c056622-aeea-4e82-87d3-e147e22f1fc5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c056622-aeea-4e82-87d3-e147e22f1fc5" (UID: "8c056622-aeea-4e82-87d3-e147e22f1fc5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.180925 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c056622-aeea-4e82-87d3-e147e22f1fc5-kube-api-access-n4gqt" (OuterVolumeSpecName: "kube-api-access-n4gqt") pod "8c056622-aeea-4e82-87d3-e147e22f1fc5" (UID: "8c056622-aeea-4e82-87d3-e147e22f1fc5"). InnerVolumeSpecName "kube-api-access-n4gqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.273787 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4gqt\" (UniqueName: \"kubernetes.io/projected/8c056622-aeea-4e82-87d3-e147e22f1fc5-kube-api-access-n4gqt\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.273822 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c056622-aeea-4e82-87d3-e147e22f1fc5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.660975 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-dw9cc" event={"ID":"8c056622-aeea-4e82-87d3-e147e22f1fc5","Type":"ContainerDied","Data":"02df53d377ba9f399539ca446c0fb72aab6e07d15ca66707b62859c83ed279ff"}
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.661039 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02df53d377ba9f399539ca446c0fb72aab6e07d15ca66707b62859c83ed279ff"
Mar 18 17:16:23 crc kubenswrapper[4939]: I0318 17:16:23.661000 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dw9cc"
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.051076 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.191874 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96s9f\" (UniqueName: \"kubernetes.io/projected/62b4007c-f7d3-47ad-9e5d-9b12886416f2-kube-api-access-96s9f\") pod \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") "
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.192054 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b4007c-f7d3-47ad-9e5d-9b12886416f2-operator-scripts\") pod \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\" (UID: \"62b4007c-f7d3-47ad-9e5d-9b12886416f2\") "
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.193056 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b4007c-f7d3-47ad-9e5d-9b12886416f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62b4007c-f7d3-47ad-9e5d-9b12886416f2" (UID: "62b4007c-f7d3-47ad-9e5d-9b12886416f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.198041 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b4007c-f7d3-47ad-9e5d-9b12886416f2-kube-api-access-96s9f" (OuterVolumeSpecName: "kube-api-access-96s9f") pod "62b4007c-f7d3-47ad-9e5d-9b12886416f2" (UID: "62b4007c-f7d3-47ad-9e5d-9b12886416f2"). InnerVolumeSpecName "kube-api-access-96s9f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.294181 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96s9f\" (UniqueName: \"kubernetes.io/projected/62b4007c-f7d3-47ad-9e5d-9b12886416f2-kube-api-access-96s9f\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.294220 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b4007c-f7d3-47ad-9e5d-9b12886416f2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.671543 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-41f8-account-create-update-7btrr" event={"ID":"62b4007c-f7d3-47ad-9e5d-9b12886416f2","Type":"ContainerDied","Data":"de4f50055d9da6fd2bb43e37d3f7164bbe9db8a7a9f2a2dab3631d50abd1897c"}
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.671883 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de4f50055d9da6fd2bb43e37d3f7164bbe9db8a7a9f2a2dab3631d50abd1897c"
Mar 18 17:16:24 crc kubenswrapper[4939]: I0318 17:16:24.671598 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-41f8-account-create-update-7btrr"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.126119 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-7499f59567-gx45r"]
Mar 18 17:16:26 crc kubenswrapper[4939]: E0318 17:16:26.126620 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b4007c-f7d3-47ad-9e5d-9b12886416f2" containerName="mariadb-account-create-update"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.126636 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b4007c-f7d3-47ad-9e5d-9b12886416f2" containerName="mariadb-account-create-update"
Mar 18 17:16:26 crc kubenswrapper[4939]: E0318 17:16:26.126658 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c056622-aeea-4e82-87d3-e147e22f1fc5" containerName="mariadb-database-create"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.126666 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c056622-aeea-4e82-87d3-e147e22f1fc5" containerName="mariadb-database-create"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.126915 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c056622-aeea-4e82-87d3-e147e22f1fc5" containerName="mariadb-database-create"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.126938 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b4007c-f7d3-47ad-9e5d-9b12886416f2" containerName="mariadb-account-create-update"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.128494 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.130661 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.131000 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-j9c8m"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.140438 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.161823 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7499f59567-gx45r"]
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.246753 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-scripts\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.246826 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-config-data\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.247107 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-combined-ca-bundle\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.247345 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-octavia-run\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.247448 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-config-data-merged\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.349614 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-octavia-run\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.349689 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-config-data-merged\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.349751 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-scripts\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.349805 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-config-data\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.349862 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-combined-ca-bundle\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.350162 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-octavia-run\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.351327 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-config-data-merged\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.356421 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-combined-ca-bundle\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.358065 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-scripts\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.360199 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74caed9c-f767-4733-8a9e-8ffc8ed6bf1d-config-data\") pod \"octavia-api-7499f59567-gx45r\" (UID: \"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d\") " pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:26 crc kubenswrapper[4939]: I0318 17:16:26.455040 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:27 crc kubenswrapper[4939]: I0318 17:16:27.025527 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7499f59567-gx45r"]
Mar 18 17:16:27 crc kubenswrapper[4939]: W0318 17:16:27.035711 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74caed9c_f767_4733_8a9e_8ffc8ed6bf1d.slice/crio-0be0a4b16f83cf553e427ce4bb6a6094d7ae2107d330a31951fbbf10dff8f75a WatchSource:0}: Error finding container 0be0a4b16f83cf553e427ce4bb6a6094d7ae2107d330a31951fbbf10dff8f75a: Status 404 returned error can't find the container with id 0be0a4b16f83cf553e427ce4bb6a6094d7ae2107d330a31951fbbf10dff8f75a
Mar 18 17:16:27 crc kubenswrapper[4939]: I0318 17:16:27.696604 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7499f59567-gx45r" event={"ID":"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d","Type":"ContainerStarted","Data":"0be0a4b16f83cf553e427ce4bb6a6094d7ae2107d330a31951fbbf10dff8f75a"}
Mar 18 17:16:35 crc kubenswrapper[4939]: I0318 17:16:35.133603 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e"
Mar 18 17:16:35 crc kubenswrapper[4939]: E0318 17:16:35.134739 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:16:36 crc kubenswrapper[4939]: I0318 17:16:36.794640 4939 generic.go:334] "Generic (PLEG): container finished" podID="74caed9c-f767-4733-8a9e-8ffc8ed6bf1d" containerID="d9efde537fd53ba12a2069b4ac62663bd3838cd9e89207ba83969ed88aef183e" exitCode=0
Mar 18 17:16:36 crc kubenswrapper[4939]: I0318 17:16:36.794765 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7499f59567-gx45r" event={"ID":"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d","Type":"ContainerDied","Data":"d9efde537fd53ba12a2069b4ac62663bd3838cd9e89207ba83969ed88aef183e"}
Mar 18 17:16:37 crc kubenswrapper[4939]: I0318 17:16:37.805949 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7499f59567-gx45r" event={"ID":"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d","Type":"ContainerStarted","Data":"4953dcb402d4ab0f6a65f736ae3acd3cf918aa93b731d2015e46c4f3e646b6b8"}
Mar 18 17:16:37 crc kubenswrapper[4939]: I0318 17:16:37.806391 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7499f59567-gx45r"
Mar 18 17:16:37 crc kubenswrapper[4939]: I0318 17:16:37.806411 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7499f59567-gx45r" event={"ID":"74caed9c-f767-4733-8a9e-8ffc8ed6bf1d","Type":"ContainerStarted","Data":"f674bef89d7891905468ef39374f049bbcfe9224a165678a76d7e98850d1c3e0"}
Mar 18 17:16:37 crc kubenswrapper[4939]: I0318 17:16:37.836067 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-7499f59567-gx45r" podStartSLOduration=3.187047654 podStartE2EDuration="11.836048984s" podCreationTimestamp="2026-03-18 17:16:26 +0000 UTC" firstStartedPulling="2026-03-18 17:16:27.037412522 +0000 UTC m=+5951.636600143" lastFinishedPulling="2026-03-18
17:16:35.686413852 +0000 UTC m=+5960.285601473" observedRunningTime="2026-03-18 17:16:37.835217071 +0000 UTC m=+5962.434404692" watchObservedRunningTime="2026-03-18 17:16:37.836048984 +0000 UTC m=+5962.435236605" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.220184 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-26h2t" podUID="c3acd6a2-80fc-4c56-b460-187b80f55cfb" containerName="ovn-controller" probeResult="failure" output=< Mar 18 17:16:38 crc kubenswrapper[4939]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 17:16:38 crc kubenswrapper[4939]: > Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.280263 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.286861 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vnbc8" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.442244 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-26h2t-config-jqckb"] Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.443926 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.446941 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.471610 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26h2t-config-jqckb"] Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.496262 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-scripts\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.496322 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run-ovn\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.496397 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.496478 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-additional-scripts\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.496720 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbpvg\" (UniqueName: 
\"kubernetes.io/projected/ba385c0d-21f5-4507-8983-2f1023b20516-kube-api-access-xbpvg\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.496981 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-log-ovn\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.597900 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbpvg\" (UniqueName: \"kubernetes.io/projected/ba385c0d-21f5-4507-8983-2f1023b20516-kube-api-access-xbpvg\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.597975 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-log-ovn\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598044 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-scripts\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598061 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run-ovn\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598087 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598109 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-additional-scripts\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598699 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598711 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run-ovn\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598763 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-additional-scripts\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.598768 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-log-ovn\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.600655 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-scripts\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.637627 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbpvg\" (UniqueName: \"kubernetes.io/projected/ba385c0d-21f5-4507-8983-2f1023b20516-kube-api-access-xbpvg\") pod \"ovn-controller-26h2t-config-jqckb\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.780172 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:38 crc kubenswrapper[4939]: I0318 17:16:38.845627 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7499f59567-gx45r" Mar 18 17:16:39 crc kubenswrapper[4939]: W0318 17:16:39.292859 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba385c0d_21f5_4507_8983_2f1023b20516.slice/crio-b27b228498f3b7b6f095cc4a74bd67cacc019d78a648d4b4baf11361d5eb2e0c WatchSource:0}: Error finding container b27b228498f3b7b6f095cc4a74bd67cacc019d78a648d4b4baf11361d5eb2e0c: Status 404 returned error can't find the container with id b27b228498f3b7b6f095cc4a74bd67cacc019d78a648d4b4baf11361d5eb2e0c Mar 18 17:16:39 crc kubenswrapper[4939]: I0318 17:16:39.310915 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26h2t-config-jqckb"] Mar 18 17:16:39 crc kubenswrapper[4939]: I0318 17:16:39.871152 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t-config-jqckb" event={"ID":"ba385c0d-21f5-4507-8983-2f1023b20516","Type":"ContainerStarted","Data":"0df02c559a9ab54350408f200e1bf646305f5e78160a741519d119f78760c2e0"} Mar 18 17:16:39 crc kubenswrapper[4939]: I0318 17:16:39.871555 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t-config-jqckb" event={"ID":"ba385c0d-21f5-4507-8983-2f1023b20516","Type":"ContainerStarted","Data":"b27b228498f3b7b6f095cc4a74bd67cacc019d78a648d4b4baf11361d5eb2e0c"} Mar 18 17:16:39 crc kubenswrapper[4939]: I0318 17:16:39.896941 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-26h2t-config-jqckb" podStartSLOduration=1.896919441 podStartE2EDuration="1.896919441s" podCreationTimestamp="2026-03-18 17:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:39.891638661 +0000 UTC m=+5964.490826282" watchObservedRunningTime="2026-03-18 17:16:39.896919441 +0000 UTC m=+5964.496107062" Mar 18 17:16:40 crc kubenswrapper[4939]: I0318 17:16:40.908283 4939 generic.go:334] "Generic (PLEG): container finished" podID="ba385c0d-21f5-4507-8983-2f1023b20516" containerID="0df02c559a9ab54350408f200e1bf646305f5e78160a741519d119f78760c2e0" exitCode=0 Mar 18 17:16:40 crc kubenswrapper[4939]: I0318 17:16:40.908614 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t-config-jqckb" event={"ID":"ba385c0d-21f5-4507-8983-2f1023b20516","Type":"ContainerDied","Data":"0df02c559a9ab54350408f200e1bf646305f5e78160a741519d119f78760c2e0"} Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.350090 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.371482 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-scripts\") pod \"ba385c0d-21f5-4507-8983-2f1023b20516\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.371883 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-additional-scripts\") pod \"ba385c0d-21f5-4507-8983-2f1023b20516\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372013 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run\") pod \"ba385c0d-21f5-4507-8983-2f1023b20516\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372085 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run-ovn\") pod \"ba385c0d-21f5-4507-8983-2f1023b20516\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372120 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run" (OuterVolumeSpecName: "var-run") pod "ba385c0d-21f5-4507-8983-2f1023b20516" (UID: "ba385c0d-21f5-4507-8983-2f1023b20516"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372127 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ba385c0d-21f5-4507-8983-2f1023b20516" (UID: "ba385c0d-21f5-4507-8983-2f1023b20516"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372144 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-log-ovn\") pod \"ba385c0d-21f5-4507-8983-2f1023b20516\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372175 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ba385c0d-21f5-4507-8983-2f1023b20516" (UID: "ba385c0d-21f5-4507-8983-2f1023b20516"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372239 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbpvg\" (UniqueName: \"kubernetes.io/projected/ba385c0d-21f5-4507-8983-2f1023b20516-kube-api-access-xbpvg\") pod \"ba385c0d-21f5-4507-8983-2f1023b20516\" (UID: \"ba385c0d-21f5-4507-8983-2f1023b20516\") " Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372635 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ba385c0d-21f5-4507-8983-2f1023b20516" (UID: "ba385c0d-21f5-4507-8983-2f1023b20516"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372698 4939 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372712 4939 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372721 4939 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ba385c0d-21f5-4507-8983-2f1023b20516-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.372783 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-scripts" (OuterVolumeSpecName: "scripts") pod "ba385c0d-21f5-4507-8983-2f1023b20516" (UID: "ba385c0d-21f5-4507-8983-2f1023b20516"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.378167 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba385c0d-21f5-4507-8983-2f1023b20516-kube-api-access-xbpvg" (OuterVolumeSpecName: "kube-api-access-xbpvg") pod "ba385c0d-21f5-4507-8983-2f1023b20516" (UID: "ba385c0d-21f5-4507-8983-2f1023b20516"). InnerVolumeSpecName "kube-api-access-xbpvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.473594 4939 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.473636 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbpvg\" (UniqueName: \"kubernetes.io/projected/ba385c0d-21f5-4507-8983-2f1023b20516-kube-api-access-xbpvg\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.473651 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba385c0d-21f5-4507-8983-2f1023b20516-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.931792 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t-config-jqckb" event={"ID":"ba385c0d-21f5-4507-8983-2f1023b20516","Type":"ContainerDied","Data":"b27b228498f3b7b6f095cc4a74bd67cacc019d78a648d4b4baf11361d5eb2e0c"} Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.931847 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26h2t-config-jqckb" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.931850 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27b228498f3b7b6f095cc4a74bd67cacc019d78a648d4b4baf11361d5eb2e0c" Mar 18 17:16:42 crc kubenswrapper[4939]: I0318 17:16:42.986842 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-26h2t-config-jqckb"] Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.002807 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-26h2t-config-jqckb"] Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.111626 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-26h2t-config-gq7jf"] Mar 18 17:16:43 crc kubenswrapper[4939]: E0318 17:16:43.112118 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba385c0d-21f5-4507-8983-2f1023b20516" containerName="ovn-config" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.112141 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba385c0d-21f5-4507-8983-2f1023b20516" containerName="ovn-config" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.112390 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba385c0d-21f5-4507-8983-2f1023b20516" containerName="ovn-config" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.113166 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.120615 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26h2t-config-gq7jf"] Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.163437 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.190964 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-additional-scripts\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.191031 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r429\" (UniqueName: \"kubernetes.io/projected/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-kube-api-access-6r429\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.191062 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run-ovn\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.191112 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-scripts\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.191147 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-log-ovn\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.191168 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.229384 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-26h2t" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.292023 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-additional-scripts\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.292270 
4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r429\" (UniqueName: \"kubernetes.io/projected/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-kube-api-access-6r429\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.292299 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run-ovn\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.292345 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-scripts\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.292382 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-log-ovn\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.292405 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.292642 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.293202 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run-ovn\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.293490 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-additional-scripts\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.293621 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-log-ovn\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.295010 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-scripts\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.311251 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r429\" (UniqueName: \"kubernetes.io/projected/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-kube-api-access-6r429\") pod \"ovn-controller-26h2t-config-gq7jf\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.476762 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.922602 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26h2t-config-gq7jf"] Mar 18 17:16:43 crc kubenswrapper[4939]: I0318 17:16:43.940326 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t-config-gq7jf" event={"ID":"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d","Type":"ContainerStarted","Data":"441d04f809d3ce9d18dc70184099619f3978cdebbedabdf15d1fad69f53ac740"} Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.161699 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba385c0d-21f5-4507-8983-2f1023b20516" path="/var/lib/kubelet/pods/ba385c0d-21f5-4507-8983-2f1023b20516/volumes" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.425167 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-mfm6m"] Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.427072 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.428643 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.429225 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.429422 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.434825 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-mfm6m"] Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.513065 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173ed316-31b6-4fe5-928d-d9f2f3d92f01-config-data\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.513263 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173ed316-31b6-4fe5-928d-d9f2f3d92f01-scripts\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.513301 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/173ed316-31b6-4fe5-928d-d9f2f3d92f01-config-data-merged\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.513340 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/173ed316-31b6-4fe5-928d-d9f2f3d92f01-hm-ports\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.615404 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173ed316-31b6-4fe5-928d-d9f2f3d92f01-scripts\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.615448 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/173ed316-31b6-4fe5-928d-d9f2f3d92f01-config-data-merged\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.615479 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/173ed316-31b6-4fe5-928d-d9f2f3d92f01-hm-ports\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.615599 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/173ed316-31b6-4fe5-928d-d9f2f3d92f01-config-data\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.616317 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/173ed316-31b6-4fe5-928d-d9f2f3d92f01-config-data-merged\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.616814 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/173ed316-31b6-4fe5-928d-d9f2f3d92f01-hm-ports\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.623617 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173ed316-31b6-4fe5-928d-d9f2f3d92f01-scripts\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.642016 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173ed316-31b6-4fe5-928d-d9f2f3d92f01-config-data\") pod \"octavia-rsyslog-mfm6m\" (UID: \"173ed316-31b6-4fe5-928d-d9f2f3d92f01\") " pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.744522 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.951026 4939 generic.go:334] "Generic (PLEG): container finished" podID="6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" containerID="3240d6a7a2fc78391cdae9f8fa68e721842316e23132fd163d046b6a5705d877" exitCode=0 Mar 18 17:16:44 crc kubenswrapper[4939]: I0318 17:16:44.951364 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t-config-gq7jf" event={"ID":"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d","Type":"ContainerDied","Data":"3240d6a7a2fc78391cdae9f8fa68e721842316e23132fd163d046b6a5705d877"} Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.037879 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-bzl42"] Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.039847 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.042306 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.059987 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-bzl42"] Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.125391 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b56c433e-e600-46f3-ae97-c95924797844-httpd-config\") pod \"octavia-image-upload-59f8cff499-bzl42\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.125492 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b56c433e-e600-46f3-ae97-c95924797844-amphora-image\") pod \"octavia-image-upload-59f8cff499-bzl42\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.228215 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b56c433e-e600-46f3-ae97-c95924797844-httpd-config\") pod \"octavia-image-upload-59f8cff499-bzl42\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.228631 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b56c433e-e600-46f3-ae97-c95924797844-amphora-image\") pod \"octavia-image-upload-59f8cff499-bzl42\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.229187 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b56c433e-e600-46f3-ae97-c95924797844-amphora-image\") pod \"octavia-image-upload-59f8cff499-bzl42\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.247254 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b56c433e-e600-46f3-ae97-c95924797844-httpd-config\") pod \"octavia-image-upload-59f8cff499-bzl42\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.307692 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-mfm6m"] Mar 18 17:16:45 crc kubenswrapper[4939]: W0318 17:16:45.311109 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173ed316_31b6_4fe5_928d_d9f2f3d92f01.slice/crio-cd92a9caac7115390ceb0aa4d507cc3d597e9ef9c701e63037b26fd536fddbbc WatchSource:0}: Error finding container cd92a9caac7115390ceb0aa4d507cc3d597e9ef9c701e63037b26fd536fddbbc: Status 404 returned error can't find the container with id cd92a9caac7115390ceb0aa4d507cc3d597e9ef9c701e63037b26fd536fddbbc Mar 
18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.371733 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.490483 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-mfm6m"] Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.848676 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-bzl42"] Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.899499 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-nzmc9"] Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.901937 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.904099 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.909747 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-nzmc9"] Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.945872 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data-merged\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.946143 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.946410 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-scripts\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.946555 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-combined-ca-bundle\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.959843 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-mfm6m" event={"ID":"173ed316-31b6-4fe5-928d-d9f2f3d92f01","Type":"ContainerStarted","Data":"cd92a9caac7115390ceb0aa4d507cc3d597e9ef9c701e63037b26fd536fddbbc"} Mar 18 17:16:45 crc kubenswrapper[4939]: I0318 17:16:45.960979 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-bzl42" event={"ID":"b56c433e-e600-46f3-ae97-c95924797844","Type":"ContainerStarted","Data":"cc7cba3a816bb29b30dafcc075d35e3427c529637e818752fc21fbefbe8e73eb"} Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.048630 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-combined-ca-bundle\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.048734 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data-merged\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.048872 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.049018 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-scripts\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.050993 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data-merged\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.056856 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.057468 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-scripts\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.057769 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-combined-ca-bundle\") pod \"octavia-db-sync-nzmc9\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.231288 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.328432 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.354357 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-scripts\") pod \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.354474 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-log-ovn\") pod \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.354497 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r429\" (UniqueName: \"kubernetes.io/projected/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-kube-api-access-6r429\") pod \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.354568 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-additional-scripts\") pod \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.354571 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" (UID: "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.354673 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run\") pod \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.354698 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run-ovn\") pod \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\" (UID: \"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d\") " Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.355062 4939 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.355102 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" (UID: "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.355128 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run" (OuterVolumeSpecName: "var-run") pod "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" (UID: "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.355213 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" (UID: "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.355612 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-scripts" (OuterVolumeSpecName: "scripts") pod "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" (UID: "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.360460 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-kube-api-access-6r429" (OuterVolumeSpecName: "kube-api-access-6r429") pod "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" (UID: "6c3a87b4-bd9f-45ff-ad4a-73199de63d0d"). InnerVolumeSpecName "kube-api-access-6r429". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.457540 4939 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.457586 4939 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.457602 4939 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.457615 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.457628 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r429\" (UniqueName: \"kubernetes.io/projected/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d-kube-api-access-6r429\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.754453 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-nzmc9"] Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.974417 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26h2t-config-gq7jf" 
event={"ID":"6c3a87b4-bd9f-45ff-ad4a-73199de63d0d","Type":"ContainerDied","Data":"441d04f809d3ce9d18dc70184099619f3978cdebbedabdf15d1fad69f53ac740"} Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.974465 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="441d04f809d3ce9d18dc70184099619f3978cdebbedabdf15d1fad69f53ac740" Mar 18 17:16:46 crc kubenswrapper[4939]: I0318 17:16:46.974528 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26h2t-config-gq7jf" Mar 18 17:16:47 crc kubenswrapper[4939]: I0318 17:16:47.408393 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-26h2t-config-gq7jf"] Mar 18 17:16:47 crc kubenswrapper[4939]: I0318 17:16:47.422736 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-26h2t-config-gq7jf"] Mar 18 17:16:47 crc kubenswrapper[4939]: I0318 17:16:47.984124 4939 generic.go:334] "Generic (PLEG): container finished" podID="21184d1f-a09e-4dc0-82d7-ae468b35ea5d" containerID="df067ebb9b8e271ad4d1c9add53af1f6839408ee92221a122ce31fa46294e217" exitCode=0 Mar 18 17:16:47 crc kubenswrapper[4939]: I0318 17:16:47.985518 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nzmc9" event={"ID":"21184d1f-a09e-4dc0-82d7-ae468b35ea5d","Type":"ContainerDied","Data":"df067ebb9b8e271ad4d1c9add53af1f6839408ee92221a122ce31fa46294e217"} Mar 18 17:16:47 crc kubenswrapper[4939]: I0318 17:16:47.985649 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nzmc9" event={"ID":"21184d1f-a09e-4dc0-82d7-ae468b35ea5d","Type":"ContainerStarted","Data":"2c9b0afa8cb6721d16a2dfe13090ae6b670038fc2d5d919f7eb90bcc26cc1736"} Mar 18 17:16:48 crc kubenswrapper[4939]: I0318 17:16:48.133601 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:16:48 crc kubenswrapper[4939]: E0318 17:16:48.134096 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:16:48 crc kubenswrapper[4939]: I0318 17:16:48.144587 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" path="/var/lib/kubelet/pods/6c3a87b4-bd9f-45ff-ad4a-73199de63d0d/volumes" Mar 18 17:16:48 crc kubenswrapper[4939]: I0318 17:16:48.996052 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nzmc9" event={"ID":"21184d1f-a09e-4dc0-82d7-ae468b35ea5d","Type":"ContainerStarted","Data":"a16a8cce335f614963bcd245e46a37bbfd3839680f851239b16183abe352880a"} Mar 18 17:16:49 crc kubenswrapper[4939]: I0318 17:16:49.015993 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-nzmc9" podStartSLOduration=4.015973071 podStartE2EDuration="4.015973071s" podCreationTimestamp="2026-03-18 17:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:49.014385366 +0000 UTC m=+5973.613572987" watchObservedRunningTime="2026-03-18 17:16:49.015973071 +0000 UTC 
m=+5973.615160692" Mar 18 17:16:50 crc kubenswrapper[4939]: I0318 17:16:50.007345 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-mfm6m" event={"ID":"173ed316-31b6-4fe5-928d-d9f2f3d92f01","Type":"ContainerStarted","Data":"b87847fb902223e25866a351176084f31ebbd7a37034a0eeb74b4f21aa1c9dc2"} Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.027024 4939 generic.go:334] "Generic (PLEG): container finished" podID="173ed316-31b6-4fe5-928d-d9f2f3d92f01" containerID="b87847fb902223e25866a351176084f31ebbd7a37034a0eeb74b4f21aa1c9dc2" exitCode=0 Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.027112 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-mfm6m" event={"ID":"173ed316-31b6-4fe5-928d-d9f2f3d92f01","Type":"ContainerDied","Data":"b87847fb902223e25866a351176084f31ebbd7a37034a0eeb74b4f21aa1c9dc2"} Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.029872 4939 generic.go:334] "Generic (PLEG): container finished" podID="21184d1f-a09e-4dc0-82d7-ae468b35ea5d" containerID="a16a8cce335f614963bcd245e46a37bbfd3839680f851239b16183abe352880a" exitCode=0 Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.029912 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nzmc9" event={"ID":"21184d1f-a09e-4dc0-82d7-ae468b35ea5d","Type":"ContainerDied","Data":"a16a8cce335f614963bcd245e46a37bbfd3839680f851239b16183abe352880a"} Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.728642 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdlrp"] Mar 18 17:16:52 crc kubenswrapper[4939]: E0318 17:16:52.729468 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" containerName="ovn-config" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.729488 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" containerName="ovn-config" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.729714 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c3a87b4-bd9f-45ff-ad4a-73199de63d0d" containerName="ovn-config" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.733111 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.743678 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdlrp"] Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.885366 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqbd\" (UniqueName: \"kubernetes.io/projected/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-kube-api-access-5wqbd\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.885598 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-catalog-content\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.886056 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-utilities\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.988617 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-catalog-content\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.989045 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-utilities\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.989251 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqbd\" (UniqueName: \"kubernetes.io/projected/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-kube-api-access-5wqbd\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.989346 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-utilities\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:52 crc kubenswrapper[4939]: I0318 17:16:52.989192 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-catalog-content\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:53 crc kubenswrapper[4939]: I0318 17:16:53.009703 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5wqbd\" (UniqueName: \"kubernetes.io/projected/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-kube-api-access-5wqbd\") pod \"certified-operators-hdlrp\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:53 crc kubenswrapper[4939]: I0318 17:16:53.062639 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.107636 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nzmc9" event={"ID":"21184d1f-a09e-4dc0-82d7-ae468b35ea5d","Type":"ContainerDied","Data":"2c9b0afa8cb6721d16a2dfe13090ae6b670038fc2d5d919f7eb90bcc26cc1736"} Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.108081 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9b0afa8cb6721d16a2dfe13090ae6b670038fc2d5d919f7eb90bcc26cc1736" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.159151 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.262832 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data-merged\") pod \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.262883 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-scripts\") pod \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.262922 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-combined-ca-bundle\") pod \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.263029 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data\") pod \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\" (UID: \"21184d1f-a09e-4dc0-82d7-ae468b35ea5d\") " Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.269174 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-scripts" (OuterVolumeSpecName: "scripts") pod "21184d1f-a09e-4dc0-82d7-ae468b35ea5d" (UID: "21184d1f-a09e-4dc0-82d7-ae468b35ea5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.274086 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data" (OuterVolumeSpecName: "config-data") pod "21184d1f-a09e-4dc0-82d7-ae468b35ea5d" (UID: "21184d1f-a09e-4dc0-82d7-ae468b35ea5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.288327 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "21184d1f-a09e-4dc0-82d7-ae468b35ea5d" (UID: "21184d1f-a09e-4dc0-82d7-ae468b35ea5d"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.297959 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21184d1f-a09e-4dc0-82d7-ae468b35ea5d" (UID: "21184d1f-a09e-4dc0-82d7-ae468b35ea5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.365106 4939 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data-merged\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.365136 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.365145 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.365153 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21184d1f-a09e-4dc0-82d7-ae468b35ea5d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:56 crc kubenswrapper[4939]: I0318 17:16:56.834951 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdlrp"] Mar 18 17:16:56 crc kubenswrapper[4939]: W0318 17:16:56.842490 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb12421_4f8c_4b15_bfed_4d5b3b76927e.slice/crio-d2beb7935c0857201d042f7b925adc4dcd2a138f35e864ac7ebfb82088593409 WatchSource:0}: Error finding container d2beb7935c0857201d042f7b925adc4dcd2a138f35e864ac7ebfb82088593409: Status 404 returned error can't find the container with id d2beb7935c0857201d042f7b925adc4dcd2a138f35e864ac7ebfb82088593409 Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.135234 4939 generic.go:334] "Generic (PLEG): container finished" podID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerID="015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6" exitCode=0 Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.135333 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlrp" event={"ID":"4fb12421-4f8c-4b15-bfed-4d5b3b76927e","Type":"ContainerDied","Data":"015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6"} Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.135618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlrp" 
event={"ID":"4fb12421-4f8c-4b15-bfed-4d5b3b76927e","Type":"ContainerStarted","Data":"d2beb7935c0857201d042f7b925adc4dcd2a138f35e864ac7ebfb82088593409"} Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.140294 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-bzl42" event={"ID":"b56c433e-e600-46f3-ae97-c95924797844","Type":"ContainerStarted","Data":"c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d"} Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.143353 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-nzmc9" Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.143417 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-mfm6m" event={"ID":"173ed316-31b6-4fe5-928d-d9f2f3d92f01","Type":"ContainerStarted","Data":"5781fd4d8ba9e834d980eb9dbae13b024aa5c3a353125229cbc79c591eb2fb27"} Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.145120 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:16:57 crc kubenswrapper[4939]: I0318 17:16:57.196518 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-mfm6m" podStartSLOduration=1.762802761 podStartE2EDuration="13.196476469s" podCreationTimestamp="2026-03-18 17:16:44 +0000 UTC" firstStartedPulling="2026-03-18 17:16:45.313823722 +0000 UTC m=+5969.913011353" lastFinishedPulling="2026-03-18 17:16:56.74749744 +0000 UTC m=+5981.346685061" observedRunningTime="2026-03-18 17:16:57.185354234 +0000 UTC m=+5981.784541875" watchObservedRunningTime="2026-03-18 17:16:57.196476469 +0000 UTC m=+5981.795664090" Mar 18 17:16:58 crc kubenswrapper[4939]: I0318 17:16:58.156877 4939 generic.go:334] "Generic (PLEG): container finished" podID="b56c433e-e600-46f3-ae97-c95924797844" containerID="c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d" exitCode=0 Mar 18 17:16:58 crc kubenswrapper[4939]: I0318 17:16:58.156957 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-bzl42" event={"ID":"b56c433e-e600-46f3-ae97-c95924797844","Type":"ContainerDied","Data":"c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d"} Mar 18 17:17:00 crc kubenswrapper[4939]: I0318 17:17:00.133842 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:17:00 crc kubenswrapper[4939]: E0318 17:17:00.134454 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:17:02 crc kubenswrapper[4939]: I0318 17:17:02.095084 4939 scope.go:117] "RemoveContainer" containerID="38f1e62a3d057a060dd9306d1e98c0ccb4ac938d075be71053596b0e2bfa4d93" Mar 18 17:17:02 crc kubenswrapper[4939]: I0318 17:17:02.523233 4939 scope.go:117] "RemoveContainer" containerID="581fcb91747891802ddd8b601ff1828e2326878661ff58b49f79189540eeefc5" Mar 18 17:17:03 crc kubenswrapper[4939]: I0318 17:17:03.451634 4939 scope.go:117] "RemoveContainer" containerID="f6c3c0e5583adf4acfcef5b73e9ec4520dc7fc613f44f0f90425361a8bafe69a" Mar 18 
17:17:05 crc kubenswrapper[4939]: I0318 17:17:05.050543 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7499f59567-gx45r" Mar 18 17:17:05 crc kubenswrapper[4939]: I0318 17:17:05.061770 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7499f59567-gx45r" Mar 18 17:17:05 crc kubenswrapper[4939]: I0318 17:17:05.233378 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlrp" event={"ID":"4fb12421-4f8c-4b15-bfed-4d5b3b76927e","Type":"ContainerStarted","Data":"6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d"} Mar 18 17:17:05 crc kubenswrapper[4939]: I0318 17:17:05.237362 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-bzl42" event={"ID":"b56c433e-e600-46f3-ae97-c95924797844","Type":"ContainerStarted","Data":"38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca"} Mar 18 17:17:05 crc kubenswrapper[4939]: I0318 17:17:05.273913 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-bzl42" podStartSLOduration=2.877260168 podStartE2EDuration="21.27389264s" podCreationTimestamp="2026-03-18 17:16:44 +0000 UTC" firstStartedPulling="2026-03-18 17:16:45.844481521 +0000 UTC m=+5970.443669152" lastFinishedPulling="2026-03-18 17:17:04.241113993 +0000 UTC m=+5988.840301624" observedRunningTime="2026-03-18 17:17:05.262833016 +0000 UTC m=+5989.862020677" watchObservedRunningTime="2026-03-18 17:17:05.27389264 +0000 UTC m=+5989.873080261" Mar 18 17:17:06 crc kubenswrapper[4939]: I0318 17:17:06.247475 4939 generic.go:334] "Generic (PLEG): container finished" podID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerID="6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d" exitCode=0 Mar 18 17:17:06 crc kubenswrapper[4939]: I0318 17:17:06.247555 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlrp" event={"ID":"4fb12421-4f8c-4b15-bfed-4d5b3b76927e","Type":"ContainerDied","Data":"6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d"} Mar 18 17:17:06 crc kubenswrapper[4939]: I0318 17:17:06.249808 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:17:07 crc kubenswrapper[4939]: I0318 17:17:07.271686 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlrp" event={"ID":"4fb12421-4f8c-4b15-bfed-4d5b3b76927e","Type":"ContainerStarted","Data":"7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98"} Mar 18 17:17:07 crc kubenswrapper[4939]: I0318 17:17:07.334983 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdlrp" podStartSLOduration=5.537650607 podStartE2EDuration="15.334960887s" podCreationTimestamp="2026-03-18 17:16:52 +0000 UTC" firstStartedPulling="2026-03-18 17:16:57.137632958 +0000 UTC m=+5981.736820569" lastFinishedPulling="2026-03-18 17:17:06.934943228 +0000 UTC m=+5991.534130849" observedRunningTime="2026-03-18 17:17:07.295022773 +0000 UTC m=+5991.894210394" watchObservedRunningTime="2026-03-18 17:17:07.334960887 +0000 UTC m=+5991.934148508" Mar 18 17:17:12 crc kubenswrapper[4939]: I0318 17:17:12.132932 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:17:12 crc 
kubenswrapper[4939]: E0318 17:17:12.133757 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:17:13 crc kubenswrapper[4939]: I0318 17:17:13.064915 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:17:13 crc kubenswrapper[4939]: I0318 17:17:13.065282 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:17:13 crc kubenswrapper[4939]: I0318 17:17:13.129271 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:17:13 crc kubenswrapper[4939]: I0318 17:17:13.454885 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:17:13 crc kubenswrapper[4939]: I0318 17:17:13.508842 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdlrp"] Mar 18 17:17:14 crc kubenswrapper[4939]: I0318 17:17:14.774551 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-mfm6m" Mar 18 17:17:15 crc kubenswrapper[4939]: I0318 17:17:15.340415 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdlrp" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="registry-server" containerID="cri-o://7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98" gracePeriod=2 Mar 18 17:17:15 crc kubenswrapper[4939]: I0318 17:17:15.848082 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:17:15 crc kubenswrapper[4939]: I0318 17:17:15.976561 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-catalog-content\") pod \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " Mar 18 17:17:15 crc kubenswrapper[4939]: I0318 17:17:15.976735 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-utilities\") pod \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " Mar 18 17:17:15 crc kubenswrapper[4939]: I0318 17:17:15.976825 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqbd\" (UniqueName: \"kubernetes.io/projected/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-kube-api-access-5wqbd\") pod \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\" (UID: \"4fb12421-4f8c-4b15-bfed-4d5b3b76927e\") " Mar 18 17:17:15 crc kubenswrapper[4939]: I0318 17:17:15.977842 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-utilities" (OuterVolumeSpecName: "utilities") pod "4fb12421-4f8c-4b15-bfed-4d5b3b76927e" (UID: "4fb12421-4f8c-4b15-bfed-4d5b3b76927e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:17:15 crc kubenswrapper[4939]: I0318 17:17:15.983966 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-kube-api-access-5wqbd" (OuterVolumeSpecName: "kube-api-access-5wqbd") pod "4fb12421-4f8c-4b15-bfed-4d5b3b76927e" (UID: "4fb12421-4f8c-4b15-bfed-4d5b3b76927e"). InnerVolumeSpecName "kube-api-access-5wqbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.024441 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fb12421-4f8c-4b15-bfed-4d5b3b76927e" (UID: "4fb12421-4f8c-4b15-bfed-4d5b3b76927e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.079141 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqbd\" (UniqueName: \"kubernetes.io/projected/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-kube-api-access-5wqbd\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.079171 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.079182 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fb12421-4f8c-4b15-bfed-4d5b3b76927e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.352705 4939 generic.go:334] "Generic (PLEG): container finished" podID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerID="7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98" exitCode=0 Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.352804 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlrp" event={"ID":"4fb12421-4f8c-4b15-bfed-4d5b3b76927e","Type":"ContainerDied","Data":"7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98"} Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.353311 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlrp" event={"ID":"4fb12421-4f8c-4b15-bfed-4d5b3b76927e","Type":"ContainerDied","Data":"d2beb7935c0857201d042f7b925adc4dcd2a138f35e864ac7ebfb82088593409"} Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.353364 4939 scope.go:117] "RemoveContainer" containerID="7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.352879 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdlrp" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.378637 4939 scope.go:117] "RemoveContainer" containerID="6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.392307 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdlrp"] Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.404116 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdlrp"] Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.404452 4939 scope.go:117] "RemoveContainer" containerID="015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.472477 4939 scope.go:117] "RemoveContainer" containerID="7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98" Mar 18 17:17:16 crc kubenswrapper[4939]: E0318 17:17:16.472903 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98\": container with ID starting with 7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98 not found: ID does not exist" containerID="7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.472938 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98"} err="failed to get container status \"7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98\": rpc error: code = NotFound desc = could not find container \"7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98\": container with ID starting with 7c91dace8de11f1dfa2b987a35503dc781605519464a2591d6a17ed1c2837f98 not found: ID does not exist" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.472962 4939 scope.go:117] "RemoveContainer" containerID="6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d" Mar 18 17:17:16 crc kubenswrapper[4939]: E0318 17:17:16.473316 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d\": container with ID starting with 6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d not found: ID does not exist" containerID="6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.473342 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d"} err="failed to get container status \"6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d\": rpc error: code = NotFound desc = could not find container \"6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d\": container with ID starting with 6aa52561228bd59da66410de8cb05ef32df33a3eaed834946aac7f17176d795d not found: ID does not exist" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.473359 4939 scope.go:117] "RemoveContainer" containerID="015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6" Mar 18 17:17:16 crc kubenswrapper[4939]: E0318 17:17:16.473760 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6\": container with ID starting with 015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6 not found: ID does not exist" containerID="015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6" Mar 18 17:17:16 crc kubenswrapper[4939]: I0318 17:17:16.473787 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6"} err="failed to get container status \"015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6\": rpc error: code = NotFound desc = could not find container \"015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6\": container with ID starting with 015ed0ae88318dbb4e15052a8ea9a3237c77d31cdfdc443f0583c5c0e3a04bf6 not found: ID does not exist" Mar 18 17:17:18 crc kubenswrapper[4939]: I0318 17:17:18.149920 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" path="/var/lib/kubelet/pods/4fb12421-4f8c-4b15-bfed-4d5b3b76927e/volumes" Mar 18 17:17:27 crc kubenswrapper[4939]: I0318 17:17:27.133380 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:17:27 crc kubenswrapper[4939]: E0318 17:17:27.134347 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:17:30 crc kubenswrapper[4939]: I0318 17:17:30.690550 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-bzl42"] Mar 18 17:17:30 crc kubenswrapper[4939]: I0318 17:17:30.691561 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-bzl42" podUID="b56c433e-e600-46f3-ae97-c95924797844" containerName="octavia-amphora-httpd" containerID="cri-o://38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca" gracePeriod=30 Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.174026 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.281073 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b56c433e-e600-46f3-ae97-c95924797844-amphora-image\") pod \"b56c433e-e600-46f3-ae97-c95924797844\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.281303 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b56c433e-e600-46f3-ae97-c95924797844-httpd-config\") pod \"b56c433e-e600-46f3-ae97-c95924797844\" (UID: \"b56c433e-e600-46f3-ae97-c95924797844\") " Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.307819 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56c433e-e600-46f3-ae97-c95924797844-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b56c433e-e600-46f3-ae97-c95924797844" (UID: "b56c433e-e600-46f3-ae97-c95924797844"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.342242 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b56c433e-e600-46f3-ae97-c95924797844-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "b56c433e-e600-46f3-ae97-c95924797844" (UID: "b56c433e-e600-46f3-ae97-c95924797844"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.383565 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b56c433e-e600-46f3-ae97-c95924797844-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.383618 4939 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b56c433e-e600-46f3-ae97-c95924797844-amphora-image\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.489783 4939 generic.go:334] "Generic (PLEG): container finished" podID="b56c433e-e600-46f3-ae97-c95924797844" containerID="38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca" exitCode=0 Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.489827 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-bzl42" event={"ID":"b56c433e-e600-46f3-ae97-c95924797844","Type":"ContainerDied","Data":"38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca"} Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.489854 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-bzl42" event={"ID":"b56c433e-e600-46f3-ae97-c95924797844","Type":"ContainerDied","Data":"cc7cba3a816bb29b30dafcc075d35e3427c529637e818752fc21fbefbe8e73eb"} Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.489873 4939 scope.go:117] "RemoveContainer" containerID="38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.489867 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-bzl42" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.523637 4939 scope.go:117] "RemoveContainer" containerID="c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.531977 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-bzl42"] Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.539498 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-bzl42"] Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.556261 4939 scope.go:117] "RemoveContainer" containerID="38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca" Mar 18 17:17:31 crc kubenswrapper[4939]: E0318 17:17:31.557731 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca\": container with ID starting with 38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca not found: ID does not exist" containerID="38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.557767 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca"} err="failed to get container status \"38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca\": rpc error: code = NotFound desc = could not find container \"38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca\": container with ID starting with 38208c46813c75b5b60f64065d96f18abf59e577d4406b155587db0c3cb3b9ca not found: ID does not exist" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.557791 4939 scope.go:117] "RemoveContainer" containerID="c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d" Mar 18 17:17:31 crc kubenswrapper[4939]: E0318 17:17:31.558058 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d\": container with ID starting with c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d not found: ID does not exist" containerID="c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d" Mar 18 17:17:31 crc kubenswrapper[4939]: I0318 17:17:31.558078 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d"} err="failed to get container status \"c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d\": rpc error: code = NotFound desc = could not find container \"c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d\": container with ID starting with c80129606dbafee650722cf1c4fd47f84f01c505a6cdecf5272ac5388918433d not found: ID does not exist" Mar 18 17:17:32 crc kubenswrapper[4939]: I0318 17:17:32.149848 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56c433e-e600-46f3-ae97-c95924797844" path="/var/lib/kubelet/pods/b56c433e-e600-46f3-ae97-c95924797844/volumes" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.089544 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-w8t75"] Mar 18 17:17:37 crc kubenswrapper[4939]: E0318 
17:17:37.090055 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56c433e-e600-46f3-ae97-c95924797844" containerName="octavia-amphora-httpd" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090070 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56c433e-e600-46f3-ae97-c95924797844" containerName="octavia-amphora-httpd" Mar 18 17:17:37 crc kubenswrapper[4939]: E0318 17:17:37.090084 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="extract-utilities" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090092 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="extract-utilities" Mar 18 17:17:37 crc kubenswrapper[4939]: E0318 17:17:37.090116 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="registry-server" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090125 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="registry-server" Mar 18 17:17:37 crc kubenswrapper[4939]: E0318 17:17:37.090139 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56c433e-e600-46f3-ae97-c95924797844" containerName="init" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090147 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56c433e-e600-46f3-ae97-c95924797844" containerName="init" Mar 18 17:17:37 crc kubenswrapper[4939]: E0318 17:17:37.090159 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21184d1f-a09e-4dc0-82d7-ae468b35ea5d" containerName="octavia-db-sync" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090165 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="21184d1f-a09e-4dc0-82d7-ae468b35ea5d" containerName="octavia-db-sync" Mar 18 17:17:37 crc kubenswrapper[4939]: E0318 17:17:37.090192 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="extract-content" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090199 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="extract-content" Mar 18 17:17:37 crc kubenswrapper[4939]: E0318 17:17:37.090210 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21184d1f-a09e-4dc0-82d7-ae468b35ea5d" containerName="init" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090217 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="21184d1f-a09e-4dc0-82d7-ae468b35ea5d" containerName="init" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090441 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="21184d1f-a09e-4dc0-82d7-ae468b35ea5d" containerName="octavia-db-sync" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090457 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56c433e-e600-46f3-ae97-c95924797844" containerName="octavia-amphora-httpd" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.090482 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb12421-4f8c-4b15-bfed-4d5b3b76927e" containerName="registry-server" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.091685 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.095485 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.123910 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-w8t75"] Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.208275 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5037cf17-2a06-4fa1-a218-1b12e963b853-httpd-config\") pod \"octavia-image-upload-59f8cff499-w8t75\" (UID: \"5037cf17-2a06-4fa1-a218-1b12e963b853\") " pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.208375 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5037cf17-2a06-4fa1-a218-1b12e963b853-amphora-image\") pod \"octavia-image-upload-59f8cff499-w8t75\" (UID: \"5037cf17-2a06-4fa1-a218-1b12e963b853\") " pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.310459 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5037cf17-2a06-4fa1-a218-1b12e963b853-httpd-config\") pod \"octavia-image-upload-59f8cff499-w8t75\" (UID: \"5037cf17-2a06-4fa1-a218-1b12e963b853\") " pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.310578 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5037cf17-2a06-4fa1-a218-1b12e963b853-amphora-image\") pod \"octavia-image-upload-59f8cff499-w8t75\" (UID: \"5037cf17-2a06-4fa1-a218-1b12e963b853\") " pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.311278 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5037cf17-2a06-4fa1-a218-1b12e963b853-amphora-image\") pod \"octavia-image-upload-59f8cff499-w8t75\" (UID: \"5037cf17-2a06-4fa1-a218-1b12e963b853\") " pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.317223 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5037cf17-2a06-4fa1-a218-1b12e963b853-httpd-config\") pod \"octavia-image-upload-59f8cff499-w8t75\" (UID: \"5037cf17-2a06-4fa1-a218-1b12e963b853\") " pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.416809 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-w8t75" Mar 18 17:17:37 crc kubenswrapper[4939]: I0318 17:17:37.856860 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-w8t75"] Mar 18 17:17:38 crc kubenswrapper[4939]: I0318 17:17:38.566797 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-w8t75" event={"ID":"5037cf17-2a06-4fa1-a218-1b12e963b853","Type":"ContainerStarted","Data":"f482228b1a8a84a3b3fd5895956dffa7d441eb89b2605ca8a9f9a6b59ade202c"} Mar 18 17:17:39 crc kubenswrapper[4939]: I0318 17:17:39.580092 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-w8t75" event={"ID":"5037cf17-2a06-4fa1-a218-1b12e963b853","Type":"ContainerStarted","Data":"e33ffa8d74de312446f9b4fca5b04efc88176281fe032e233b43c76399f092a7"} Mar 18 17:17:40 crc kubenswrapper[4939]: I0318 17:17:40.593477 4939 generic.go:334] "Generic (PLEG): container finished" podID="5037cf17-2a06-4fa1-a218-1b12e963b853" containerID="e33ffa8d74de312446f9b4fca5b04efc88176281fe032e233b43c76399f092a7" exitCode=0 Mar 18 17:17:40 crc kubenswrapper[4939]: I0318 17:17:40.593577 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-w8t75" event={"ID":"5037cf17-2a06-4fa1-a218-1b12e963b853","Type":"ContainerDied","Data":"e33ffa8d74de312446f9b4fca5b04efc88176281fe032e233b43c76399f092a7"} Mar 18 17:17:41 crc kubenswrapper[4939]: I0318 17:17:41.133434 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:17:41 crc kubenswrapper[4939]: E0318 17:17:41.133784 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:17:42 crc kubenswrapper[4939]: I0318 17:17:42.611171 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-w8t75" event={"ID":"5037cf17-2a06-4fa1-a218-1b12e963b853","Type":"ContainerStarted","Data":"e24755d8ba20e98bd767ddc99f5fa61cc7f3a1d3262449662654ec214d4758ee"} Mar 18 17:17:42 crc kubenswrapper[4939]: I0318 17:17:42.625160 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-w8t75" podStartSLOduration=1.294164611 podStartE2EDuration="5.625142525s" podCreationTimestamp="2026-03-18 17:17:37 +0000 UTC" firstStartedPulling="2026-03-18 17:17:37.863513812 +0000 UTC m=+6022.462701433" lastFinishedPulling="2026-03-18 17:17:42.194491726 +0000 UTC m=+6026.793679347" observedRunningTime="2026-03-18 17:17:42.621876313 +0000 UTC m=+6027.221063924" watchObservedRunningTime="2026-03-18 17:17:42.625142525 +0000 UTC m=+6027.224330146" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.032164 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-ngrvz"] Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.035177 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.038129 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.038291 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.039936 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.047076 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-ngrvz"] Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.152511 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-scripts\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.152585 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-config-data\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.152614 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-hm-ports\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.152936 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-config-data-merged\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.153147 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-combined-ca-bundle\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.153271 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-amphora-certs\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.255697 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-config-data-merged\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 
18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.255783 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-combined-ca-bundle\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.255856 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-amphora-certs\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.256004 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-scripts\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.256046 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-config-data\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.256078 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-hm-ports\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.257332 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-config-data-merged\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.257397 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-hm-ports\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.262187 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-scripts\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.262264 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-config-data\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.263539 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-combined-ca-bundle\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.274128 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef-amphora-certs\") pod \"octavia-healthmanager-ngrvz\" (UID: \"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef\") " pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.376231 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:50 crc kubenswrapper[4939]: I0318 17:17:50.961657 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-ngrvz"] Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.704814 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-ngrvz" event={"ID":"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef","Type":"ContainerStarted","Data":"019da48f6bcb60fd41fe49ca15d60e7a3f70376e359378e28515e826f5a9d12c"} Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.705149 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-ngrvz" event={"ID":"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef","Type":"ContainerStarted","Data":"987c56b943227ef562075bfcbd77fe46efaf88cbb02830d5298acd50707f8913"} Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.899432 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-k2wvl"] Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.901657 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.903981 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.904243 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.919402 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-k2wvl"] Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.991675 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-config-data-merged\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.991908 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-hm-ports\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.992010 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-scripts\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.992029 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-amphora-certs\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.992214 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-config-data\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:51 crc kubenswrapper[4939]: I0318 17:17:51.992268 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-combined-ca-bundle\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.093936 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-scripts\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.093991 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-amphora-certs\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.094056 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-config-data\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.094076 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-combined-ca-bundle\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.094185 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-config-data-merged\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.094261 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-hm-ports\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.094785 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-config-data-merged\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.095292 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-hm-ports\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.102447 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-config-data\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.103263 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-amphora-certs\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.105093 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-combined-ca-bundle\") pod \"octavia-housekeeping-k2wvl\" (UID: 
\"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.105192 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbedb8e8-2834-4e62-97a8-8a9df6ab3091-scripts\") pod \"octavia-housekeeping-k2wvl\" (UID: \"bbedb8e8-2834-4e62-97a8-8a9df6ab3091\") " pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.219955 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:52 crc kubenswrapper[4939]: W0318 17:17:52.752468 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbedb8e8_2834_4e62_97a8_8a9df6ab3091.slice/crio-a3545296eb841d852693fbcaea1acfdc14bb6bbc88d519c5daa709c7b8a81e1c WatchSource:0}: Error finding container a3545296eb841d852693fbcaea1acfdc14bb6bbc88d519c5daa709c7b8a81e1c: Status 404 returned error can't find the container with id a3545296eb841d852693fbcaea1acfdc14bb6bbc88d519c5daa709c7b8a81e1c Mar 18 17:17:52 crc kubenswrapper[4939]: I0318 17:17:52.753564 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-k2wvl"] Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.044070 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-vhvzw"] Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.046091 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.048439 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.048816 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.064197 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-vhvzw"] Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.118269 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-scripts\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.118351 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-combined-ca-bundle\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.118623 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-config-data\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.118800 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-amphora-certs\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.118904 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/860e2b80-254e-4caa-813d-3e0b040d3798-config-data-merged\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.118973 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/860e2b80-254e-4caa-813d-3e0b040d3798-hm-ports\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.221277 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-combined-ca-bundle\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.221391 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-config-data\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.221475 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-amphora-certs\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.221518 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/860e2b80-254e-4caa-813d-3e0b040d3798-config-data-merged\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.221550 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/860e2b80-254e-4caa-813d-3e0b040d3798-hm-ports\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.221614 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-scripts\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.222315 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/860e2b80-254e-4caa-813d-3e0b040d3798-config-data-merged\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" 
Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.224206 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/860e2b80-254e-4caa-813d-3e0b040d3798-hm-ports\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.230476 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-combined-ca-bundle\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.231417 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-amphora-certs\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.232308 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-config-data\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.233328 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/860e2b80-254e-4caa-813d-3e0b040d3798-scripts\") pod \"octavia-worker-vhvzw\" (UID: \"860e2b80-254e-4caa-813d-3e0b040d3798\") " pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.370937 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-vhvzw" Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.726297 4939 generic.go:334] "Generic (PLEG): container finished" podID="0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef" containerID="019da48f6bcb60fd41fe49ca15d60e7a3f70376e359378e28515e826f5a9d12c" exitCode=0 Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.726395 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-ngrvz" event={"ID":"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef","Type":"ContainerDied","Data":"019da48f6bcb60fd41fe49ca15d60e7a3f70376e359378e28515e826f5a9d12c"} Mar 18 17:17:53 crc kubenswrapper[4939]: I0318 17:17:53.728634 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-k2wvl" event={"ID":"bbedb8e8-2834-4e62-97a8-8a9df6ab3091","Type":"ContainerStarted","Data":"a3545296eb841d852693fbcaea1acfdc14bb6bbc88d519c5daa709c7b8a81e1c"} Mar 18 17:17:54 crc kubenswrapper[4939]: I0318 17:17:54.036355 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-vhvzw"] Mar 18 17:17:54 crc kubenswrapper[4939]: I0318 17:17:54.741492 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vhvzw" event={"ID":"860e2b80-254e-4caa-813d-3e0b040d3798","Type":"ContainerStarted","Data":"50b0b7d1a1f1f790ce892460d539233d5c70dad66e65a75b10e06667a1d64a84"} Mar 18 17:17:54 crc kubenswrapper[4939]: I0318 17:17:54.745072 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-ngrvz" event={"ID":"0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef","Type":"ContainerStarted","Data":"4ad6f2a82b65b15398c53dbbdeeb1a9b469991a6f4dc4c0a5a3adffc00d55c25"} Mar 18 17:17:54 crc kubenswrapper[4939]: I0318 17:17:54.745304 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:17:54 crc kubenswrapper[4939]: I0318 17:17:54.778163 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-ngrvz" podStartSLOduration=4.77814126 podStartE2EDuration="4.77814126s" podCreationTimestamp="2026-03-18 17:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:17:54.770476542 +0000 UTC m=+6039.369664183" watchObservedRunningTime="2026-03-18 17:17:54.77814126 +0000 UTC m=+6039.377328881" Mar 18 17:17:55 crc kubenswrapper[4939]: I0318 17:17:55.133118 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:17:55 crc kubenswrapper[4939]: E0318 17:17:55.133434 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:17:55 crc kubenswrapper[4939]: I0318 17:17:55.334956 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-ngrvz"] Mar 18 17:17:55 crc kubenswrapper[4939]: I0318 17:17:55.762138 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-k2wvl" 
event={"ID":"bbedb8e8-2834-4e62-97a8-8a9df6ab3091","Type":"ContainerStarted","Data":"35e73f113c1b881d8bbfe39d79e2bed21f156279b9fa960c296703c00f1d75f6"} Mar 18 17:17:56 crc kubenswrapper[4939]: I0318 17:17:56.785716 4939 generic.go:334] "Generic (PLEG): container finished" podID="bbedb8e8-2834-4e62-97a8-8a9df6ab3091" containerID="35e73f113c1b881d8bbfe39d79e2bed21f156279b9fa960c296703c00f1d75f6" exitCode=0 Mar 18 17:17:56 crc kubenswrapper[4939]: I0318 17:17:56.785820 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-k2wvl" event={"ID":"bbedb8e8-2834-4e62-97a8-8a9df6ab3091","Type":"ContainerDied","Data":"35e73f113c1b881d8bbfe39d79e2bed21f156279b9fa960c296703c00f1d75f6"} Mar 18 17:17:56 crc kubenswrapper[4939]: I0318 17:17:56.791232 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vhvzw" event={"ID":"860e2b80-254e-4caa-813d-3e0b040d3798","Type":"ContainerStarted","Data":"dc34eac0d0749b9a1170d40b62307098e340403e2b30bf045f615c83712fdc3f"} Mar 18 17:17:57 crc kubenswrapper[4939]: I0318 17:17:57.804192 4939 generic.go:334] "Generic (PLEG): container finished" podID="860e2b80-254e-4caa-813d-3e0b040d3798" containerID="dc34eac0d0749b9a1170d40b62307098e340403e2b30bf045f615c83712fdc3f" exitCode=0 Mar 18 17:17:57 crc kubenswrapper[4939]: I0318 17:17:57.804288 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vhvzw" event={"ID":"860e2b80-254e-4caa-813d-3e0b040d3798","Type":"ContainerDied","Data":"dc34eac0d0749b9a1170d40b62307098e340403e2b30bf045f615c83712fdc3f"} Mar 18 17:17:57 crc kubenswrapper[4939]: I0318 17:17:57.810561 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-k2wvl" event={"ID":"bbedb8e8-2834-4e62-97a8-8a9df6ab3091","Type":"ContainerStarted","Data":"241c07853f6450ed239a7aea9fcf9fb9317aa9d9aeaf43b7cca181579b51a920"} Mar 18 17:17:57 crc kubenswrapper[4939]: I0318 17:17:57.810728 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:17:57 crc kubenswrapper[4939]: I0318 17:17:57.874069 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-k2wvl" podStartSLOduration=4.954802773 podStartE2EDuration="6.874047713s" podCreationTimestamp="2026-03-18 17:17:51 +0000 UTC" firstStartedPulling="2026-03-18 17:17:52.755676848 +0000 UTC m=+6037.354864469" lastFinishedPulling="2026-03-18 17:17:54.674921788 +0000 UTC m=+6039.274109409" observedRunningTime="2026-03-18 17:17:57.867832896 +0000 UTC m=+6042.467020527" watchObservedRunningTime="2026-03-18 17:17:57.874047713 +0000 UTC m=+6042.473235344" Mar 18 17:17:58 crc kubenswrapper[4939]: I0318 17:17:58.824642 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vhvzw" event={"ID":"860e2b80-254e-4caa-813d-3e0b040d3798","Type":"ContainerStarted","Data":"130adbb2841396dc9ec4d24802fab7a34fba07138ed87700c668186ef42fcbe6"} Mar 18 17:17:58 crc kubenswrapper[4939]: I0318 17:17:58.850420 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-vhvzw" podStartSLOduration=5.334501861 podStartE2EDuration="6.850402088s" podCreationTimestamp="2026-03-18 17:17:52 +0000 UTC" firstStartedPulling="2026-03-18 17:17:54.158993508 +0000 UTC m=+6038.758181139" lastFinishedPulling="2026-03-18 17:17:55.674893745 +0000 UTC m=+6040.274081366" observedRunningTime="2026-03-18 17:17:58.841244748 +0000 UTC m=+6043.440432369" 
watchObservedRunningTime="2026-03-18 17:17:58.850402088 +0000 UTC m=+6043.449589709" Mar 18 17:17:59 crc kubenswrapper[4939]: I0318 17:17:59.834840 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-vhvzw" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.193938 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564238-5t7v4"] Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.200350 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-5t7v4" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.203001 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.203967 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.204280 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-5t7v4"] Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.204746 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.390927 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p429p\" (UniqueName: \"kubernetes.io/projected/d4a51ae8-f644-40f8-9c42-75150f336923-kube-api-access-p429p\") pod \"auto-csr-approver-29564238-5t7v4\" (UID: \"d4a51ae8-f644-40f8-9c42-75150f336923\") " pod="openshift-infra/auto-csr-approver-29564238-5t7v4" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.493175 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p429p\" (UniqueName: \"kubernetes.io/projected/d4a51ae8-f644-40f8-9c42-75150f336923-kube-api-access-p429p\") pod \"auto-csr-approver-29564238-5t7v4\" (UID: \"d4a51ae8-f644-40f8-9c42-75150f336923\") " pod="openshift-infra/auto-csr-approver-29564238-5t7v4" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.514114 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p429p\" (UniqueName: \"kubernetes.io/projected/d4a51ae8-f644-40f8-9c42-75150f336923-kube-api-access-p429p\") pod \"auto-csr-approver-29564238-5t7v4\" (UID: \"d4a51ae8-f644-40f8-9c42-75150f336923\") " pod="openshift-infra/auto-csr-approver-29564238-5t7v4" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.523586 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-5t7v4" Mar 18 17:18:00 crc kubenswrapper[4939]: I0318 17:18:00.991193 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-5t7v4"] Mar 18 17:18:01 crc kubenswrapper[4939]: W0318 17:18:01.004064 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4a51ae8_f644_40f8_9c42_75150f336923.slice/crio-8e72c45a88aa9357bf248fff2aa00eb6e90abfe9e28c795ee0a428c39c095cb2 WatchSource:0}: Error finding container 8e72c45a88aa9357bf248fff2aa00eb6e90abfe9e28c795ee0a428c39c095cb2: Status 404 returned error can't find the container with id 8e72c45a88aa9357bf248fff2aa00eb6e90abfe9e28c795ee0a428c39c095cb2 Mar 18 17:18:01 crc kubenswrapper[4939]: I0318 17:18:01.856963 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564238-5t7v4" event={"ID":"d4a51ae8-f644-40f8-9c42-75150f336923","Type":"ContainerStarted","Data":"8e72c45a88aa9357bf248fff2aa00eb6e90abfe9e28c795ee0a428c39c095cb2"} Mar 18 17:18:02 crc kubenswrapper[4939]: I0318 17:18:02.868264 4939 generic.go:334] "Generic (PLEG): container finished" podID="d4a51ae8-f644-40f8-9c42-75150f336923" containerID="a8acbdce59ccfaf1cbfb2e990b6c05577de7978407cc275a74502a1afb8379a3" exitCode=0 Mar 18 17:18:02 crc kubenswrapper[4939]: I0318 17:18:02.868306 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564238-5t7v4" event={"ID":"d4a51ae8-f644-40f8-9c42-75150f336923","Type":"ContainerDied","Data":"a8acbdce59ccfaf1cbfb2e990b6c05577de7978407cc275a74502a1afb8379a3"} Mar 18 17:18:04 crc kubenswrapper[4939]: I0318 17:18:04.290001 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-5t7v4" Mar 18 17:18:04 crc kubenswrapper[4939]: I0318 17:18:04.409870 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p429p\" (UniqueName: \"kubernetes.io/projected/d4a51ae8-f644-40f8-9c42-75150f336923-kube-api-access-p429p\") pod \"d4a51ae8-f644-40f8-9c42-75150f336923\" (UID: \"d4a51ae8-f644-40f8-9c42-75150f336923\") " Mar 18 17:18:04 crc kubenswrapper[4939]: I0318 17:18:04.416666 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a51ae8-f644-40f8-9c42-75150f336923-kube-api-access-p429p" (OuterVolumeSpecName: "kube-api-access-p429p") pod "d4a51ae8-f644-40f8-9c42-75150f336923" (UID: "d4a51ae8-f644-40f8-9c42-75150f336923"). InnerVolumeSpecName "kube-api-access-p429p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:04 crc kubenswrapper[4939]: I0318 17:18:04.512664 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p429p\" (UniqueName: \"kubernetes.io/projected/d4a51ae8-f644-40f8-9c42-75150f336923-kube-api-access-p429p\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:04 crc kubenswrapper[4939]: I0318 17:18:04.895228 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564238-5t7v4" event={"ID":"d4a51ae8-f644-40f8-9c42-75150f336923","Type":"ContainerDied","Data":"8e72c45a88aa9357bf248fff2aa00eb6e90abfe9e28c795ee0a428c39c095cb2"} Mar 18 17:18:04 crc kubenswrapper[4939]: I0318 17:18:04.895275 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-5t7v4" Mar 18 17:18:04 crc kubenswrapper[4939]: I0318 17:18:04.895279 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e72c45a88aa9357bf248fff2aa00eb6e90abfe9e28c795ee0a428c39c095cb2" Mar 18 17:18:05 crc kubenswrapper[4939]: I0318 17:18:05.363062 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-5qxbt"] Mar 18 17:18:05 crc kubenswrapper[4939]: I0318 17:18:05.373033 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-5qxbt"] Mar 18 17:18:05 crc kubenswrapper[4939]: I0318 17:18:05.405808 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-ngrvz" Mar 18 17:18:06 crc kubenswrapper[4939]: I0318 17:18:06.151789 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a2d568-f1e4-4408-addb-9401388c46c1" path="/var/lib/kubelet/pods/52a2d568-f1e4-4408-addb-9401388c46c1/volumes" Mar 18 17:18:07 crc kubenswrapper[4939]: I0318 17:18:07.254391 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-k2wvl" Mar 18 17:18:08 crc kubenswrapper[4939]: I0318 17:18:08.401478 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-vhvzw" Mar 18 17:18:09 crc kubenswrapper[4939]: I0318 17:18:09.134062 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:18:09 crc kubenswrapper[4939]: E0318 17:18:09.134571 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.884852 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8657ff8dfc-kv9vt"] Mar 18 17:18:15 crc kubenswrapper[4939]: E0318 17:18:15.885747 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a51ae8-f644-40f8-9c42-75150f336923" containerName="oc" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.885760 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a51ae8-f644-40f8-9c42-75150f336923" containerName="oc" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.889374 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a51ae8-f644-40f8-9c42-75150f336923" containerName="oc" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.890581 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.894685 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.894861 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8s2zj" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.897262 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.897406 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.906816 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8657ff8dfc-kv9vt"] Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.943113 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.943434 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-log" containerID="cri-o://23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945" gracePeriod=30 Mar 18 17:18:15 crc kubenswrapper[4939]: I0318 17:18:15.943981 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-httpd" containerID="cri-o://9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375" gracePeriod=30 Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.004129 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78695f477-tfvsx"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.005796 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.015746 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.016005 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-log" containerID="cri-o://fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b" gracePeriod=30 Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.016066 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-httpd" containerID="cri-o://bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618" gracePeriod=30 Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.037976 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b200d14-46d6-42bc-a168-55f2e409d7ac-horizon-secret-key\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.038025 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrjbb\" (UniqueName: \"kubernetes.io/projected/1b200d14-46d6-42bc-a168-55f2e409d7ac-kube-api-access-rrjbb\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.038150 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-scripts\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.038295 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-config-data\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.038358 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b200d14-46d6-42bc-a168-55f2e409d7ac-logs\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.101332 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78695f477-tfvsx"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142201 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a9c9659-c84d-4fba-abf6-a38e076e28f1-logs\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142252 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t45sz\" (UniqueName: \"kubernetes.io/projected/9a9c9659-c84d-4fba-abf6-a38e076e28f1-kube-api-access-t45sz\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142376 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-scripts\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142418 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-config-data\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142446 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b200d14-46d6-42bc-a168-55f2e409d7ac-horizon-secret-key\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142467 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrjbb\" (UniqueName: \"kubernetes.io/projected/1b200d14-46d6-42bc-a168-55f2e409d7ac-kube-api-access-rrjbb\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142624 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-scripts\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142680 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-config-data\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142706 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a9c9659-c84d-4fba-abf6-a38e076e28f1-horizon-secret-key\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.142730 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b200d14-46d6-42bc-a168-55f2e409d7ac-logs\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.143260 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/1b200d14-46d6-42bc-a168-55f2e409d7ac-logs\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.151943 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b200d14-46d6-42bc-a168-55f2e409d7ac-horizon-secret-key\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.155125 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.158144 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.163418 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrjbb\" (UniqueName: \"kubernetes.io/projected/1b200d14-46d6-42bc-a168-55f2e409d7ac-kube-api-access-rrjbb\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.167009 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-scripts\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.167321 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-config-data\") pod \"horizon-8657ff8dfc-kv9vt\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.219031 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8s2zj" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.226046 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.244035 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t45sz\" (UniqueName: \"kubernetes.io/projected/9a9c9659-c84d-4fba-abf6-a38e076e28f1-kube-api-access-t45sz\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.244101 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a9c9659-c84d-4fba-abf6-a38e076e28f1-logs\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.244478 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a9c9659-c84d-4fba-abf6-a38e076e28f1-logs\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.245017 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-scripts\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.245051 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-scripts\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.245098 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-config-data\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.245955 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-config-data\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.246753 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a9c9659-c84d-4fba-abf6-a38e076e28f1-horizon-secret-key\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.250286 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a9c9659-c84d-4fba-abf6-a38e076e28f1-horizon-secret-key\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.261824 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t45sz\" 
(UniqueName: \"kubernetes.io/projected/9a9c9659-c84d-4fba-abf6-a38e076e28f1-kube-api-access-t45sz\") pod \"horizon-78695f477-tfvsx\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.320610 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:16 crc kubenswrapper[4939]: W0318 17:18:16.691255 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b200d14_46d6_42bc_a168_55f2e409d7ac.slice/crio-3fa471d1f5cf7d503b03969dff300924203374f3db6915f2223e353c479ef009 WatchSource:0}: Error finding container 3fa471d1f5cf7d503b03969dff300924203374f3db6915f2223e353c479ef009: Status 404 returned error can't find the container with id 3fa471d1f5cf7d503b03969dff300924203374f3db6915f2223e353c479ef009 Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.694318 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8657ff8dfc-kv9vt"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.716386 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78695f477-tfvsx"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.742439 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-868b5b4f6c-vw95m"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.743991 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.761910 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868b5b4f6c-vw95m"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.801884 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78695f477-tfvsx"] Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.860226 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-scripts\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.860557 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-horizon-secret-key\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.860596 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-config-data\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.860668 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-logs\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.860689 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvrj\" (UniqueName: \"kubernetes.io/projected/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-kube-api-access-4jvrj\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.962095 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-logs\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.962141 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvrj\" (UniqueName: \"kubernetes.io/projected/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-kube-api-access-4jvrj\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.962175 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-scripts\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.962292 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-horizon-secret-key\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.962325 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-config-data\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.962631 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-logs\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.963159 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-scripts\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.963584 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-config-data\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.967644 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-horizon-secret-key\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:16 crc kubenswrapper[4939]: I0318 17:18:16.988654 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvrj\" (UniqueName: \"kubernetes.io/projected/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-kube-api-access-4jvrj\") pod \"horizon-868b5b4f6c-vw95m\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") " pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.016416 4939 generic.go:334] "Generic (PLEG): container finished" podID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerID="fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b" exitCode=143 Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.016520 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c18dafe7-8cf2-4296-bb22-e2fd12152ddf","Type":"ContainerDied","Data":"fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b"} Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.017876 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78695f477-tfvsx" event={"ID":"9a9c9659-c84d-4fba-abf6-a38e076e28f1","Type":"ContainerStarted","Data":"a6908c39a24929ff6234873de6198af96bbb934d406943651d7581bbdbc84915"} Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.019498 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8657ff8dfc-kv9vt" event={"ID":"1b200d14-46d6-42bc-a168-55f2e409d7ac","Type":"ContainerStarted","Data":"3fa471d1f5cf7d503b03969dff300924203374f3db6915f2223e353c479ef009"} Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.022144 4939 generic.go:334] "Generic (PLEG): container finished" podID="5e677c8e-d5d5-40d9-94d7-334189284333" containerID="23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945" exitCode=143 Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.022180 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e677c8e-d5d5-40d9-94d7-334189284333","Type":"ContainerDied","Data":"23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945"} Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.071029 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:17 crc kubenswrapper[4939]: I0318 17:18:17.605713 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868b5b4f6c-vw95m"] Mar 18 17:18:17 crc kubenswrapper[4939]: W0318 17:18:17.609252 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491d1da7_8a7f_4c7f_9f4c_8aa5adef37d3.slice/crio-554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662 WatchSource:0}: Error finding container 554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662: Status 404 returned error can't find the container with id 554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662 Mar 18 17:18:18 crc kubenswrapper[4939]: I0318 17:18:18.036085 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868b5b4f6c-vw95m" event={"ID":"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3","Type":"ContainerStarted","Data":"554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662"} Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.761117 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.794317 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.934425 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-ceph\") pod \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.934553 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv9vh\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-kube-api-access-bv9vh\") pod \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.934619 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-combined-ca-bundle\") pod \"5e677c8e-d5d5-40d9-94d7-334189284333\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.934878 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-combined-ca-bundle\") pod \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.934905 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-ceph\") pod \"5e677c8e-d5d5-40d9-94d7-334189284333\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935042 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-logs\") pod \"5e677c8e-d5d5-40d9-94d7-334189284333\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " Mar 18 
17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935076 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-config-data\") pod \"5e677c8e-d5d5-40d9-94d7-334189284333\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935124 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-httpd-run\") pod \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935179 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64mg8\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-kube-api-access-64mg8\") pod \"5e677c8e-d5d5-40d9-94d7-334189284333\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935240 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-scripts\") pod \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935314 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-logs\") pod \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935380 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-httpd-run\") pod \"5e677c8e-d5d5-40d9-94d7-334189284333\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935410 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-config-data\") pod \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\" (UID: \"c18dafe7-8cf2-4296-bb22-e2fd12152ddf\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.935559 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-scripts\") pod \"5e677c8e-d5d5-40d9-94d7-334189284333\" (UID: \"5e677c8e-d5d5-40d9-94d7-334189284333\") " Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.937818 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-logs" (OuterVolumeSpecName: "logs") pod "5e677c8e-d5d5-40d9-94d7-334189284333" (UID: "5e677c8e-d5d5-40d9-94d7-334189284333"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.938918 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c18dafe7-8cf2-4296-bb22-e2fd12152ddf" (UID: "c18dafe7-8cf2-4296-bb22-e2fd12152ddf"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.939227 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5e677c8e-d5d5-40d9-94d7-334189284333" (UID: "5e677c8e-d5d5-40d9-94d7-334189284333"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.939752 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-logs" (OuterVolumeSpecName: "logs") pod "c18dafe7-8cf2-4296-bb22-e2fd12152ddf" (UID: "c18dafe7-8cf2-4296-bb22-e2fd12152ddf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.942106 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-scripts" (OuterVolumeSpecName: "scripts") pod "5e677c8e-d5d5-40d9-94d7-334189284333" (UID: "5e677c8e-d5d5-40d9-94d7-334189284333"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.945843 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-scripts" (OuterVolumeSpecName: "scripts") pod "c18dafe7-8cf2-4296-bb22-e2fd12152ddf" (UID: "c18dafe7-8cf2-4296-bb22-e2fd12152ddf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.946234 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-kube-api-access-64mg8" (OuterVolumeSpecName: "kube-api-access-64mg8") pod "5e677c8e-d5d5-40d9-94d7-334189284333" (UID: "5e677c8e-d5d5-40d9-94d7-334189284333"). InnerVolumeSpecName "kube-api-access-64mg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.946869 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-ceph" (OuterVolumeSpecName: "ceph") pod "5e677c8e-d5d5-40d9-94d7-334189284333" (UID: "5e677c8e-d5d5-40d9-94d7-334189284333"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.947476 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-kube-api-access-bv9vh" (OuterVolumeSpecName: "kube-api-access-bv9vh") pod "c18dafe7-8cf2-4296-bb22-e2fd12152ddf" (UID: "c18dafe7-8cf2-4296-bb22-e2fd12152ddf"). InnerVolumeSpecName "kube-api-access-bv9vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.948755 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-ceph" (OuterVolumeSpecName: "ceph") pod "c18dafe7-8cf2-4296-bb22-e2fd12152ddf" (UID: "c18dafe7-8cf2-4296-bb22-e2fd12152ddf"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.974054 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c18dafe7-8cf2-4296-bb22-e2fd12152ddf" (UID: "c18dafe7-8cf2-4296-bb22-e2fd12152ddf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:18:19 crc kubenswrapper[4939]: I0318 17:18:19.992859 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e677c8e-d5d5-40d9-94d7-334189284333" (UID: "5e677c8e-d5d5-40d9-94d7-334189284333"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.002220 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-config-data" (OuterVolumeSpecName: "config-data") pod "5e677c8e-d5d5-40d9-94d7-334189284333" (UID: "5e677c8e-d5d5-40d9-94d7-334189284333"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.015423 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-config-data" (OuterVolumeSpecName: "config-data") pod "c18dafe7-8cf2-4296-bb22-e2fd12152ddf" (UID: "c18dafe7-8cf2-4296-bb22-e2fd12152ddf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039261 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039297 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039308 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039320 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64mg8\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-kube-api-access-64mg8\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039329 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039339 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039348 4939 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5e677c8e-d5d5-40d9-94d7-334189284333-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039356 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039364 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039372 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039380 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv9vh\" (UniqueName: \"kubernetes.io/projected/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-kube-api-access-bv9vh\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039388 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e677c8e-d5d5-40d9-94d7-334189284333-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039397 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c18dafe7-8cf2-4296-bb22-e2fd12152ddf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.039404 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e677c8e-d5d5-40d9-94d7-334189284333-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.070246 4939 generic.go:334] "Generic (PLEG): container finished" podID="5e677c8e-d5d5-40d9-94d7-334189284333" containerID="9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375" exitCode=0 Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.070308 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e677c8e-d5d5-40d9-94d7-334189284333","Type":"ContainerDied","Data":"9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375"} Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.070339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e677c8e-d5d5-40d9-94d7-334189284333","Type":"ContainerDied","Data":"115de82933ed262f765cdd96cfec7cea4ff58b8ac2235705a22477c58167eab8"} Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.070359 4939 scope.go:117] "RemoveContainer" containerID="9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.070530 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.084047 4939 generic.go:334] "Generic (PLEG): container finished" podID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerID="bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618" exitCode=0 Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.084289 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c18dafe7-8cf2-4296-bb22-e2fd12152ddf","Type":"ContainerDied","Data":"bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618"} Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.084321 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c18dafe7-8cf2-4296-bb22-e2fd12152ddf","Type":"ContainerDied","Data":"f2a0c57dc83fbe57e495473d4d0c888f542eb05c66959752b736c4347585e1f6"} Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.084409 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.118645 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.145184 4939 scope.go:117] "RemoveContainer" containerID="23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.158437 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.158536 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.183951 4939 scope.go:117] "RemoveContainer" containerID="9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375" Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.187678 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375\": container with ID starting with 9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375 not found: ID does not exist" containerID="9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.187745 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375"} err="failed to get container status \"9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375\": rpc error: code = NotFound desc = could not find container \"9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375\": container with ID starting with 9cca1d9a417de1948f2ac09394100e810dd04dbaf36beb0206b2a21642fa7375 not found: ID does not exist" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.187780 4939 scope.go:117] "RemoveContainer" containerID="23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945" Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.189384 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945\": container with ID starting with 
23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945 not found: ID does not exist" containerID="23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.189424 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945"} err="failed to get container status \"23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945\": rpc error: code = NotFound desc = could not find container \"23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945\": container with ID starting with 23ad5e9294d02f93077542c6d2c16ec31de739e050e16ac40fcf51df98db2945 not found: ID does not exist" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.189438 4939 scope.go:117] "RemoveContainer" containerID="bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.191684 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.202520 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.203155 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-httpd" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.203283 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-httpd" Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.203394 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-log" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.203460 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-log" Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.203565 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-log" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.203655 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-log" Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.203723 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-httpd" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.203789 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-httpd" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.204108 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-log" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.204272 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" containerName="glance-httpd" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.204392 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-log" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.204477 4939 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5e677c8e-d5d5-40d9-94d7-334189284333" containerName="glance-httpd" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.206953 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.211273 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sb77r" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.213288 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.213650 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.218124 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.228716 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.231018 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.235744 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.250310 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.285975 4939 scope.go:117] "RemoveContainer" containerID="fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.311777 4939 scope.go:117] "RemoveContainer" containerID="bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618" Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.313543 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618\": container with ID starting with bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618 not found: ID does not exist" containerID="bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.313579 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618"} err="failed to get container status \"bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618\": rpc error: code = NotFound desc = could not find container \"bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618\": container with ID starting with bc8a95cf88afce13446d5065df32e5131f9c2c6126a9578d371bdc4fc0c9d618 not found: ID does not exist" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.313599 4939 scope.go:117] "RemoveContainer" containerID="fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b" Mar 18 17:18:20 crc kubenswrapper[4939]: E0318 17:18:20.313858 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b\": container with ID starting with 
fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b not found: ID does not exist" containerID="fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.313877 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b"} err="failed to get container status \"fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b\": rpc error: code = NotFound desc = could not find container \"fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b\": container with ID starting with fcb9cd88e8f352b564c8d119bd49e59ce2b5f7386f7f206c30e93a4a2dafb04b not found: ID does not exist" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363228 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f6f675-b8f8-4aba-ad72-f22358057ad0-logs\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363310 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363353 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363402 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363432 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363564 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60b2f1f6-9d42-465a-a193-b70288373cd3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363634 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363664 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5zsk\" (UniqueName: \"kubernetes.io/projected/60b2f1f6-9d42-465a-a193-b70288373cd3-kube-api-access-r5zsk\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363722 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b2f1f6-9d42-465a-a193-b70288373cd3-logs\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363744 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b2f1f6-9d42-465a-a193-b70288373cd3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363803 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79f6f675-b8f8-4aba-ad72-f22358057ad0-ceph\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363894 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.363989 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79f6f675-b8f8-4aba-ad72-f22358057ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.364051 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj85z\" (UniqueName: \"kubernetes.io/projected/79f6f675-b8f8-4aba-ad72-f22358057ad0-kube-api-access-bj85z\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465285 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj85z\" (UniqueName: \"kubernetes.io/projected/79f6f675-b8f8-4aba-ad72-f22358057ad0-kube-api-access-bj85z\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465356 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f6f675-b8f8-4aba-ad72-f22358057ad0-logs\") pod \"glance-default-external-api-0\" (UID: 
\"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465386 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465422 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465440 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465463 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465560 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60b2f1f6-9d42-465a-a193-b70288373cd3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465593 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465612 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5zsk\" (UniqueName: \"kubernetes.io/projected/60b2f1f6-9d42-465a-a193-b70288373cd3-kube-api-access-r5zsk\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465637 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b2f1f6-9d42-465a-a193-b70288373cd3-logs\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465653 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b2f1f6-9d42-465a-a193-b70288373cd3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 
17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465679 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79f6f675-b8f8-4aba-ad72-f22358057ad0-ceph\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465726 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465768 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79f6f675-b8f8-4aba-ad72-f22358057ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.465858 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f6f675-b8f8-4aba-ad72-f22358057ad0-logs\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.466075 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79f6f675-b8f8-4aba-ad72-f22358057ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.466263 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60b2f1f6-9d42-465a-a193-b70288373cd3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.468566 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b2f1f6-9d42-465a-a193-b70288373cd3-logs\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.472916 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.474249 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.474837 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.475616 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79f6f675-b8f8-4aba-ad72-f22358057ad0-ceph\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.475790 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.476010 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b2f1f6-9d42-465a-a193-b70288373cd3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.486768 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f6f675-b8f8-4aba-ad72-f22358057ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.487883 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b2f1f6-9d42-465a-a193-b70288373cd3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.493611 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5zsk\" (UniqueName: \"kubernetes.io/projected/60b2f1f6-9d42-465a-a193-b70288373cd3-kube-api-access-r5zsk\") pod \"glance-default-internal-api-0\" (UID: \"60b2f1f6-9d42-465a-a193-b70288373cd3\") " pod="openstack/glance-default-internal-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.497158 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj85z\" (UniqueName: \"kubernetes.io/projected/79f6f675-b8f8-4aba-ad72-f22358057ad0-kube-api-access-bj85z\") pod \"glance-default-external-api-0\" (UID: \"79f6f675-b8f8-4aba-ad72-f22358057ad0\") " pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.583115 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 17:18:20 crc kubenswrapper[4939]: I0318 17:18:20.598693 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:21 crc kubenswrapper[4939]: I0318 17:18:21.232740 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 17:18:21 crc kubenswrapper[4939]: I0318 17:18:21.309219 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 17:18:22 crc kubenswrapper[4939]: I0318 17:18:22.156479 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e677c8e-d5d5-40d9-94d7-334189284333" path="/var/lib/kubelet/pods/5e677c8e-d5d5-40d9-94d7-334189284333/volumes" Mar 18 17:18:22 crc kubenswrapper[4939]: I0318 17:18:22.157691 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18dafe7-8cf2-4296-bb22-e2fd12152ddf" path="/var/lib/kubelet/pods/c18dafe7-8cf2-4296-bb22-e2fd12152ddf/volumes" Mar 18 17:18:23 crc kubenswrapper[4939]: I0318 17:18:23.134201 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:18:23 crc kubenswrapper[4939]: E0318 17:18:23.134528 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:18:25 crc kubenswrapper[4939]: W0318 17:18:25.701236 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f6f675_b8f8_4aba_ad72_f22358057ad0.slice/crio-c944c565fbaeda6bfde0143a68b66179b33bd1d3e7e76764ded4318075a5a854 WatchSource:0}: Error finding container c944c565fbaeda6bfde0143a68b66179b33bd1d3e7e76764ded4318075a5a854: Status 404 returned error can't find the container with id c944c565fbaeda6bfde0143a68b66179b33bd1d3e7e76764ded4318075a5a854 Mar 18 17:18:26 crc kubenswrapper[4939]: I0318 17:18:26.152230 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"79f6f675-b8f8-4aba-ad72-f22358057ad0","Type":"ContainerStarted","Data":"c944c565fbaeda6bfde0143a68b66179b33bd1d3e7e76764ded4318075a5a854"} Mar 18 17:18:26 crc kubenswrapper[4939]: I0318 17:18:26.154306 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868b5b4f6c-vw95m" event={"ID":"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3","Type":"ContainerStarted","Data":"3312e48ecb177192bacdba1fad46a00504efcb1dd4373992f48f8e74dd64eb89"} Mar 18 17:18:26 crc kubenswrapper[4939]: I0318 17:18:26.155776 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78695f477-tfvsx" event={"ID":"9a9c9659-c84d-4fba-abf6-a38e076e28f1","Type":"ContainerStarted","Data":"f2ea74727b97feef06613596f223637ee9be2bc7a492aecd37ec15e371246b19"} Mar 18 17:18:26 crc kubenswrapper[4939]: I0318 17:18:26.157110 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8657ff8dfc-kv9vt" event={"ID":"1b200d14-46d6-42bc-a168-55f2e409d7ac","Type":"ContainerStarted","Data":"a61724fa3d5d8ddde53717a92c957a5fa38414da17a59085e9d9027755ee0980"} Mar 18 17:18:26 crc kubenswrapper[4939]: I0318 17:18:26.158424 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"60b2f1f6-9d42-465a-a193-b70288373cd3","Type":"ContainerStarted","Data":"0f18f9d02b2ecfd70aa39a93adcd71451c8afa9f81a174a17541628baa4a13b4"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.171044 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8657ff8dfc-kv9vt" event={"ID":"1b200d14-46d6-42bc-a168-55f2e409d7ac","Type":"ContainerStarted","Data":"273de7b1bc874f18b38254eed8d6cf93d023354fb6949d56c72922eab0a5355a"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.176348 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60b2f1f6-9d42-465a-a193-b70288373cd3","Type":"ContainerStarted","Data":"40a0276a33ae1a2f6817c27ecba0d9b934194a224a7b6f215e09e76b098cf564"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.176585 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60b2f1f6-9d42-465a-a193-b70288373cd3","Type":"ContainerStarted","Data":"4a4d034f0a5176ac71675923e90b06e20b3b615f8735afe3cd386be9e702edac"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.178804 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"79f6f675-b8f8-4aba-ad72-f22358057ad0","Type":"ContainerStarted","Data":"6c71feed4c5e3846bd95126886cfee74fa1c9fb8777b6582f160ec98fbee7fec"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.178920 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"79f6f675-b8f8-4aba-ad72-f22358057ad0","Type":"ContainerStarted","Data":"886f4c3ee8cae83af9599783b2ed3d53ca600f5e52b89330d236f70b131283a2"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.180909 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868b5b4f6c-vw95m" event={"ID":"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3","Type":"ContainerStarted","Data":"28d17527b3553e326ae4904ca87a1278ee7520f88c31d66d15698c2c6d5204d5"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.185244 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78695f477-tfvsx" event={"ID":"9a9c9659-c84d-4fba-abf6-a38e076e28f1","Type":"ContainerStarted","Data":"34d701fa2ce03f6f150c62755f32459d4aa4f44546417c60017961732d6b5ddd"} Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.185356 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78695f477-tfvsx" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon-log" containerID="cri-o://f2ea74727b97feef06613596f223637ee9be2bc7a492aecd37ec15e371246b19" gracePeriod=30 Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.185453 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78695f477-tfvsx" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon" containerID="cri-o://34d701fa2ce03f6f150c62755f32459d4aa4f44546417c60017961732d6b5ddd" gracePeriod=30 Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.205761 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8657ff8dfc-kv9vt" podStartSLOduration=3.114026279 podStartE2EDuration="12.205743742s" podCreationTimestamp="2026-03-18 17:18:15 +0000 UTC" firstStartedPulling="2026-03-18 17:18:16.693315795 +0000 UTC m=+6061.292503416" lastFinishedPulling="2026-03-18 17:18:25.785033268 +0000 UTC m=+6070.384220879" observedRunningTime="2026-03-18 
17:18:27.198664541 +0000 UTC m=+6071.797852192" watchObservedRunningTime="2026-03-18 17:18:27.205743742 +0000 UTC m=+6071.804931363" Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.231032 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-868b5b4f6c-vw95m" podStartSLOduration=3.010983858 podStartE2EDuration="11.231011019s" podCreationTimestamp="2026-03-18 17:18:16 +0000 UTC" firstStartedPulling="2026-03-18 17:18:17.611146637 +0000 UTC m=+6062.210334258" lastFinishedPulling="2026-03-18 17:18:25.831173798 +0000 UTC m=+6070.430361419" observedRunningTime="2026-03-18 17:18:27.219644796 +0000 UTC m=+6071.818832427" watchObservedRunningTime="2026-03-18 17:18:27.231011019 +0000 UTC m=+6071.830198640" Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.252931 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.252910451 podStartE2EDuration="7.252910451s" podCreationTimestamp="2026-03-18 17:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:18:27.243319539 +0000 UTC m=+6071.842507160" watchObservedRunningTime="2026-03-18 17:18:27.252910451 +0000 UTC m=+6071.852098082" Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.261673 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78695f477-tfvsx" podStartSLOduration=3.268945708 podStartE2EDuration="12.261652769s" podCreationTimestamp="2026-03-18 17:18:15 +0000 UTC" firstStartedPulling="2026-03-18 17:18:16.816660968 +0000 UTC m=+6061.415848589" lastFinishedPulling="2026-03-18 17:18:25.809368029 +0000 UTC m=+6070.408555650" observedRunningTime="2026-03-18 17:18:27.259466757 +0000 UTC m=+6071.858654378" watchObservedRunningTime="2026-03-18 17:18:27.261652769 +0000 UTC m=+6071.860840390" Mar 18 17:18:27 crc kubenswrapper[4939]: I0318 17:18:27.284882 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.284858298 podStartE2EDuration="7.284858298s" podCreationTimestamp="2026-03-18 17:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:18:27.277306484 +0000 UTC m=+6071.876494105" watchObservedRunningTime="2026-03-18 17:18:27.284858298 +0000 UTC m=+6071.884045929" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.583841 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.584372 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.599166 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.599217 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.622245 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.635201 4939 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.648025 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 17:18:30 crc kubenswrapper[4939]: I0318 17:18:30.672857 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:31 crc kubenswrapper[4939]: I0318 17:18:31.228744 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 17:18:31 crc kubenswrapper[4939]: I0318 17:18:31.228980 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:31 crc kubenswrapper[4939]: I0318 17:18:31.228996 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 17:18:31 crc kubenswrapper[4939]: I0318 17:18:31.229008 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:33 crc kubenswrapper[4939]: I0318 17:18:33.162146 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:33 crc kubenswrapper[4939]: I0318 17:18:33.172861 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 17:18:33 crc kubenswrapper[4939]: I0318 17:18:33.283596 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 17:18:33 crc kubenswrapper[4939]: I0318 17:18:33.283623 4939 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 17:18:33 crc kubenswrapper[4939]: I0318 17:18:33.361007 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 17:18:33 crc kubenswrapper[4939]: I0318 17:18:33.364940 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 17:18:35 crc kubenswrapper[4939]: I0318 17:18:35.133272 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:18:35 crc kubenswrapper[4939]: E0318 17:18:35.133785 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:18:36 crc kubenswrapper[4939]: I0318 17:18:36.227285 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:36 crc kubenswrapper[4939]: I0318 17:18:36.227675 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:36 crc kubenswrapper[4939]: I0318 17:18:36.228774 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8657ff8dfc-kv9vt" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.153:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 
10.217.1.153:8080: connect: connection refused" Mar 18 17:18:36 crc kubenswrapper[4939]: I0318 17:18:36.321369 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:37 crc kubenswrapper[4939]: I0318 17:18:37.071415 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:37 crc kubenswrapper[4939]: I0318 17:18:37.072080 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:37 crc kubenswrapper[4939]: I0318 17:18:37.074720 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-868b5b4f6c-vw95m" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.155:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.155:8080: connect: connection refused" Mar 18 17:18:43 crc kubenswrapper[4939]: I0318 17:18:43.053341 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-257dw"] Mar 18 17:18:43 crc kubenswrapper[4939]: I0318 17:18:43.062535 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-257dw"] Mar 18 17:18:44 crc kubenswrapper[4939]: I0318 17:18:44.029970 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c3ac-account-create-update-jctwd"] Mar 18 17:18:44 crc kubenswrapper[4939]: I0318 17:18:44.041273 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c3ac-account-create-update-jctwd"] Mar 18 17:18:44 crc kubenswrapper[4939]: I0318 17:18:44.144316 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae14a4f-992c-45b1-aff8-d60663244ecb" path="/var/lib/kubelet/pods/4ae14a4f-992c-45b1-aff8-d60663244ecb/volumes" Mar 18 17:18:44 crc kubenswrapper[4939]: I0318 17:18:44.146071 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4" path="/var/lib/kubelet/pods/a64bcac5-11b3-4fe1-b9ae-fea7819cc2b4/volumes" Mar 18 17:18:48 crc kubenswrapper[4939]: I0318 17:18:48.033893 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:48 crc kubenswrapper[4939]: I0318 17:18:48.998422 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:49 crc kubenswrapper[4939]: I0318 17:18:49.134085 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:18:49 crc kubenswrapper[4939]: E0318 17:18:49.134543 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:18:49 crc kubenswrapper[4939]: I0318 17:18:49.709261 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:18:50 crc kubenswrapper[4939]: I0318 17:18:50.029708 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5p47h"] Mar 18 17:18:50 crc kubenswrapper[4939]: I0318 
17:18:50.043796 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5p47h"] Mar 18 17:18:50 crc kubenswrapper[4939]: I0318 17:18:50.143991 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a90dc96a-1ec7-430a-b6c6-520cf884cdd0" path="/var/lib/kubelet/pods/a90dc96a-1ec7-430a-b6c6-520cf884cdd0/volumes" Mar 18 17:18:50 crc kubenswrapper[4939]: I0318 17:18:50.683249 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:18:50 crc kubenswrapper[4939]: I0318 17:18:50.760725 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8657ff8dfc-kv9vt"] Mar 18 17:18:50 crc kubenswrapper[4939]: I0318 17:18:50.761203 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8657ff8dfc-kv9vt" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon-log" containerID="cri-o://a61724fa3d5d8ddde53717a92c957a5fa38414da17a59085e9d9027755ee0980" gracePeriod=30 Mar 18 17:18:50 crc kubenswrapper[4939]: I0318 17:18:50.761250 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8657ff8dfc-kv9vt" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" containerID="cri-o://273de7b1bc874f18b38254eed8d6cf93d023354fb6949d56c72922eab0a5355a" gracePeriod=30 Mar 18 17:18:54 crc kubenswrapper[4939]: I0318 17:18:54.529101 4939 generic.go:334] "Generic (PLEG): container finished" podID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerID="273de7b1bc874f18b38254eed8d6cf93d023354fb6949d56c72922eab0a5355a" exitCode=0 Mar 18 17:18:54 crc kubenswrapper[4939]: I0318 17:18:54.529207 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8657ff8dfc-kv9vt" event={"ID":"1b200d14-46d6-42bc-a168-55f2e409d7ac","Type":"ContainerDied","Data":"273de7b1bc874f18b38254eed8d6cf93d023354fb6949d56c72922eab0a5355a"} Mar 18 17:18:56 crc kubenswrapper[4939]: I0318 17:18:56.227840 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8657ff8dfc-kv9vt" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.153:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.153:8080: connect: connection refused" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.574915 4939 generic.go:334] "Generic (PLEG): container finished" podID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerID="34d701fa2ce03f6f150c62755f32459d4aa4f44546417c60017961732d6b5ddd" exitCode=137 Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.575192 4939 generic.go:334] "Generic (PLEG): container finished" podID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerID="f2ea74727b97feef06613596f223637ee9be2bc7a492aecd37ec15e371246b19" exitCode=137 Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.575008 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78695f477-tfvsx" event={"ID":"9a9c9659-c84d-4fba-abf6-a38e076e28f1","Type":"ContainerDied","Data":"34d701fa2ce03f6f150c62755f32459d4aa4f44546417c60017961732d6b5ddd"} Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.575235 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78695f477-tfvsx" event={"ID":"9a9c9659-c84d-4fba-abf6-a38e076e28f1","Type":"ContainerDied","Data":"f2ea74727b97feef06613596f223637ee9be2bc7a492aecd37ec15e371246b19"} Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.656562 
4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.763850 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-config-data\") pod \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.763891 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a9c9659-c84d-4fba-abf6-a38e076e28f1-logs\") pod \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.763948 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a9c9659-c84d-4fba-abf6-a38e076e28f1-horizon-secret-key\") pod \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.764028 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-scripts\") pod \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.764386 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9c9659-c84d-4fba-abf6-a38e076e28f1-logs" (OuterVolumeSpecName: "logs") pod "9a9c9659-c84d-4fba-abf6-a38e076e28f1" (UID: "9a9c9659-c84d-4fba-abf6-a38e076e28f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.764855 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t45sz\" (UniqueName: \"kubernetes.io/projected/9a9c9659-c84d-4fba-abf6-a38e076e28f1-kube-api-access-t45sz\") pod \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\" (UID: \"9a9c9659-c84d-4fba-abf6-a38e076e28f1\") " Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.765534 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a9c9659-c84d-4fba-abf6-a38e076e28f1-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.770339 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a9c9659-c84d-4fba-abf6-a38e076e28f1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9a9c9659-c84d-4fba-abf6-a38e076e28f1" (UID: "9a9c9659-c84d-4fba-abf6-a38e076e28f1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.770545 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9c9659-c84d-4fba-abf6-a38e076e28f1-kube-api-access-t45sz" (OuterVolumeSpecName: "kube-api-access-t45sz") pod "9a9c9659-c84d-4fba-abf6-a38e076e28f1" (UID: "9a9c9659-c84d-4fba-abf6-a38e076e28f1"). InnerVolumeSpecName "kube-api-access-t45sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.787732 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-scripts" (OuterVolumeSpecName: "scripts") pod "9a9c9659-c84d-4fba-abf6-a38e076e28f1" (UID: "9a9c9659-c84d-4fba-abf6-a38e076e28f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.793777 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-config-data" (OuterVolumeSpecName: "config-data") pod "9a9c9659-c84d-4fba-abf6-a38e076e28f1" (UID: "9a9c9659-c84d-4fba-abf6-a38e076e28f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.867340 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.867366 4939 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a9c9659-c84d-4fba-abf6-a38e076e28f1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.867391 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a9c9659-c84d-4fba-abf6-a38e076e28f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:57 crc kubenswrapper[4939]: I0318 17:18:57.867402 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t45sz\" (UniqueName: \"kubernetes.io/projected/9a9c9659-c84d-4fba-abf6-a38e076e28f1-kube-api-access-t45sz\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:58 crc kubenswrapper[4939]: I0318 17:18:58.597654 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78695f477-tfvsx" event={"ID":"9a9c9659-c84d-4fba-abf6-a38e076e28f1","Type":"ContainerDied","Data":"a6908c39a24929ff6234873de6198af96bbb934d406943651d7581bbdbc84915"} Mar 18 17:18:58 crc kubenswrapper[4939]: I0318 17:18:58.598128 4939 scope.go:117] "RemoveContainer" containerID="34d701fa2ce03f6f150c62755f32459d4aa4f44546417c60017961732d6b5ddd" Mar 18 17:18:58 crc kubenswrapper[4939]: I0318 17:18:58.597793 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78695f477-tfvsx" Mar 18 17:18:58 crc kubenswrapper[4939]: I0318 17:18:58.635341 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78695f477-tfvsx"] Mar 18 17:18:58 crc kubenswrapper[4939]: I0318 17:18:58.648538 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-78695f477-tfvsx"] Mar 18 17:18:58 crc kubenswrapper[4939]: I0318 17:18:58.806971 4939 scope.go:117] "RemoveContainer" containerID="f2ea74727b97feef06613596f223637ee9be2bc7a492aecd37ec15e371246b19" Mar 18 17:19:00 crc kubenswrapper[4939]: I0318 17:19:00.149559 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" path="/var/lib/kubelet/pods/9a9c9659-c84d-4fba-abf6-a38e076e28f1/volumes" Mar 18 17:19:03 crc kubenswrapper[4939]: I0318 17:19:03.133873 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:19:03 crc kubenswrapper[4939]: E0318 17:19:03.135109 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:19:04 crc kubenswrapper[4939]: I0318 17:19:04.277810 4939 scope.go:117] "RemoveContainer" containerID="d298f32db5b93b53a10bbc4f67e3b805a3468d18c812b237dc30b4d7d53ff443" Mar 18 17:19:04 crc kubenswrapper[4939]: I0318 17:19:04.299566 4939 scope.go:117] "RemoveContainer" containerID="2278590aeb881433ea3bffff1a741927abf083e4ec69f28af73d0f1dc76e99fc" Mar 18 17:19:04 crc kubenswrapper[4939]: I0318 17:19:04.334421 4939 scope.go:117] "RemoveContainer" containerID="8f7cd5378cd539f6ccc49a27282339c7344ba8f8254632a2ea2a085e95e0f982" Mar 18 17:19:04 crc kubenswrapper[4939]: I0318 17:19:04.365386 4939 scope.go:117] "RemoveContainer" containerID="3ead99fefcc65a8caa4e8e74118e63dbb7334590a98616cf69ceccc671e2cb74" Mar 18 17:19:04 crc kubenswrapper[4939]: I0318 17:19:04.421203 4939 scope.go:117] "RemoveContainer" containerID="7022bc45decbddcb0bbf5a6a605822340e2df53468e76a16886d61569553592c" Mar 18 17:19:04 crc kubenswrapper[4939]: I0318 17:19:04.484827 4939 scope.go:117] "RemoveContainer" containerID="c6cdc70a5722a77636708b3dbb6f7b1925aaa5fb363d13a963133f9a7039494e" Mar 18 17:19:06 crc kubenswrapper[4939]: I0318 17:19:06.227521 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8657ff8dfc-kv9vt" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.153:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.153:8080: connect: connection refused" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.315184 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94zzq"] Mar 18 17:19:09 crc kubenswrapper[4939]: E0318 17:19:09.316220 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon-log" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.316235 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon-log" Mar 18 17:19:09 crc kubenswrapper[4939]: E0318 
17:19:09.316262 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.316270 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.316440 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon-log" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.316452 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9c9659-c84d-4fba-abf6-a38e076e28f1" containerName="horizon" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.318116 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.332325 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94zzq"] Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.419650 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mhrl\" (UniqueName: \"kubernetes.io/projected/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-kube-api-access-5mhrl\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.419759 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-catalog-content\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.420354 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-utilities\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.522489 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-utilities\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.522616 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mhrl\" (UniqueName: \"kubernetes.io/projected/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-kube-api-access-5mhrl\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.522652 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-catalog-content\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc 
kubenswrapper[4939]: I0318 17:19:09.523039 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-utilities\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.523093 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-catalog-content\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.542148 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mhrl\" (UniqueName: \"kubernetes.io/projected/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-kube-api-access-5mhrl\") pod \"community-operators-94zzq\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:09 crc kubenswrapper[4939]: I0318 17:19:09.641579 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:10 crc kubenswrapper[4939]: I0318 17:19:10.190228 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94zzq"] Mar 18 17:19:10 crc kubenswrapper[4939]: W0318 17:19:10.196011 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a17b2b5_7a9c_4c4d_a9e2_9df75c82826c.slice/crio-c84a03db6a8e03f7374c95bdd16118b33ea841c37b10d0c130d1c56ce4773731 WatchSource:0}: Error finding container c84a03db6a8e03f7374c95bdd16118b33ea841c37b10d0c130d1c56ce4773731: Status 404 returned error can't find the container with id c84a03db6a8e03f7374c95bdd16118b33ea841c37b10d0c130d1c56ce4773731 Mar 18 17:19:10 crc kubenswrapper[4939]: I0318 17:19:10.714199 4939 generic.go:334] "Generic (PLEG): container finished" podID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerID="0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0" exitCode=0 Mar 18 17:19:10 crc kubenswrapper[4939]: I0318 17:19:10.714347 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94zzq" event={"ID":"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c","Type":"ContainerDied","Data":"0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0"} Mar 18 17:19:10 crc kubenswrapper[4939]: I0318 17:19:10.714430 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94zzq" event={"ID":"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c","Type":"ContainerStarted","Data":"c84a03db6a8e03f7374c95bdd16118b33ea841c37b10d0c130d1c56ce4773731"} Mar 18 17:19:11 crc kubenswrapper[4939]: I0318 17:19:11.726651 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94zzq" event={"ID":"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c","Type":"ContainerStarted","Data":"317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161"} Mar 18 17:19:12 crc kubenswrapper[4939]: I0318 17:19:12.737634 4939 generic.go:334] "Generic (PLEG): container finished" podID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerID="317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161" exitCode=0 Mar 18 17:19:12 crc 
kubenswrapper[4939]: I0318 17:19:12.737689 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94zzq" event={"ID":"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c","Type":"ContainerDied","Data":"317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161"} Mar 18 17:19:13 crc kubenswrapper[4939]: I0318 17:19:13.747969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94zzq" event={"ID":"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c","Type":"ContainerStarted","Data":"fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c"} Mar 18 17:19:13 crc kubenswrapper[4939]: I0318 17:19:13.772696 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94zzq" podStartSLOduration=2.286946246 podStartE2EDuration="4.772679312s" podCreationTimestamp="2026-03-18 17:19:09 +0000 UTC" firstStartedPulling="2026-03-18 17:19:10.717184027 +0000 UTC m=+6115.316371648" lastFinishedPulling="2026-03-18 17:19:13.202917083 +0000 UTC m=+6117.802104714" observedRunningTime="2026-03-18 17:19:13.768445322 +0000 UTC m=+6118.367632963" watchObservedRunningTime="2026-03-18 17:19:13.772679312 +0000 UTC m=+6118.371866933" Mar 18 17:19:15 crc kubenswrapper[4939]: I0318 17:19:15.133250 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:19:15 crc kubenswrapper[4939]: E0318 17:19:15.133840 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:19:16 crc kubenswrapper[4939]: I0318 17:19:16.226793 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8657ff8dfc-kv9vt" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.153:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.153:8080: connect: connection refused" Mar 18 17:19:16 crc kubenswrapper[4939]: I0318 17:19:16.227168 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:19:18 crc kubenswrapper[4939]: I0318 17:19:18.075380 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f33d-account-create-update-ph9qq"] Mar 18 17:19:18 crc kubenswrapper[4939]: I0318 17:19:18.086871 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-s8576"] Mar 18 17:19:18 crc kubenswrapper[4939]: I0318 17:19:18.097942 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f33d-account-create-update-ph9qq"] Mar 18 17:19:18 crc kubenswrapper[4939]: I0318 17:19:18.107697 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-s8576"] Mar 18 17:19:18 crc kubenswrapper[4939]: I0318 17:19:18.154262 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325614c7-c425-4bfa-864b-142d5012d86e" path="/var/lib/kubelet/pods/325614c7-c425-4bfa-864b-142d5012d86e/volumes" Mar 18 17:19:18 crc kubenswrapper[4939]: I0318 17:19:18.156073 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d143b2a0-8e7a-429f-b7ac-48969f1d48da" path="/var/lib/kubelet/pods/d143b2a0-8e7a-429f-b7ac-48969f1d48da/volumes" Mar 18 17:19:19 crc kubenswrapper[4939]: I0318 17:19:19.642215 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:19 crc kubenswrapper[4939]: I0318 17:19:19.642696 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:19 crc kubenswrapper[4939]: I0318 17:19:19.718117 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:19 crc kubenswrapper[4939]: I0318 17:19:19.873179 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:19 crc kubenswrapper[4939]: I0318 17:19:19.955971 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94zzq"] Mar 18 17:19:20 crc kubenswrapper[4939]: I0318 17:19:20.825439 4939 generic.go:334] "Generic (PLEG): container finished" podID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerID="a61724fa3d5d8ddde53717a92c957a5fa38414da17a59085e9d9027755ee0980" exitCode=137 Mar 18 17:19:20 crc kubenswrapper[4939]: I0318 17:19:20.825633 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8657ff8dfc-kv9vt" event={"ID":"1b200d14-46d6-42bc-a168-55f2e409d7ac","Type":"ContainerDied","Data":"a61724fa3d5d8ddde53717a92c957a5fa38414da17a59085e9d9027755ee0980"} Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.214999 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.304894 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-scripts\") pod \"1b200d14-46d6-42bc-a168-55f2e409d7ac\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.304990 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b200d14-46d6-42bc-a168-55f2e409d7ac-horizon-secret-key\") pod \"1b200d14-46d6-42bc-a168-55f2e409d7ac\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.305043 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-config-data\") pod \"1b200d14-46d6-42bc-a168-55f2e409d7ac\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.305097 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrjbb\" (UniqueName: \"kubernetes.io/projected/1b200d14-46d6-42bc-a168-55f2e409d7ac-kube-api-access-rrjbb\") pod \"1b200d14-46d6-42bc-a168-55f2e409d7ac\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.305154 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b200d14-46d6-42bc-a168-55f2e409d7ac-logs\") pod \"1b200d14-46d6-42bc-a168-55f2e409d7ac\" (UID: \"1b200d14-46d6-42bc-a168-55f2e409d7ac\") " 
Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.306169 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b200d14-46d6-42bc-a168-55f2e409d7ac-logs" (OuterVolumeSpecName: "logs") pod "1b200d14-46d6-42bc-a168-55f2e409d7ac" (UID: "1b200d14-46d6-42bc-a168-55f2e409d7ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.311188 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b200d14-46d6-42bc-a168-55f2e409d7ac-kube-api-access-rrjbb" (OuterVolumeSpecName: "kube-api-access-rrjbb") pod "1b200d14-46d6-42bc-a168-55f2e409d7ac" (UID: "1b200d14-46d6-42bc-a168-55f2e409d7ac"). InnerVolumeSpecName "kube-api-access-rrjbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.315650 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b200d14-46d6-42bc-a168-55f2e409d7ac-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1b200d14-46d6-42bc-a168-55f2e409d7ac" (UID: "1b200d14-46d6-42bc-a168-55f2e409d7ac"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.335398 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-scripts" (OuterVolumeSpecName: "scripts") pod "1b200d14-46d6-42bc-a168-55f2e409d7ac" (UID: "1b200d14-46d6-42bc-a168-55f2e409d7ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.336196 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-config-data" (OuterVolumeSpecName: "config-data") pod "1b200d14-46d6-42bc-a168-55f2e409d7ac" (UID: "1b200d14-46d6-42bc-a168-55f2e409d7ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.408010 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b200d14-46d6-42bc-a168-55f2e409d7ac-logs\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.408041 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.408051 4939 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b200d14-46d6-42bc-a168-55f2e409d7ac-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.408061 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b200d14-46d6-42bc-a168-55f2e409d7ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.408070 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrjbb\" (UniqueName: \"kubernetes.io/projected/1b200d14-46d6-42bc-a168-55f2e409d7ac-kube-api-access-rrjbb\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.835829 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94zzq" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="registry-server" containerID="cri-o://fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c" gracePeriod=2 Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.836204 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8657ff8dfc-kv9vt" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.836755 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8657ff8dfc-kv9vt" event={"ID":"1b200d14-46d6-42bc-a168-55f2e409d7ac","Type":"ContainerDied","Data":"3fa471d1f5cf7d503b03969dff300924203374f3db6915f2223e353c479ef009"} Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.836840 4939 scope.go:117] "RemoveContainer" containerID="273de7b1bc874f18b38254eed8d6cf93d023354fb6949d56c72922eab0a5355a" Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.874254 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8657ff8dfc-kv9vt"] Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.889750 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8657ff8dfc-kv9vt"] Mar 18 17:19:21 crc kubenswrapper[4939]: I0318 17:19:21.998105 4939 scope.go:117] "RemoveContainer" containerID="a61724fa3d5d8ddde53717a92c957a5fa38414da17a59085e9d9027755ee0980" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.168018 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" path="/var/lib/kubelet/pods/1b200d14-46d6-42bc-a168-55f2e409d7ac/volumes" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.405223 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.530976 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mhrl\" (UniqueName: \"kubernetes.io/projected/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-kube-api-access-5mhrl\") pod \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.531101 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-utilities\") pod \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.531136 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-catalog-content\") pod \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\" (UID: \"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c\") " Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.532073 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-utilities" (OuterVolumeSpecName: "utilities") pod "1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" (UID: "1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.536779 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-kube-api-access-5mhrl" (OuterVolumeSpecName: "kube-api-access-5mhrl") pod "1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" (UID: "1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c"). InnerVolumeSpecName "kube-api-access-5mhrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.620850 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" (UID: "1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.634136 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mhrl\" (UniqueName: \"kubernetes.io/projected/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-kube-api-access-5mhrl\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.634201 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.634222 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.854782 4939 generic.go:334] "Generic (PLEG): container finished" podID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerID="fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c" exitCode=0 Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.854851 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94zzq" event={"ID":"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c","Type":"ContainerDied","Data":"fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c"} Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.855708 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94zzq" event={"ID":"1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c","Type":"ContainerDied","Data":"c84a03db6a8e03f7374c95bdd16118b33ea841c37b10d0c130d1c56ce4773731"} Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.855753 4939 scope.go:117] "RemoveContainer" containerID="fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.854893 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94zzq" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.886952 4939 scope.go:117] "RemoveContainer" containerID="317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.915836 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94zzq"] Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.934495 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94zzq"] Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.943579 4939 scope.go:117] "RemoveContainer" containerID="0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.979616 4939 scope.go:117] "RemoveContainer" containerID="fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c" Mar 18 17:19:22 crc kubenswrapper[4939]: E0318 17:19:22.980199 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c\": container with ID starting with fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c not found: ID does not exist" containerID="fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.980236 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c"} err="failed to get container status \"fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c\": rpc error: code = NotFound desc = could not find container \"fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c\": container with ID starting with fae1ebd701b1ebe30e4595cddab6b275401ce07a870a3f11cedb7a39bec7727c not found: ID does not exist" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.980258 4939 scope.go:117] "RemoveContainer" containerID="317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161" Mar 18 17:19:22 crc kubenswrapper[4939]: E0318 17:19:22.980740 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161\": container with ID starting with 317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161 not found: ID does not exist" containerID="317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.980802 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161"} err="failed to get container status \"317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161\": rpc error: code = NotFound desc = could not find container \"317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161\": container with ID starting with 317b63b320788be40a7d29eab94fa8ca4acfced0d3d39195645b373d84203161 not found: ID does not exist" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.980838 4939 scope.go:117] "RemoveContainer" containerID="0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0" Mar 18 17:19:22 crc kubenswrapper[4939]: E0318 17:19:22.981172 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0\": container with ID starting with 0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0 not found: ID does not exist" containerID="0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0" Mar 18 17:19:22 crc kubenswrapper[4939]: I0318 17:19:22.981215 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0"} err="failed to get container status \"0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0\": rpc error: code = NotFound desc = could not find container \"0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0\": container with ID starting with 0856014cf705acf1bccd728a307c4b54a02dca77c3506a84117cabe8c99edaf0 not found: ID does not exist" Mar 18 17:19:24 crc kubenswrapper[4939]: I0318 17:19:24.157709 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" path="/var/lib/kubelet/pods/1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c/volumes" Mar 18 17:19:26 crc kubenswrapper[4939]: I0318 17:19:26.141754 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:19:26 crc kubenswrapper[4939]: E0318 17:19:26.143867 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:19:27 crc kubenswrapper[4939]: I0318 17:19:27.048292 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pdcb8"] Mar 18 17:19:27 crc kubenswrapper[4939]: I0318 17:19:27.056093 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pdcb8"] Mar 18 17:19:28 crc kubenswrapper[4939]: I0318 17:19:28.170969 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78edb94-7549-4392-b989-2373d238e37b" path="/var/lib/kubelet/pods/b78edb94-7549-4392-b989-2373d238e37b/volumes" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.028734 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d866dd55-p86pc"] Mar 18 17:19:33 crc kubenswrapper[4939]: E0318 17:19:33.029564 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="extract-content" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029647 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="extract-content" Mar 18 17:19:33 crc kubenswrapper[4939]: E0318 17:19:33.029663 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="extract-utilities" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029669 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="extract-utilities" Mar 18 17:19:33 crc kubenswrapper[4939]: E0318 17:19:33.029683 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" 
containerName="horizon-log" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029689 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon-log" Mar 18 17:19:33 crc kubenswrapper[4939]: E0318 17:19:33.029704 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="registry-server" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029709 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="registry-server" Mar 18 17:19:33 crc kubenswrapper[4939]: E0318 17:19:33.029722 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029727 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029918 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029929 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a17b2b5-7a9c-4c4d-a9e2-9df75c82826c" containerName="registry-server" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.029944 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b200d14-46d6-42bc-a168-55f2e409d7ac" containerName="horizon-log" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.030946 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.048141 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d866dd55-p86pc"] Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.100748 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67c3c-da84-435f-b25e-f45575becb1b-scripts\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.100848 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12b67c3c-da84-435f-b25e-f45575becb1b-logs\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.100879 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b67c3c-da84-435f-b25e-f45575becb1b-config-data\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.101207 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12b67c3c-da84-435f-b25e-f45575becb1b-horizon-secret-key\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.101300 4939 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjs98\" (UniqueName: \"kubernetes.io/projected/12b67c3c-da84-435f-b25e-f45575becb1b-kube-api-access-zjs98\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.203215 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67c3c-da84-435f-b25e-f45575becb1b-scripts\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.203296 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12b67c3c-da84-435f-b25e-f45575becb1b-logs\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.203323 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b67c3c-da84-435f-b25e-f45575becb1b-config-data\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.203449 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12b67c3c-da84-435f-b25e-f45575becb1b-horizon-secret-key\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.203467 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjs98\" (UniqueName: \"kubernetes.io/projected/12b67c3c-da84-435f-b25e-f45575becb1b-kube-api-access-zjs98\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.204327 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12b67c3c-da84-435f-b25e-f45575becb1b-logs\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.205018 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12b67c3c-da84-435f-b25e-f45575becb1b-scripts\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.205144 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b67c3c-da84-435f-b25e-f45575becb1b-config-data\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.209518 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12b67c3c-da84-435f-b25e-f45575becb1b-horizon-secret-key\") pod 
\"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.221273 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjs98\" (UniqueName: \"kubernetes.io/projected/12b67c3c-da84-435f-b25e-f45575becb1b-kube-api-access-zjs98\") pod \"horizon-d866dd55-p86pc\" (UID: \"12b67c3c-da84-435f-b25e-f45575becb1b\") " pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.350133 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.842680 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d866dd55-p86pc"] Mar 18 17:19:33 crc kubenswrapper[4939]: I0318 17:19:33.981172 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d866dd55-p86pc" event={"ID":"12b67c3c-da84-435f-b25e-f45575becb1b","Type":"ContainerStarted","Data":"a343e7213eaf9d628e9f49e4f180fe43e2fc56ceec211e2e351ae3bd7292fe57"} Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.147306 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-hp82l"] Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.148534 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-hp82l"] Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.148627 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.224253 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e24266-e4ab-4f96-8749-8fcc70366102-operator-scripts\") pod \"heat-db-create-hp82l\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.224843 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwn86\" (UniqueName: \"kubernetes.io/projected/d4e24266-e4ab-4f96-8749-8fcc70366102-kube-api-access-fwn86\") pod \"heat-db-create-hp82l\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.233826 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-5ee2-account-create-update-xrjf5"] Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.235649 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.241804 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.244910 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5ee2-account-create-update-xrjf5"] Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.326545 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhcw\" (UniqueName: \"kubernetes.io/projected/4b4786eb-9145-45be-bcd2-4771ce87936e-kube-api-access-zvhcw\") pod \"heat-5ee2-account-create-update-xrjf5\" (UID: \"4b4786eb-9145-45be-bcd2-4771ce87936e\") " pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.326906 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwn86\" (UniqueName: \"kubernetes.io/projected/d4e24266-e4ab-4f96-8749-8fcc70366102-kube-api-access-fwn86\") pod \"heat-db-create-hp82l\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.327359 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4786eb-9145-45be-bcd2-4771ce87936e-operator-scripts\") pod \"heat-5ee2-account-create-update-xrjf5\" (UID: \"4b4786eb-9145-45be-bcd2-4771ce87936e\") " pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.327525 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e24266-e4ab-4f96-8749-8fcc70366102-operator-scripts\") pod \"heat-db-create-hp82l\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.328279 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e24266-e4ab-4f96-8749-8fcc70366102-operator-scripts\") pod \"heat-db-create-hp82l\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.360638 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwn86\" (UniqueName: \"kubernetes.io/projected/d4e24266-e4ab-4f96-8749-8fcc70366102-kube-api-access-fwn86\") pod \"heat-db-create-hp82l\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.430531 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4786eb-9145-45be-bcd2-4771ce87936e-operator-scripts\") pod \"heat-5ee2-account-create-update-xrjf5\" (UID: \"4b4786eb-9145-45be-bcd2-4771ce87936e\") " pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.430688 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhcw\" (UniqueName: \"kubernetes.io/projected/4b4786eb-9145-45be-bcd2-4771ce87936e-kube-api-access-zvhcw\") pod \"heat-5ee2-account-create-update-xrjf5\" (UID: 
\"4b4786eb-9145-45be-bcd2-4771ce87936e\") " pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.431599 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4786eb-9145-45be-bcd2-4771ce87936e-operator-scripts\") pod \"heat-5ee2-account-create-update-xrjf5\" (UID: \"4b4786eb-9145-45be-bcd2-4771ce87936e\") " pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.449005 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhcw\" (UniqueName: \"kubernetes.io/projected/4b4786eb-9145-45be-bcd2-4771ce87936e-kube-api-access-zvhcw\") pod \"heat-5ee2-account-create-update-xrjf5\" (UID: \"4b4786eb-9145-45be-bcd2-4771ce87936e\") " pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.473942 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-hp82l" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.578889 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.945928 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-hp82l"] Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.992443 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d866dd55-p86pc" event={"ID":"12b67c3c-da84-435f-b25e-f45575becb1b","Type":"ContainerStarted","Data":"240b30763276fb4cd82a64bc9b0012587361ac91db3c73e45d815f859425589c"} Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.992546 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d866dd55-p86pc" event={"ID":"12b67c3c-da84-435f-b25e-f45575becb1b","Type":"ContainerStarted","Data":"f01adc8156726653fa43e2abdf728d6578711d79715f01487a20f6297854cfee"} Mar 18 17:19:34 crc kubenswrapper[4939]: I0318 17:19:34.994836 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-hp82l" event={"ID":"d4e24266-e4ab-4f96-8749-8fcc70366102","Type":"ContainerStarted","Data":"dce6b422ab6a590d129383033a2b978a7544d90740270977802fc90fba27d365"} Mar 18 17:19:35 crc kubenswrapper[4939]: I0318 17:19:35.029304 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-d866dd55-p86pc" podStartSLOduration=2.029279276 podStartE2EDuration="2.029279276s" podCreationTimestamp="2026-03-18 17:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:19:35.024816629 +0000 UTC m=+6139.624004260" watchObservedRunningTime="2026-03-18 17:19:35.029279276 +0000 UTC m=+6139.628466897" Mar 18 17:19:35 crc kubenswrapper[4939]: I0318 17:19:35.078663 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5ee2-account-create-update-xrjf5"] Mar 18 17:19:36 crc kubenswrapper[4939]: I0318 17:19:36.016211 4939 generic.go:334] "Generic (PLEG): container finished" podID="4b4786eb-9145-45be-bcd2-4771ce87936e" containerID="39150643f63c7adf6f50ecac7c87ed326d9b359fc2760be370790a02e634573f" exitCode=0 Mar 18 17:19:36 crc kubenswrapper[4939]: I0318 17:19:36.016363 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5ee2-account-create-update-xrjf5" 
event={"ID":"4b4786eb-9145-45be-bcd2-4771ce87936e","Type":"ContainerDied","Data":"39150643f63c7adf6f50ecac7c87ed326d9b359fc2760be370790a02e634573f"} Mar 18 17:19:36 crc kubenswrapper[4939]: I0318 17:19:36.016521 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5ee2-account-create-update-xrjf5" event={"ID":"4b4786eb-9145-45be-bcd2-4771ce87936e","Type":"ContainerStarted","Data":"05874b8727238726b90c4cbba828c32131fa0fecdd40afa6530220143fc7904f"} Mar 18 17:19:36 crc kubenswrapper[4939]: I0318 17:19:36.019811 4939 generic.go:334] "Generic (PLEG): container finished" podID="d4e24266-e4ab-4f96-8749-8fcc70366102" containerID="4dc4b2cb200986b8bad6585d1f4f7611a450ac3e090271a684a92aa884828d14" exitCode=0 Mar 18 17:19:36 crc kubenswrapper[4939]: I0318 17:19:36.020144 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-hp82l" event={"ID":"d4e24266-e4ab-4f96-8749-8fcc70366102","Type":"ContainerDied","Data":"4dc4b2cb200986b8bad6585d1f4f7611a450ac3e090271a684a92aa884828d14"} Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.135807 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:19:37 crc kubenswrapper[4939]: E0318 17:19:37.136189 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.472345 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.478796 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-hp82l" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.518782 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e24266-e4ab-4f96-8749-8fcc70366102-operator-scripts\") pod \"d4e24266-e4ab-4f96-8749-8fcc70366102\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.518935 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4786eb-9145-45be-bcd2-4771ce87936e-operator-scripts\") pod \"4b4786eb-9145-45be-bcd2-4771ce87936e\" (UID: \"4b4786eb-9145-45be-bcd2-4771ce87936e\") " Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.519020 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvhcw\" (UniqueName: \"kubernetes.io/projected/4b4786eb-9145-45be-bcd2-4771ce87936e-kube-api-access-zvhcw\") pod \"4b4786eb-9145-45be-bcd2-4771ce87936e\" (UID: \"4b4786eb-9145-45be-bcd2-4771ce87936e\") " Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.519090 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwn86\" (UniqueName: \"kubernetes.io/projected/d4e24266-e4ab-4f96-8749-8fcc70366102-kube-api-access-fwn86\") pod \"d4e24266-e4ab-4f96-8749-8fcc70366102\" (UID: \"d4e24266-e4ab-4f96-8749-8fcc70366102\") " Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.519465 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4786eb-9145-45be-bcd2-4771ce87936e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b4786eb-9145-45be-bcd2-4771ce87936e" (UID: "4b4786eb-9145-45be-bcd2-4771ce87936e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.521261 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b4786eb-9145-45be-bcd2-4771ce87936e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.524589 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4e24266-e4ab-4f96-8749-8fcc70366102-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4e24266-e4ab-4f96-8749-8fcc70366102" (UID: "d4e24266-e4ab-4f96-8749-8fcc70366102"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.529741 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4786eb-9145-45be-bcd2-4771ce87936e-kube-api-access-zvhcw" (OuterVolumeSpecName: "kube-api-access-zvhcw") pod "4b4786eb-9145-45be-bcd2-4771ce87936e" (UID: "4b4786eb-9145-45be-bcd2-4771ce87936e"). InnerVolumeSpecName "kube-api-access-zvhcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.530229 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e24266-e4ab-4f96-8749-8fcc70366102-kube-api-access-fwn86" (OuterVolumeSpecName: "kube-api-access-fwn86") pod "d4e24266-e4ab-4f96-8749-8fcc70366102" (UID: "d4e24266-e4ab-4f96-8749-8fcc70366102"). InnerVolumeSpecName "kube-api-access-fwn86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.623044 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4e24266-e4ab-4f96-8749-8fcc70366102-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.623078 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvhcw\" (UniqueName: \"kubernetes.io/projected/4b4786eb-9145-45be-bcd2-4771ce87936e-kube-api-access-zvhcw\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:37 crc kubenswrapper[4939]: I0318 17:19:37.623097 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwn86\" (UniqueName: \"kubernetes.io/projected/d4e24266-e4ab-4f96-8749-8fcc70366102-kube-api-access-fwn86\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:38 crc kubenswrapper[4939]: I0318 17:19:38.049415 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5ee2-account-create-update-xrjf5" Mar 18 17:19:38 crc kubenswrapper[4939]: I0318 17:19:38.049433 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5ee2-account-create-update-xrjf5" event={"ID":"4b4786eb-9145-45be-bcd2-4771ce87936e","Type":"ContainerDied","Data":"05874b8727238726b90c4cbba828c32131fa0fecdd40afa6530220143fc7904f"} Mar 18 17:19:38 crc kubenswrapper[4939]: I0318 17:19:38.049492 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05874b8727238726b90c4cbba828c32131fa0fecdd40afa6530220143fc7904f" Mar 18 17:19:38 crc kubenswrapper[4939]: I0318 17:19:38.052837 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-hp82l" event={"ID":"d4e24266-e4ab-4f96-8749-8fcc70366102","Type":"ContainerDied","Data":"dce6b422ab6a590d129383033a2b978a7544d90740270977802fc90fba27d365"} Mar 18 17:19:38 crc kubenswrapper[4939]: I0318 17:19:38.052879 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce6b422ab6a590d129383033a2b978a7544d90740270977802fc90fba27d365" Mar 18 17:19:38 crc kubenswrapper[4939]: I0318 17:19:38.052918 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-hp82l" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.346257 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-8tp6q"] Mar 18 17:19:39 crc kubenswrapper[4939]: E0318 17:19:39.347291 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4786eb-9145-45be-bcd2-4771ce87936e" containerName="mariadb-account-create-update" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.347313 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4786eb-9145-45be-bcd2-4771ce87936e" containerName="mariadb-account-create-update" Mar 18 17:19:39 crc kubenswrapper[4939]: E0318 17:19:39.347353 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e24266-e4ab-4f96-8749-8fcc70366102" containerName="mariadb-database-create" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.347365 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e24266-e4ab-4f96-8749-8fcc70366102" containerName="mariadb-database-create" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.347730 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b4786eb-9145-45be-bcd2-4771ce87936e" containerName="mariadb-account-create-update" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.347754 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e24266-e4ab-4f96-8749-8fcc70366102" containerName="mariadb-database-create" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.348947 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.353261 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-lwhrj" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.353291 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.353963 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9m4\" (UniqueName: \"kubernetes.io/projected/a209284b-60ea-46b1-844d-e757771e0567-kube-api-access-6r9m4\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.354002 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-combined-ca-bundle\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.354068 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-config-data\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.363152 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-8tp6q"] Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.457293 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-config-data\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.458061 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9m4\" (UniqueName: \"kubernetes.io/projected/a209284b-60ea-46b1-844d-e757771e0567-kube-api-access-6r9m4\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.458176 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-combined-ca-bundle\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.464793 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-combined-ca-bundle\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.464832 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-config-data\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.482462 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r9m4\" (UniqueName: \"kubernetes.io/projected/a209284b-60ea-46b1-844d-e757771e0567-kube-api-access-6r9m4\") pod \"heat-db-sync-8tp6q\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:39 crc kubenswrapper[4939]: I0318 17:19:39.679066 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:40 crc kubenswrapper[4939]: I0318 17:19:40.196416 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-8tp6q"] Mar 18 17:19:41 crc kubenswrapper[4939]: I0318 17:19:41.086627 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8tp6q" event={"ID":"a209284b-60ea-46b1-844d-e757771e0567","Type":"ContainerStarted","Data":"4e76745fb9b9dc2dc57d1a4422f69d594a0f6b4d6f7d46b4ca4810bb970e6f74"} Mar 18 17:19:43 crc kubenswrapper[4939]: I0318 17:19:43.350418 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:43 crc kubenswrapper[4939]: I0318 17:19:43.351615 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:47 crc kubenswrapper[4939]: I0318 17:19:47.167420 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8tp6q" event={"ID":"a209284b-60ea-46b1-844d-e757771e0567","Type":"ContainerStarted","Data":"a6982fe013da4425b9ee6a75b0bfa75b90b4f0be0766fdccd67e91f0b6ea13f7"} Mar 18 17:19:47 crc kubenswrapper[4939]: I0318 17:19:47.198164 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-8tp6q" podStartSLOduration=1.929671705 podStartE2EDuration="8.198144069s" podCreationTimestamp="2026-03-18 17:19:39 +0000 UTC" firstStartedPulling="2026-03-18 17:19:40.197491505 +0000 UTC m=+6144.796679126" lastFinishedPulling="2026-03-18 17:19:46.465963859 +0000 UTC m=+6151.065151490" observedRunningTime="2026-03-18 17:19:47.182562617 +0000 UTC m=+6151.781750248" watchObservedRunningTime="2026-03-18 17:19:47.198144069 +0000 UTC m=+6151.797331700" Mar 18 17:19:49 crc kubenswrapper[4939]: I0318 17:19:49.195014 4939 generic.go:334] "Generic (PLEG): container finished" podID="a209284b-60ea-46b1-844d-e757771e0567" containerID="a6982fe013da4425b9ee6a75b0bfa75b90b4f0be0766fdccd67e91f0b6ea13f7" exitCode=0 Mar 18 17:19:49 crc kubenswrapper[4939]: I0318 17:19:49.195115 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8tp6q" event={"ID":"a209284b-60ea-46b1-844d-e757771e0567","Type":"ContainerDied","Data":"a6982fe013da4425b9ee6a75b0bfa75b90b4f0be0766fdccd67e91f0b6ea13f7"} Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.509681 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.706251 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-combined-ca-bundle\") pod \"a209284b-60ea-46b1-844d-e757771e0567\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.706573 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-config-data\") pod \"a209284b-60ea-46b1-844d-e757771e0567\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.706632 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r9m4\" (UniqueName: \"kubernetes.io/projected/a209284b-60ea-46b1-844d-e757771e0567-kube-api-access-6r9m4\") pod \"a209284b-60ea-46b1-844d-e757771e0567\" (UID: \"a209284b-60ea-46b1-844d-e757771e0567\") " Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.712905 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a209284b-60ea-46b1-844d-e757771e0567-kube-api-access-6r9m4" (OuterVolumeSpecName: "kube-api-access-6r9m4") pod "a209284b-60ea-46b1-844d-e757771e0567" (UID: "a209284b-60ea-46b1-844d-e757771e0567"). InnerVolumeSpecName "kube-api-access-6r9m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.748793 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a209284b-60ea-46b1-844d-e757771e0567" (UID: "a209284b-60ea-46b1-844d-e757771e0567"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.808255 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-config-data" (OuterVolumeSpecName: "config-data") pod "a209284b-60ea-46b1-844d-e757771e0567" (UID: "a209284b-60ea-46b1-844d-e757771e0567"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.809474 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r9m4\" (UniqueName: \"kubernetes.io/projected/a209284b-60ea-46b1-844d-e757771e0567-kube-api-access-6r9m4\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.809524 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:50 crc kubenswrapper[4939]: I0318 17:19:50.809538 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a209284b-60ea-46b1-844d-e757771e0567-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:51 crc kubenswrapper[4939]: I0318 17:19:51.249722 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8tp6q" event={"ID":"a209284b-60ea-46b1-844d-e757771e0567","Type":"ContainerDied","Data":"4e76745fb9b9dc2dc57d1a4422f69d594a0f6b4d6f7d46b4ca4810bb970e6f74"} Mar 18 17:19:51 crc kubenswrapper[4939]: I0318 17:19:51.250074 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e76745fb9b9dc2dc57d1a4422f69d594a0f6b4d6f7d46b4ca4810bb970e6f74" Mar 18 17:19:51 crc kubenswrapper[4939]: I0318 17:19:51.249776 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8tp6q" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.134858 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:19:52 crc kubenswrapper[4939]: E0318 17:19:52.135386 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.383458 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-684bfd6b7d-5ncdg"] Mar 18 17:19:52 crc kubenswrapper[4939]: E0318 17:19:52.385367 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a209284b-60ea-46b1-844d-e757771e0567" containerName="heat-db-sync" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.385643 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a209284b-60ea-46b1-844d-e757771e0567" containerName="heat-db-sync" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.385930 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a209284b-60ea-46b1-844d-e757771e0567" containerName="heat-db-sync" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.386701 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.397690 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.403085 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-lwhrj" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.403355 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.454156 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-684bfd6b7d-5ncdg"] Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.475332 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7dfbc5946-xgvcs"] Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.477050 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.480836 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.484401 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dfbc5946-xgvcs"] Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.542633 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6fd9d6d4d6-crbpz"] Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.544098 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.550081 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.558269 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6fd9d6d4d6-crbpz"] Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.559586 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8zfs\" (UniqueName: \"kubernetes.io/projected/619ea53f-adf0-4b62-8070-d26ce9739a92-kube-api-access-k8zfs\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.559674 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-combined-ca-bundle\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.559800 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-config-data\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.559931 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-config-data-custom\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661171 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8zfs\" (UniqueName: \"kubernetes.io/projected/619ea53f-adf0-4b62-8070-d26ce9739a92-kube-api-access-k8zfs\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661216 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-config-data\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661251 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-combined-ca-bundle\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661284 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-config-data-custom\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661329 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x9qx\" (UniqueName: \"kubernetes.io/projected/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-kube-api-access-8x9qx\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661355 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-combined-ca-bundle\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661382 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8lt2\" (UniqueName: \"kubernetes.io/projected/dd05e31f-c125-4408-88dd-c95a60e1e05d-kube-api-access-t8lt2\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661401 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-config-data\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661427 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-config-data\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661466 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-config-data-custom\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661493 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-combined-ca-bundle\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.661530 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-config-data-custom\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.667881 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-config-data\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.679357 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-config-data-custom\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.680262 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619ea53f-adf0-4b62-8070-d26ce9739a92-combined-ca-bundle\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.685807 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8zfs\" (UniqueName: \"kubernetes.io/projected/619ea53f-adf0-4b62-8070-d26ce9739a92-kube-api-access-k8zfs\") pod \"heat-engine-684bfd6b7d-5ncdg\" (UID: \"619ea53f-adf0-4b62-8070-d26ce9739a92\") " pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.735699 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.763251 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-combined-ca-bundle\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.763717 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-config-data\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.763796 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-config-data-custom\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.763831 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x9qx\" (UniqueName: \"kubernetes.io/projected/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-kube-api-access-8x9qx\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.763886 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-combined-ca-bundle\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.763928 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8lt2\" (UniqueName: \"kubernetes.io/projected/dd05e31f-c125-4408-88dd-c95a60e1e05d-kube-api-access-t8lt2\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.763992 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-config-data\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.764065 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-config-data-custom\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.770913 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-combined-ca-bundle\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " 
pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.771344 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-combined-ca-bundle\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.771804 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-config-data-custom\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.774035 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-config-data-custom\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.776297 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05e31f-c125-4408-88dd-c95a60e1e05d-config-data\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.788520 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-config-data\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.790177 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x9qx\" (UniqueName: \"kubernetes.io/projected/a5be3887-44b0-45e5-ab55-8b70aba4b0bb-kube-api-access-8x9qx\") pod \"heat-cfnapi-7dfbc5946-xgvcs\" (UID: \"a5be3887-44b0-45e5-ab55-8b70aba4b0bb\") " pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.798459 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.807188 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8lt2\" (UniqueName: \"kubernetes.io/projected/dd05e31f-c125-4408-88dd-c95a60e1e05d-kube-api-access-t8lt2\") pod \"heat-api-6fd9d6d4d6-crbpz\" (UID: \"dd05e31f-c125-4408-88dd-c95a60e1e05d\") " pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:52 crc kubenswrapper[4939]: I0318 17:19:52.866994 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:53 crc kubenswrapper[4939]: I0318 17:19:53.297532 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-684bfd6b7d-5ncdg"] Mar 18 17:19:53 crc kubenswrapper[4939]: I0318 17:19:53.379364 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dfbc5946-xgvcs"] Mar 18 17:19:53 crc kubenswrapper[4939]: W0318 17:19:53.385791 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5be3887_44b0_45e5_ab55_8b70aba4b0bb.slice/crio-2d3060b1accf49d0e5a9d66c9414aef5d380b0cd4563f9461715286a9fd2468d WatchSource:0}: Error finding container 2d3060b1accf49d0e5a9d66c9414aef5d380b0cd4563f9461715286a9fd2468d: Status 404 returned error can't find the container with id 2d3060b1accf49d0e5a9d66c9414aef5d380b0cd4563f9461715286a9fd2468d Mar 18 17:19:53 crc kubenswrapper[4939]: I0318 17:19:53.479754 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6fd9d6d4d6-crbpz"] Mar 18 17:19:54 crc kubenswrapper[4939]: I0318 17:19:54.284688 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" event={"ID":"a5be3887-44b0-45e5-ab55-8b70aba4b0bb","Type":"ContainerStarted","Data":"2d3060b1accf49d0e5a9d66c9414aef5d380b0cd4563f9461715286a9fd2468d"} Mar 18 17:19:54 crc kubenswrapper[4939]: I0318 17:19:54.286969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6fd9d6d4d6-crbpz" event={"ID":"dd05e31f-c125-4408-88dd-c95a60e1e05d","Type":"ContainerStarted","Data":"839b7477be971d4b4240fa2584d3b8ee47ea3850f8e2419108631a42de6af5a7"} Mar 18 17:19:54 crc kubenswrapper[4939]: I0318 17:19:54.288996 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-684bfd6b7d-5ncdg" event={"ID":"619ea53f-adf0-4b62-8070-d26ce9739a92","Type":"ContainerStarted","Data":"ebd5bbb8702b83c6cffef2b37328c9cdc0030303e61b69746800b859961ba6df"} Mar 18 17:19:54 crc kubenswrapper[4939]: I0318 17:19:54.289035 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-684bfd6b7d-5ncdg" event={"ID":"619ea53f-adf0-4b62-8070-d26ce9739a92","Type":"ContainerStarted","Data":"ec282cbeced62e941d466c76e58a720bfcf667f9ecd09f6ff39e28888dd0fac1"} Mar 18 17:19:54 crc kubenswrapper[4939]: I0318 17:19:54.289164 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:19:55 crc kubenswrapper[4939]: I0318 17:19:55.120117 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:55 crc kubenswrapper[4939]: I0318 17:19:55.142199 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-684bfd6b7d-5ncdg" podStartSLOduration=3.142169133 podStartE2EDuration="3.142169133s" podCreationTimestamp="2026-03-18 17:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:19:54.307848471 +0000 UTC m=+6158.907036102" watchObservedRunningTime="2026-03-18 17:19:55.142169133 +0000 UTC m=+6159.741356754" Mar 18 17:19:56 crc kubenswrapper[4939]: I0318 17:19:56.307328 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" 
event={"ID":"a5be3887-44b0-45e5-ab55-8b70aba4b0bb","Type":"ContainerStarted","Data":"0f53010b7ffdfff5d4bfdb450568bac051f3ee3131364283b79748038dcbf814"} Mar 18 17:19:56 crc kubenswrapper[4939]: I0318 17:19:56.307678 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:19:56 crc kubenswrapper[4939]: I0318 17:19:56.308874 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6fd9d6d4d6-crbpz" event={"ID":"dd05e31f-c125-4408-88dd-c95a60e1e05d","Type":"ContainerStarted","Data":"115cc45d2eab881d1ea2a29464379ef5a0c1e6bfa77817d2e3775f21ff3b96b3"} Mar 18 17:19:56 crc kubenswrapper[4939]: I0318 17:19:56.309019 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:19:56 crc kubenswrapper[4939]: I0318 17:19:56.396132 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" podStartSLOduration=2.278425357 podStartE2EDuration="4.396116511s" podCreationTimestamp="2026-03-18 17:19:52 +0000 UTC" firstStartedPulling="2026-03-18 17:19:53.388205757 +0000 UTC m=+6157.987393378" lastFinishedPulling="2026-03-18 17:19:55.505896911 +0000 UTC m=+6160.105084532" observedRunningTime="2026-03-18 17:19:56.366998794 +0000 UTC m=+6160.966186415" watchObservedRunningTime="2026-03-18 17:19:56.396116511 +0000 UTC m=+6160.995304132" Mar 18 17:19:57 crc kubenswrapper[4939]: I0318 17:19:57.471856 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-d866dd55-p86pc" Mar 18 17:19:57 crc kubenswrapper[4939]: I0318 17:19:57.502061 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6fd9d6d4d6-crbpz" podStartSLOduration=3.482816456 podStartE2EDuration="5.502041655s" podCreationTimestamp="2026-03-18 17:19:52 +0000 UTC" firstStartedPulling="2026-03-18 17:19:53.488894896 +0000 UTC m=+6158.088082517" lastFinishedPulling="2026-03-18 17:19:55.508120095 +0000 UTC m=+6160.107307716" observedRunningTime="2026-03-18 17:19:56.393374503 +0000 UTC m=+6160.992562134" watchObservedRunningTime="2026-03-18 17:19:57.502041655 +0000 UTC m=+6162.101229276" Mar 18 17:19:57 crc kubenswrapper[4939]: I0318 17:19:57.553611 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-868b5b4f6c-vw95m"] Mar 18 17:19:57 crc kubenswrapper[4939]: I0318 17:19:57.554199 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-868b5b4f6c-vw95m" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon-log" containerID="cri-o://3312e48ecb177192bacdba1fad46a00504efcb1dd4373992f48f8e74dd64eb89" gracePeriod=30 Mar 18 17:19:57 crc kubenswrapper[4939]: I0318 17:19:57.554280 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-868b5b4f6c-vw95m" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" containerID="cri-o://28d17527b3553e326ae4904ca87a1278ee7520f88c31d66d15698c2c6d5204d5" gracePeriod=30 Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.177663 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564240-vwblv"] Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.180069 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-vwblv" Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.189892 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.189936 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.190183 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.195233 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564240-vwblv"] Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.344803 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6966v\" (UniqueName: \"kubernetes.io/projected/473055c8-6c3b-4f69-bac4-51661019986c-kube-api-access-6966v\") pod \"auto-csr-approver-29564240-vwblv\" (UID: \"473055c8-6c3b-4f69-bac4-51661019986c\") " pod="openshift-infra/auto-csr-approver-29564240-vwblv" Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.448604 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6966v\" (UniqueName: \"kubernetes.io/projected/473055c8-6c3b-4f69-bac4-51661019986c-kube-api-access-6966v\") pod \"auto-csr-approver-29564240-vwblv\" (UID: \"473055c8-6c3b-4f69-bac4-51661019986c\") " pod="openshift-infra/auto-csr-approver-29564240-vwblv" Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.477370 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6966v\" (UniqueName: \"kubernetes.io/projected/473055c8-6c3b-4f69-bac4-51661019986c-kube-api-access-6966v\") pod \"auto-csr-approver-29564240-vwblv\" (UID: \"473055c8-6c3b-4f69-bac4-51661019986c\") " pod="openshift-infra/auto-csr-approver-29564240-vwblv" Mar 18 17:20:00 crc kubenswrapper[4939]: I0318 17:20:00.518853 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-vwblv" Mar 18 17:20:01 crc kubenswrapper[4939]: I0318 17:20:01.039012 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564240-vwblv"] Mar 18 17:20:01 crc kubenswrapper[4939]: I0318 17:20:01.357965 4939 generic.go:334] "Generic (PLEG): container finished" podID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerID="28d17527b3553e326ae4904ca87a1278ee7520f88c31d66d15698c2c6d5204d5" exitCode=0 Mar 18 17:20:01 crc kubenswrapper[4939]: I0318 17:20:01.358046 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868b5b4f6c-vw95m" event={"ID":"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3","Type":"ContainerDied","Data":"28d17527b3553e326ae4904ca87a1278ee7520f88c31d66d15698c2c6d5204d5"} Mar 18 17:20:01 crc kubenswrapper[4939]: I0318 17:20:01.360537 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564240-vwblv" event={"ID":"473055c8-6c3b-4f69-bac4-51661019986c","Type":"ContainerStarted","Data":"cc4019af88b338c4484f362d68d655fd15ebc978f231d10d1cd7bf0bda616b50"} Mar 18 17:20:03 crc kubenswrapper[4939]: I0318 17:20:03.134191 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:20:03 crc kubenswrapper[4939]: E0318 17:20:03.135038 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:20:03 crc kubenswrapper[4939]: I0318 17:20:03.397980 4939 generic.go:334] "Generic (PLEG): container finished" podID="473055c8-6c3b-4f69-bac4-51661019986c" containerID="f58d00469d2c3254d3adf80a8137a02f2ae26cfb452fea7f2f7f608af908ab80" exitCode=0 Mar 18 17:20:03 crc kubenswrapper[4939]: I0318 17:20:03.398405 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564240-vwblv" event={"ID":"473055c8-6c3b-4f69-bac4-51661019986c","Type":"ContainerDied","Data":"f58d00469d2c3254d3adf80a8137a02f2ae26cfb452fea7f2f7f608af908ab80"} Mar 18 17:20:04 crc kubenswrapper[4939]: I0318 17:20:04.205061 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7dfbc5946-xgvcs" Mar 18 17:20:04 crc kubenswrapper[4939]: I0318 17:20:04.221469 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6fd9d6d4d6-crbpz" Mar 18 17:20:04 crc kubenswrapper[4939]: I0318 17:20:04.665656 4939 scope.go:117] "RemoveContainer" containerID="9868d634f98d06c0da552b82eccc0186d5727b6173d485d382ba024b5b2274fc" Mar 18 17:20:04 crc kubenswrapper[4939]: I0318 17:20:04.750421 4939 scope.go:117] "RemoveContainer" containerID="43a9b19a8415798264b80b3011b291ee0f50c31f66ac74c7ba12c96bb6f3e3ac" Mar 18 17:20:04 crc kubenswrapper[4939]: I0318 17:20:04.780687 4939 scope.go:117] "RemoveContainer" containerID="371859a9a1978286ff7b4748b354e0da4888881a4cf487f9d10229626d6ffe77" Mar 18 17:20:04 crc kubenswrapper[4939]: I0318 17:20:04.915680 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-vwblv" Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.051838 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6966v\" (UniqueName: \"kubernetes.io/projected/473055c8-6c3b-4f69-bac4-51661019986c-kube-api-access-6966v\") pod \"473055c8-6c3b-4f69-bac4-51661019986c\" (UID: \"473055c8-6c3b-4f69-bac4-51661019986c\") " Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.058722 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473055c8-6c3b-4f69-bac4-51661019986c-kube-api-access-6966v" (OuterVolumeSpecName: "kube-api-access-6966v") pod "473055c8-6c3b-4f69-bac4-51661019986c" (UID: "473055c8-6c3b-4f69-bac4-51661019986c"). InnerVolumeSpecName "kube-api-access-6966v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.154577 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6966v\" (UniqueName: \"kubernetes.io/projected/473055c8-6c3b-4f69-bac4-51661019986c-kube-api-access-6966v\") on node \"crc\" DevicePath \"\"" Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.419338 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564240-vwblv" event={"ID":"473055c8-6c3b-4f69-bac4-51661019986c","Type":"ContainerDied","Data":"cc4019af88b338c4484f362d68d655fd15ebc978f231d10d1cd7bf0bda616b50"} Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.419376 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc4019af88b338c4484f362d68d655fd15ebc978f231d10d1cd7bf0bda616b50" Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.419426 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-vwblv" Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.977836 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-zs8gk"] Mar 18 17:20:05 crc kubenswrapper[4939]: I0318 17:20:05.987305 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-zs8gk"] Mar 18 17:20:06 crc kubenswrapper[4939]: I0318 17:20:06.145970 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80da288e-4fb5-4729-a51d-4f15c5a3b25e" path="/var/lib/kubelet/pods/80da288e-4fb5-4729-a51d-4f15c5a3b25e/volumes" Mar 18 17:20:07 crc kubenswrapper[4939]: I0318 17:20:07.071989 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-868b5b4f6c-vw95m" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.155:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.155:8080: connect: connection refused" Mar 18 17:20:09 crc kubenswrapper[4939]: I0318 17:20:09.038313 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-77de-account-create-update-67wfn"] Mar 18 17:20:09 crc kubenswrapper[4939]: I0318 17:20:09.052486 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-77de-account-create-update-67wfn"] Mar 18 17:20:09 crc kubenswrapper[4939]: I0318 17:20:09.060675 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-xkw2j"] Mar 18 17:20:09 crc kubenswrapper[4939]: I0318 17:20:09.069268 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-xkw2j"] Mar 18 17:20:10 crc kubenswrapper[4939]: I0318 17:20:10.148247 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0da004-51dd-4cd0-85a2-318e5ecf9f26" path="/var/lib/kubelet/pods/2a0da004-51dd-4cd0-85a2-318e5ecf9f26/volumes" Mar 18 17:20:10 crc kubenswrapper[4939]: I0318 17:20:10.149690 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5880775f-1842-448f-b622-2805f5ac64f8" path="/var/lib/kubelet/pods/5880775f-1842-448f-b622-2805f5ac64f8/volumes" Mar 18 17:20:12 crc kubenswrapper[4939]: I0318 17:20:12.773303 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-684bfd6b7d-5ncdg" Mar 18 17:20:17 crc kubenswrapper[4939]: I0318 17:20:17.072335 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-868b5b4f6c-vw95m" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.155:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.155:8080: connect: connection refused" Mar 18 17:20:18 crc kubenswrapper[4939]: I0318 17:20:18.035471 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8h7zl"] Mar 18 17:20:18 crc kubenswrapper[4939]: I0318 17:20:18.043512 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8h7zl"] Mar 18 17:20:18 crc kubenswrapper[4939]: I0318 17:20:18.133102 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:20:18 crc kubenswrapper[4939]: E0318 17:20:18.133391 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Mar 18 17:20:18 crc kubenswrapper[4939]: I0318 17:20:18.148273 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a5ef95-f146-4eec-ba54-12a925005e5f" path="/var/lib/kubelet/pods/38a5ef95-f146-4eec-ba54-12a925005e5f/volumes"
Mar 18 17:20:27 crc kubenswrapper[4939]: I0318 17:20:27.073108 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-868b5b4f6c-vw95m" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.155:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.155:8080: connect: connection refused"
Mar 18 17:20:27 crc kubenswrapper[4939]: I0318 17:20:27.073873 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-868b5b4f6c-vw95m"
Mar 18 17:20:27 crc kubenswrapper[4939]: W0318 17:20:27.614813 4939 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491d1da7_8a7f_4c7f_9f4c_8aa5adef37d3.slice/crio-554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662": error while statting cgroup v2: [unable to parse /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491d1da7_8a7f_4c7f_9f4c_8aa5adef37d3.slice/crio-554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662/memory.stat: read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491d1da7_8a7f_4c7f_9f4c_8aa5adef37d3.slice/crio-554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662/memory.stat: no such device], continuing to push stats
Mar 18 17:20:27 crc kubenswrapper[4939]: I0318 17:20:27.665900 4939 generic.go:334] "Generic (PLEG): container finished" podID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerID="3312e48ecb177192bacdba1fad46a00504efcb1dd4373992f48f8e74dd64eb89" exitCode=137
Mar 18 17:20:27 crc kubenswrapper[4939]: I0318 17:20:27.665982 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868b5b4f6c-vw95m" event={"ID":"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3","Type":"ContainerDied","Data":"3312e48ecb177192bacdba1fad46a00504efcb1dd4373992f48f8e74dd64eb89"}
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.043397 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868b5b4f6c-vw95m"
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.219017 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-scripts\") pod \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") "
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.219124 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-logs\") pod \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") "
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.219185 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-config-data\") pod \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") "
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.219329 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-horizon-secret-key\") pod \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") "
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.219359 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvrj\" (UniqueName: \"kubernetes.io/projected/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-kube-api-access-4jvrj\") pod \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\" (UID: \"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3\") "
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.221740 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-logs" (OuterVolumeSpecName: "logs") pod "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" (UID: "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.226906 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" (UID: "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.227128 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-kube-api-access-4jvrj" (OuterVolumeSpecName: "kube-api-access-4jvrj") pod "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" (UID: "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3"). InnerVolumeSpecName "kube-api-access-4jvrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.253868 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-scripts" (OuterVolumeSpecName: "scripts") pod "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" (UID: "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.260530 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-config-data" (OuterVolumeSpecName: "config-data") pod "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" (UID: "491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.322120 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.322153 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-logs\") on node \"crc\" DevicePath \"\""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.322162 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.322173 4939 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.322183 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvrj\" (UniqueName: \"kubernetes.io/projected/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3-kube-api-access-4jvrj\") on node \"crc\" DevicePath \"\""
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.677346 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868b5b4f6c-vw95m" event={"ID":"491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3","Type":"ContainerDied","Data":"554d32a34b7314b1a984342f9b30e344f87784d81c264f5f4fc8f2fcbf923662"}
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.677408 4939 scope.go:117] "RemoveContainer" containerID="28d17527b3553e326ae4904ca87a1278ee7520f88c31d66d15698c2c6d5204d5"
Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.677411 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868b5b4f6c-vw95m"
Need to start a new one" pod="openstack/horizon-868b5b4f6c-vw95m" Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.722284 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-868b5b4f6c-vw95m"] Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.739110 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-868b5b4f6c-vw95m"] Mar 18 17:20:28 crc kubenswrapper[4939]: I0318 17:20:28.851816 4939 scope.go:117] "RemoveContainer" containerID="3312e48ecb177192bacdba1fad46a00504efcb1dd4373992f48f8e74dd64eb89" Mar 18 17:20:30 crc kubenswrapper[4939]: I0318 17:20:30.133709 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:20:30 crc kubenswrapper[4939]: E0318 17:20:30.134279 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:20:30 crc kubenswrapper[4939]: I0318 17:20:30.166562 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" path="/var/lib/kubelet/pods/491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3/volumes" Mar 18 17:20:45 crc kubenswrapper[4939]: I0318 17:20:45.133002 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:20:45 crc kubenswrapper[4939]: E0318 17:20:45.133630 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:20:49 crc kubenswrapper[4939]: I0318 17:20:49.061625 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8fbbj"] Mar 18 17:20:49 crc kubenswrapper[4939]: I0318 17:20:49.071577 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fba0-account-create-update-824gc"] Mar 18 17:20:49 crc kubenswrapper[4939]: I0318 17:20:49.084795 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fba0-account-create-update-824gc"] Mar 18 17:20:49 crc kubenswrapper[4939]: I0318 17:20:49.095333 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8fbbj"] Mar 18 17:20:50 crc kubenswrapper[4939]: I0318 17:20:50.144421 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="323a9023-e60c-4234-9691-3dc7cde07c64" path="/var/lib/kubelet/pods/323a9023-e60c-4234-9691-3dc7cde07c64/volumes" Mar 18 17:20:50 crc kubenswrapper[4939]: I0318 17:20:50.145157 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0e0d64-19f5-4bce-abb1-8b6344d04681" path="/var/lib/kubelet/pods/de0e0d64-19f5-4bce-abb1-8b6344d04681/volumes" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.598314 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k"] Mar 18 
17:20:51 crc kubenswrapper[4939]: E0318 17:20:51.598920 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473055c8-6c3b-4f69-bac4-51661019986c" containerName="oc" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.598940 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="473055c8-6c3b-4f69-bac4-51661019986c" containerName="oc" Mar 18 17:20:51 crc kubenswrapper[4939]: E0318 17:20:51.598967 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.598976 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" Mar 18 17:20:51 crc kubenswrapper[4939]: E0318 17:20:51.598991 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon-log" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.598999 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon-log" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.599238 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.599272 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="473055c8-6c3b-4f69-bac4-51661019986c" containerName="oc" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.599290 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="491d1da7-8a7f-4c7f-9f4c-8aa5adef37d3" containerName="horizon-log" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.601149 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.603179 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.610244 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k"] Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.667886 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.668191 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.668407 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6l55\" (UniqueName: \"kubernetes.io/projected/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-kube-api-access-f6l55\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.771716 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.771789 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.771933 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6l55\" (UniqueName: \"kubernetes.io/projected/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-kube-api-access-f6l55\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.772226 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.772462 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.796729 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6l55\" (UniqueName: \"kubernetes.io/projected/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-kube-api-access-f6l55\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:51 crc kubenswrapper[4939]: I0318 17:20:51.936029 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:52 crc kubenswrapper[4939]: I0318 17:20:52.535798 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k"] Mar 18 17:20:52 crc kubenswrapper[4939]: I0318 17:20:52.924758 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" event={"ID":"17abc20c-de7b-4ffd-a986-918dbb8cd4dd","Type":"ContainerStarted","Data":"1e84baab11c103148feb79ece39f4ea84cc4ed7a1bc894a7efd8d3a59265056c"} Mar 18 17:20:52 crc kubenswrapper[4939]: I0318 17:20:52.924812 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" event={"ID":"17abc20c-de7b-4ffd-a986-918dbb8cd4dd","Type":"ContainerStarted","Data":"9fe154ae28ea7b6ff7e0db439649b191b8922cc5252197fff430e9619f1f9a6a"} Mar 18 17:20:53 crc kubenswrapper[4939]: I0318 17:20:53.935080 4939 generic.go:334] "Generic (PLEG): container finished" podID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerID="1e84baab11c103148feb79ece39f4ea84cc4ed7a1bc894a7efd8d3a59265056c" exitCode=0 Mar 18 17:20:53 crc kubenswrapper[4939]: I0318 17:20:53.935194 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" event={"ID":"17abc20c-de7b-4ffd-a986-918dbb8cd4dd","Type":"ContainerDied","Data":"1e84baab11c103148feb79ece39f4ea84cc4ed7a1bc894a7efd8d3a59265056c"} Mar 18 17:20:55 crc kubenswrapper[4939]: I0318 17:20:55.066613 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-znt4w"] Mar 18 17:20:55 crc kubenswrapper[4939]: I0318 17:20:55.079804 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-znt4w"] Mar 18 17:20:56 crc kubenswrapper[4939]: I0318 17:20:56.144556 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1852a30-ade7-4a7a-a96c-a8098b875b30" 
path="/var/lib/kubelet/pods/c1852a30-ade7-4a7a-a96c-a8098b875b30/volumes" Mar 18 17:20:56 crc kubenswrapper[4939]: I0318 17:20:56.962054 4939 generic.go:334] "Generic (PLEG): container finished" podID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerID="0939bf0b8998ed3f430a3a63ea3bda93053487495c4bbec6cc28f6728680251e" exitCode=0 Mar 18 17:20:56 crc kubenswrapper[4939]: I0318 17:20:56.962098 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" event={"ID":"17abc20c-de7b-4ffd-a986-918dbb8cd4dd","Type":"ContainerDied","Data":"0939bf0b8998ed3f430a3a63ea3bda93053487495c4bbec6cc28f6728680251e"} Mar 18 17:20:57 crc kubenswrapper[4939]: I0318 17:20:57.943289 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zb4ns"] Mar 18 17:20:57 crc kubenswrapper[4939]: I0318 17:20:57.945830 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:57 crc kubenswrapper[4939]: I0318 17:20:57.960922 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zb4ns"] Mar 18 17:20:57 crc kubenswrapper[4939]: I0318 17:20:57.985530 4939 generic.go:334] "Generic (PLEG): container finished" podID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerID="8d233934fb81083088f63ea2675d5a443272614be39f6a63c39c970fad3cef9e" exitCode=0 Mar 18 17:20:57 crc kubenswrapper[4939]: I0318 17:20:57.985594 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" event={"ID":"17abc20c-de7b-4ffd-a986-918dbb8cd4dd","Type":"ContainerDied","Data":"8d233934fb81083088f63ea2675d5a443272614be39f6a63c39c970fad3cef9e"} Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.002778 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-utilities\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.002978 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-catalog-content\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.003029 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6tqq\" (UniqueName: \"kubernetes.io/projected/b85b2527-0783-44ee-9eae-f9b4601d1c9d-kube-api-access-t6tqq\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.104972 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6tqq\" (UniqueName: \"kubernetes.io/projected/b85b2527-0783-44ee-9eae-f9b4601d1c9d-kube-api-access-t6tqq\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.105191 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-utilities\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.105284 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-catalog-content\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.106183 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-utilities\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.106241 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-catalog-content\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.133545 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6tqq\" (UniqueName: \"kubernetes.io/projected/b85b2527-0783-44ee-9eae-f9b4601d1c9d-kube-api-access-t6tqq\") pod \"redhat-operators-zb4ns\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.134585 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.266914 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.751807 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zb4ns"] Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.996835 4939 generic.go:334] "Generic (PLEG): container finished" podID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerID="950a5b29fef364ab64f2c3af4a3676bae64aba8ef76ad1e2c293d6848c05fe42" exitCode=0 Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.996897 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zb4ns" event={"ID":"b85b2527-0783-44ee-9eae-f9b4601d1c9d","Type":"ContainerDied","Data":"950a5b29fef364ab64f2c3af4a3676bae64aba8ef76ad1e2c293d6848c05fe42"} Mar 18 17:20:58 crc kubenswrapper[4939]: I0318 17:20:58.996924 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zb4ns" event={"ID":"b85b2527-0783-44ee-9eae-f9b4601d1c9d","Type":"ContainerStarted","Data":"08452f235146c6116e2a920ae00b802965f34ed26aa77674710a5e4d4cdb69e0"} Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.002579 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"57a41c1bc14cd97b3da450bf17c52dcdc4709996b2a45bd26d504a3261acd9e3"} Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.447466 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.562853 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6l55\" (UniqueName: \"kubernetes.io/projected/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-kube-api-access-f6l55\") pod \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.562944 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-util\") pod \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.562992 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-bundle\") pod \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\" (UID: \"17abc20c-de7b-4ffd-a986-918dbb8cd4dd\") " Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.565884 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-bundle" (OuterVolumeSpecName: "bundle") pod "17abc20c-de7b-4ffd-a986-918dbb8cd4dd" (UID: "17abc20c-de7b-4ffd-a986-918dbb8cd4dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.569680 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-kube-api-access-f6l55" (OuterVolumeSpecName: "kube-api-access-f6l55") pod "17abc20c-de7b-4ffd-a986-918dbb8cd4dd" (UID: "17abc20c-de7b-4ffd-a986-918dbb8cd4dd"). 
InnerVolumeSpecName "kube-api-access-f6l55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.580722 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-util" (OuterVolumeSpecName: "util") pod "17abc20c-de7b-4ffd-a986-918dbb8cd4dd" (UID: "17abc20c-de7b-4ffd-a986-918dbb8cd4dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.665652 4939 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-util\") on node \"crc\" DevicePath \"\"" Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.666028 4939 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:20:59 crc kubenswrapper[4939]: I0318 17:20:59.666043 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6l55\" (UniqueName: \"kubernetes.io/projected/17abc20c-de7b-4ffd-a986-918dbb8cd4dd-kube-api-access-f6l55\") on node \"crc\" DevicePath \"\"" Mar 18 17:21:00 crc kubenswrapper[4939]: I0318 17:21:00.015556 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" event={"ID":"17abc20c-de7b-4ffd-a986-918dbb8cd4dd","Type":"ContainerDied","Data":"9fe154ae28ea7b6ff7e0db439649b191b8922cc5252197fff430e9619f1f9a6a"} Mar 18 17:21:00 crc kubenswrapper[4939]: I0318 17:21:00.015657 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fe154ae28ea7b6ff7e0db439649b191b8922cc5252197fff430e9619f1f9a6a" Mar 18 17:21:00 crc kubenswrapper[4939]: I0318 17:21:00.015595 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k" Mar 18 17:21:01 crc kubenswrapper[4939]: I0318 17:21:01.026351 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zb4ns" event={"ID":"b85b2527-0783-44ee-9eae-f9b4601d1c9d","Type":"ContainerStarted","Data":"a8ad4d48dfb625214f7c07a166efa918eabbdca6eb7cd0dc4e53edc9eb79cb04"} Mar 18 17:21:04 crc kubenswrapper[4939]: I0318 17:21:04.995160 4939 scope.go:117] "RemoveContainer" containerID="6d8dd1ef6381262cca50a5203e5c57d18962649a333bfc06807dcfe57c32c6e3" Mar 18 17:21:05 crc kubenswrapper[4939]: I0318 17:21:05.080737 4939 scope.go:117] "RemoveContainer" containerID="b22fa23240c312f5f2d23a5baf604492b3cab3b9ddfaa45e3045b84b3d8ae534" Mar 18 17:21:05 crc kubenswrapper[4939]: I0318 17:21:05.137744 4939 scope.go:117] "RemoveContainer" containerID="f4ed2db25ef819027a234186a46e03c0d2ce37164fa1e77a7d79be485ad5d4c5" Mar 18 17:21:05 crc kubenswrapper[4939]: I0318 17:21:05.219630 4939 scope.go:117] "RemoveContainer" containerID="b10d742b00b991fc7c5fda16c8488cd4feeaf285bd5bf5551895256cc4ce7394" Mar 18 17:21:05 crc kubenswrapper[4939]: I0318 17:21:05.314750 4939 scope.go:117] "RemoveContainer" containerID="7c986c7291a3c277d8a16a0e697f02a66767bf7db9e66e3429c6e3499438f4ad" Mar 18 17:21:05 crc kubenswrapper[4939]: I0318 17:21:05.344781 4939 scope.go:117] "RemoveContainer" containerID="8ba491de89683c345bb160686ba7f82be7d6682d7cbbde6b4a07d7ca73dfa37e" Mar 18 17:21:05 crc kubenswrapper[4939]: I0318 17:21:05.402808 4939 scope.go:117] "RemoveContainer" containerID="0551dd988eddc46237cad1fb9190841d60322a8305b60824485b944f3830b0d8" Mar 18 17:21:07 crc kubenswrapper[4939]: I0318 17:21:07.088148 4939 generic.go:334] "Generic (PLEG): container finished" podID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerID="a8ad4d48dfb625214f7c07a166efa918eabbdca6eb7cd0dc4e53edc9eb79cb04" exitCode=0 Mar 18 17:21:07 crc kubenswrapper[4939]: I0318 17:21:07.088223 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zb4ns" event={"ID":"b85b2527-0783-44ee-9eae-f9b4601d1c9d","Type":"ContainerDied","Data":"a8ad4d48dfb625214f7c07a166efa918eabbdca6eb7cd0dc4e53edc9eb79cb04"} Mar 18 17:21:08 crc kubenswrapper[4939]: I0318 17:21:08.099013 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zb4ns" event={"ID":"b85b2527-0783-44ee-9eae-f9b4601d1c9d","Type":"ContainerStarted","Data":"9c6550d30d9f5f4688d76686020cb0dc50ce7b31bfb030da35c26b2b94f8fcbd"} Mar 18 17:21:08 crc kubenswrapper[4939]: I0318 17:21:08.124124 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zb4ns" podStartSLOduration=2.628838224 podStartE2EDuration="11.12410355s" podCreationTimestamp="2026-03-18 17:20:57 +0000 UTC" firstStartedPulling="2026-03-18 17:20:58.998325449 +0000 UTC m=+6223.597513070" lastFinishedPulling="2026-03-18 17:21:07.493590775 +0000 UTC m=+6232.092778396" observedRunningTime="2026-03-18 17:21:08.116848294 +0000 UTC m=+6232.716035915" watchObservedRunningTime="2026-03-18 17:21:08.12410355 +0000 UTC m=+6232.723291171" Mar 18 17:21:08 crc kubenswrapper[4939]: I0318 17:21:08.267357 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:21:08 crc kubenswrapper[4939]: I0318 17:21:08.267424 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:21:09 crc kubenswrapper[4939]: I0318 17:21:09.364532 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zb4ns" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" probeResult="failure" output=< Mar 18 17:21:09 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:21:09 crc kubenswrapper[4939]: > Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.791071 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv"] Mar 18 17:21:10 crc kubenswrapper[4939]: E0318 17:21:10.791729 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerName="util" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.791743 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerName="util" Mar 18 17:21:10 crc kubenswrapper[4939]: E0318 17:21:10.791773 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerName="extract" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.791788 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerName="extract" Mar 18 17:21:10 crc kubenswrapper[4939]: E0318 17:21:10.791812 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerName="pull" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.791818 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerName="pull" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.792013 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="17abc20c-de7b-4ffd-a986-918dbb8cd4dd" containerName="extract" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.792656 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.799854 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-h9hqw" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.800175 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.799917 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.889579 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9n6\" (UniqueName: \"kubernetes.io/projected/b807b3bb-f1f2-43ad-8a15-359bff856ca7-kube-api-access-md9n6\") pod \"obo-prometheus-operator-8ff7d675-pxtrv\" (UID: \"b807b3bb-f1f2-43ad-8a15-359bff856ca7\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.898048 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv"] Mar 18 17:21:10 crc kubenswrapper[4939]: I0318 17:21:10.991485 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md9n6\" (UniqueName: \"kubernetes.io/projected/b807b3bb-f1f2-43ad-8a15-359bff856ca7-kube-api-access-md9n6\") pod \"obo-prometheus-operator-8ff7d675-pxtrv\" (UID: \"b807b3bb-f1f2-43ad-8a15-359bff856ca7\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.029789 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9n6\" (UniqueName: \"kubernetes.io/projected/b807b3bb-f1f2-43ad-8a15-359bff856ca7-kube-api-access-md9n6\") pod \"obo-prometheus-operator-8ff7d675-pxtrv\" (UID: \"b807b3bb-f1f2-43ad-8a15-359bff856ca7\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.085743 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk"] Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.087056 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.091719 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.094581 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-77t4q" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.107344 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz"] Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.109147 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.120483 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk"] Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.124044 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.162342 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz"] Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.194605 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa00393f-d482-4806-8ec6-e25a9f306888-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-zhbrz\" (UID: \"fa00393f-d482-4806-8ec6-e25a9f306888\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.194784 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ffde8c6-522e-491d-8f41-848c8a175529-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-dk6hk\" (UID: \"9ffde8c6-522e-491d-8f41-848c8a175529\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.194807 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ffde8c6-522e-491d-8f41-848c8a175529-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-dk6hk\" (UID: \"9ffde8c6-522e-491d-8f41-848c8a175529\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.194886 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa00393f-d482-4806-8ec6-e25a9f306888-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-zhbrz\" (UID: \"fa00393f-d482-4806-8ec6-e25a9f306888\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.296813 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ffde8c6-522e-491d-8f41-848c8a175529-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-dk6hk\" (UID: \"9ffde8c6-522e-491d-8f41-848c8a175529\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.297116 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ffde8c6-522e-491d-8f41-848c8a175529-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-dk6hk\" (UID: \"9ffde8c6-522e-491d-8f41-848c8a175529\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 
17:21:11.297190 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa00393f-d482-4806-8ec6-e25a9f306888-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-zhbrz\" (UID: \"fa00393f-d482-4806-8ec6-e25a9f306888\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.297275 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa00393f-d482-4806-8ec6-e25a9f306888-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-zhbrz\" (UID: \"fa00393f-d482-4806-8ec6-e25a9f306888\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.306941 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ffde8c6-522e-491d-8f41-848c8a175529-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-dk6hk\" (UID: \"9ffde8c6-522e-491d-8f41-848c8a175529\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.308754 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa00393f-d482-4806-8ec6-e25a9f306888-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-zhbrz\" (UID: \"fa00393f-d482-4806-8ec6-e25a9f306888\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.364653 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ffde8c6-522e-491d-8f41-848c8a175529-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-dk6hk\" (UID: \"9ffde8c6-522e-491d-8f41-848c8a175529\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.365870 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa00393f-d482-4806-8ec6-e25a9f306888-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-599799b8-zhbrz\" (UID: \"fa00393f-d482-4806-8ec6-e25a9f306888\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.406427 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.435617 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.771885 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-zc9lq"] Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.786900 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.798662 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.798851 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-xx6rh" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.803740 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-zc9lq"] Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.840412 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hmbm\" (UniqueName: \"kubernetes.io/projected/cdcc74fa-7b06-489c-bdfc-fe75965f4aa3-kube-api-access-8hmbm\") pod \"observability-operator-6dd7dd855f-zc9lq\" (UID: \"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3\") " pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.840494 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cdcc74fa-7b06-489c-bdfc-fe75965f4aa3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-zc9lq\" (UID: \"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3\") " pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.944170 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hmbm\" (UniqueName: \"kubernetes.io/projected/cdcc74fa-7b06-489c-bdfc-fe75965f4aa3-kube-api-access-8hmbm\") pod \"observability-operator-6dd7dd855f-zc9lq\" (UID: \"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3\") " pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.944843 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cdcc74fa-7b06-489c-bdfc-fe75965f4aa3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-zc9lq\" (UID: \"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3\") " pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.954551 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cdcc74fa-7b06-489c-bdfc-fe75965f4aa3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-zc9lq\" (UID: \"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3\") " pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.968410 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv"] Mar 18 17:21:11 crc kubenswrapper[4939]: I0318 17:21:11.978732 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hmbm\" (UniqueName: \"kubernetes.io/projected/cdcc74fa-7b06-489c-bdfc-fe75965f4aa3-kube-api-access-8hmbm\") pod \"observability-operator-6dd7dd855f-zc9lq\" (UID: \"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3\") " pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.126832 
4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.176197 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" event={"ID":"b807b3bb-f1f2-43ad-8a15-359bff856ca7","Type":"ContainerStarted","Data":"dde462aefd74df8a9578819252a5f85cf4a0b68e0decf1bdbd80562ac773eb8f"} Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.261610 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk"] Mar 18 17:21:12 crc kubenswrapper[4939]: W0318 17:21:12.309635 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ffde8c6_522e_491d_8f41_848c8a175529.slice/crio-b5329a46d034ac7136b37eb1ad534a83af4eb37901655661c28f447ac46fb27d WatchSource:0}: Error finding container b5329a46d034ac7136b37eb1ad534a83af4eb37901655661c28f447ac46fb27d: Status 404 returned error can't find the container with id b5329a46d034ac7136b37eb1ad534a83af4eb37901655661c28f447ac46fb27d Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.378260 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5d7759777-9ttbm"] Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.382175 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.393033 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-4xjzd" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.393263 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.410857 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5d7759777-9ttbm"] Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.496309 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpw8\" (UniqueName: \"kubernetes.io/projected/f2cfb2e8-f000-4606-8b03-9d82aebcc102-kube-api-access-8vpw8\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.496733 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cfb2e8-f000-4606-8b03-9d82aebcc102-webhook-cert\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.496824 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cfb2e8-f000-4606-8b03-9d82aebcc102-openshift-service-ca\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.496912 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cfb2e8-f000-4606-8b03-9d82aebcc102-apiservice-cert\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.598909 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cfb2e8-f000-4606-8b03-9d82aebcc102-webhook-cert\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.599007 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cfb2e8-f000-4606-8b03-9d82aebcc102-openshift-service-ca\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.599112 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cfb2e8-f000-4606-8b03-9d82aebcc102-apiservice-cert\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.599151 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpw8\" (UniqueName: \"kubernetes.io/projected/f2cfb2e8-f000-4606-8b03-9d82aebcc102-kube-api-access-8vpw8\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.600078 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cfb2e8-f000-4606-8b03-9d82aebcc102-openshift-service-ca\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.606358 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cfb2e8-f000-4606-8b03-9d82aebcc102-apiservice-cert\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.606443 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cfb2e8-f000-4606-8b03-9d82aebcc102-webhook-cert\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.623534 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vpw8\" (UniqueName: \"kubernetes.io/projected/f2cfb2e8-f000-4606-8b03-9d82aebcc102-kube-api-access-8vpw8\") pod \"perses-operator-5d7759777-9ttbm\" (UID: \"f2cfb2e8-f000-4606-8b03-9d82aebcc102\") " 
pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.707675 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz"] Mar 18 17:21:12 crc kubenswrapper[4939]: I0318 17:21:12.738638 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:13 crc kubenswrapper[4939]: W0318 17:21:13.018759 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcc74fa_7b06_489c_bdfc_fe75965f4aa3.slice/crio-1a5d6cc7146767a7b38085e89d8ac7c748b2c77da74be13013140f1467e1a940 WatchSource:0}: Error finding container 1a5d6cc7146767a7b38085e89d8ac7c748b2c77da74be13013140f1467e1a940: Status 404 returned error can't find the container with id 1a5d6cc7146767a7b38085e89d8ac7c748b2c77da74be13013140f1467e1a940 Mar 18 17:21:13 crc kubenswrapper[4939]: I0318 17:21:13.037862 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-zc9lq"] Mar 18 17:21:13 crc kubenswrapper[4939]: I0318 17:21:13.219767 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" event={"ID":"9ffde8c6-522e-491d-8f41-848c8a175529","Type":"ContainerStarted","Data":"b5329a46d034ac7136b37eb1ad534a83af4eb37901655661c28f447ac46fb27d"} Mar 18 17:21:13 crc kubenswrapper[4939]: I0318 17:21:13.273260 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" event={"ID":"fa00393f-d482-4806-8ec6-e25a9f306888","Type":"ContainerStarted","Data":"5fee315b903013f7fb05eb110a27a227faeac7d5dfd9cbc6df38ae522e6016a8"} Mar 18 17:21:13 crc kubenswrapper[4939]: I0318 17:21:13.302745 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" event={"ID":"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3","Type":"ContainerStarted","Data":"1a5d6cc7146767a7b38085e89d8ac7c748b2c77da74be13013140f1467e1a940"} Mar 18 17:21:13 crc kubenswrapper[4939]: I0318 17:21:13.426690 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5d7759777-9ttbm"] Mar 18 17:21:13 crc kubenswrapper[4939]: W0318 17:21:13.449140 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2cfb2e8_f000_4606_8b03_9d82aebcc102.slice/crio-393afba3851bb2fbfd3fcf8a84733d2622f31037ae187554c8acaf8ed0505bf5 WatchSource:0}: Error finding container 393afba3851bb2fbfd3fcf8a84733d2622f31037ae187554c8acaf8ed0505bf5: Status 404 returned error can't find the container with id 393afba3851bb2fbfd3fcf8a84733d2622f31037ae187554c8acaf8ed0505bf5 Mar 18 17:21:14 crc kubenswrapper[4939]: I0318 17:21:14.333555 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5d7759777-9ttbm" event={"ID":"f2cfb2e8-f000-4606-8b03-9d82aebcc102","Type":"ContainerStarted","Data":"393afba3851bb2fbfd3fcf8a84733d2622f31037ae187554c8acaf8ed0505bf5"} Mar 18 17:21:19 crc kubenswrapper[4939]: I0318 17:21:19.396219 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zb4ns" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" probeResult="failure" output=< Mar 18 
17:21:19 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:21:19 crc kubenswrapper[4939]: > Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.480977 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" event={"ID":"9ffde8c6-522e-491d-8f41-848c8a175529","Type":"ContainerStarted","Data":"7ea1278643f2767146ee66e7d5e3b0f41849d65850f02908840db55d8aedf554"} Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.487772 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" event={"ID":"fa00393f-d482-4806-8ec6-e25a9f306888","Type":"ContainerStarted","Data":"7cff746803d26c8fddd0d624f92119a627209e46b47c216bef6e9837db46572e"} Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.503692 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5d7759777-9ttbm" event={"ID":"f2cfb2e8-f000-4606-8b03-9d82aebcc102","Type":"ContainerStarted","Data":"6cce2b3bc606068bffcadcba15587c1cdfd0ae9ecb6ea5693bd2fdc0dd9480d1"} Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.504579 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.510315 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" event={"ID":"cdcc74fa-7b06-489c-bdfc-fe75965f4aa3","Type":"ContainerStarted","Data":"e4168298c58beb3097174abd59cd5d62f459023197cad98da61578d0d2984c5a"} Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.510773 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.512609 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" event={"ID":"b807b3bb-f1f2-43ad-8a15-359bff856ca7","Type":"ContainerStarted","Data":"5619fb7b5f86a7135ef69be2f0b16c3f39f6e23f9899876742663dcd6cbdf2a1"} Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.517219 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-dk6hk" podStartSLOduration=2.445012224 podStartE2EDuration="12.517201889s" podCreationTimestamp="2026-03-18 17:21:11 +0000 UTC" firstStartedPulling="2026-03-18 17:21:12.350449944 +0000 UTC m=+6236.949637565" lastFinishedPulling="2026-03-18 17:21:22.422639589 +0000 UTC m=+6247.021827230" observedRunningTime="2026-03-18 17:21:23.510305864 +0000 UTC m=+6248.109493485" watchObservedRunningTime="2026-03-18 17:21:23.517201889 +0000 UTC m=+6248.116389510" Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.574075 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" podStartSLOduration=3.026357972 podStartE2EDuration="12.574052294s" podCreationTimestamp="2026-03-18 17:21:11 +0000 UTC" firstStartedPulling="2026-03-18 17:21:13.032761788 +0000 UTC m=+6237.631949409" lastFinishedPulling="2026-03-18 17:21:22.58045611 +0000 UTC m=+6247.179643731" observedRunningTime="2026-03-18 17:21:23.554881269 +0000 UTC m=+6248.154068880" watchObservedRunningTime="2026-03-18 17:21:23.574052294 +0000 UTC m=+6248.173239915" Mar 18 17:21:23 
crc kubenswrapper[4939]: I0318 17:21:23.579357 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.610265 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5d7759777-9ttbm" podStartSLOduration=2.5389184460000003 podStartE2EDuration="11.610244231s" podCreationTimestamp="2026-03-18 17:21:12 +0000 UTC" firstStartedPulling="2026-03-18 17:21:13.451064936 +0000 UTC m=+6238.050252557" lastFinishedPulling="2026-03-18 17:21:22.522390721 +0000 UTC m=+6247.121578342" observedRunningTime="2026-03-18 17:21:23.604122898 +0000 UTC m=+6248.203310519" watchObservedRunningTime="2026-03-18 17:21:23.610244231 +0000 UTC m=+6248.209431852" Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.664044 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-599799b8-zhbrz" podStartSLOduration=2.974719516 podStartE2EDuration="12.664024729s" podCreationTimestamp="2026-03-18 17:21:11 +0000 UTC" firstStartedPulling="2026-03-18 17:21:12.73313856 +0000 UTC m=+6237.332326181" lastFinishedPulling="2026-03-18 17:21:22.422443773 +0000 UTC m=+6247.021631394" observedRunningTime="2026-03-18 17:21:23.642102156 +0000 UTC m=+6248.241289787" watchObservedRunningTime="2026-03-18 17:21:23.664024729 +0000 UTC m=+6248.263212350" Mar 18 17:21:23 crc kubenswrapper[4939]: I0318 17:21:23.740127 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-pxtrv" podStartSLOduration=3.355720979 podStartE2EDuration="13.740105389s" podCreationTimestamp="2026-03-18 17:21:10 +0000 UTC" firstStartedPulling="2026-03-18 17:21:12.05557441 +0000 UTC m=+6236.654762041" lastFinishedPulling="2026-03-18 17:21:22.43995883 +0000 UTC m=+6247.039146451" observedRunningTime="2026-03-18 17:21:23.720471692 +0000 UTC m=+6248.319659313" watchObservedRunningTime="2026-03-18 17:21:23.740105389 +0000 UTC m=+6248.339293010" Mar 18 17:21:29 crc kubenswrapper[4939]: I0318 17:21:29.375248 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zb4ns" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" probeResult="failure" output=< Mar 18 17:21:29 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:21:29 crc kubenswrapper[4939]: > Mar 18 17:21:32 crc kubenswrapper[4939]: I0318 17:21:32.761820 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5d7759777-9ttbm" Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.174176 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.174912 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="53b9939b-2d68-4c16-9e00-bed4c66cd226" containerName="openstackclient" containerID="cri-o://75dafdb295f06a9a5f3e40734c13a7b852f093af6935ef326d534be4eb5dac68" gracePeriod=2 Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.188132 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.232235 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] 
Mar 18 17:21:38 crc kubenswrapper[4939]: E0318 17:21:38.232849 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b9939b-2d68-4c16-9e00-bed4c66cd226" containerName="openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.232874 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b9939b-2d68-4c16-9e00-bed4c66cd226" containerName="openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.233148 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b9939b-2d68-4c16-9e00-bed4c66cd226" containerName="openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.234086 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.254235 4939 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd780f4f-4e27-490a-9b63-fd0861679b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T17:21:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T17:21:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T17:21:38Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T17:21:38Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4h87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T17:21:38Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.261468 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.280489 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 18 17:21:38 crc kubenswrapper[4939]: E0318 17:21:38.281641 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-x4h87 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-x4h87 openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="fd780f4f-4e27-490a-9b63-fd0861679b16"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.318963 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.361281 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.371291 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config-secret\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.371374 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.371450 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4h87\" (UniqueName: \"kubernetes.io/projected/fd780f4f-4e27-490a-9b63-fd0861679b16-kube-api-access-x4h87\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.381011 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.383560 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.391793 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="53b9939b-2d68-4c16-9e00-bed4c66cd226" podUID="da1de069-c69e-4219-b479-3f2f6cdbee7b"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.402878 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fd780f4f-4e27-490a-9b63-fd0861679b16" podUID="da1de069-c69e-4219-b479-3f2f6cdbee7b"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.428582 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.430324 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.434161 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lh7gq"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.463572 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.481050 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config-secret\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.481409 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwzp\" (UniqueName: \"kubernetes.io/projected/da1de069-c69e-4219-b479-3f2f6cdbee7b-kube-api-access-dfwzp\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.481486 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1de069-c69e-4219-b479-3f2f6cdbee7b-openstack-config\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.481538 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.481646 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4h87\" (UniqueName: \"kubernetes.io/projected/fd780f4f-4e27-490a-9b63-fd0861679b16-kube-api-access-x4h87\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.481713 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1de069-c69e-4219-b479-3f2f6cdbee7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.485120 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.497688 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config-secret\") pod \"openstackclient\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: E0318 17:21:38.510985 4939 projected.go:194] Error preparing data for projected volume kube-api-access-x4h87 for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (fd780f4f-4e27-490a-9b63-fd0861679b16) does not match the UID in record. The object might have been deleted and then recreated
Mar 18 17:21:38 crc kubenswrapper[4939]: E0318 17:21:38.511055 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd780f4f-4e27-490a-9b63-fd0861679b16-kube-api-access-x4h87 podName:fd780f4f-4e27-490a-9b63-fd0861679b16 nodeName:}" failed. No retries permitted until 2026-03-18 17:21:39.011033913 +0000 UTC m=+6263.610221534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x4h87" (UniqueName: "kubernetes.io/projected/fd780f4f-4e27-490a-9b63-fd0861679b16-kube-api-access-x4h87") pod "openstackclient" (UID: "fd780f4f-4e27-490a-9b63-fd0861679b16") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (fd780f4f-4e27-490a-9b63-fd0861679b16) does not match the UID in record. The object might have been deleted and then recreated
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.586009 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jkb\" (UniqueName: \"kubernetes.io/projected/37664671-b80f-472e-bd20-b23186d6e808-kube-api-access-j9jkb\") pod \"kube-state-metrics-0\" (UID: \"37664671-b80f-472e-bd20-b23186d6e808\") " pod="openstack/kube-state-metrics-0"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.586086 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1de069-c69e-4219-b479-3f2f6cdbee7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.586196 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwzp\" (UniqueName: \"kubernetes.io/projected/da1de069-c69e-4219-b479-3f2f6cdbee7b-kube-api-access-dfwzp\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.586235 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1de069-c69e-4219-b479-3f2f6cdbee7b-openstack-config\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.587124 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1de069-c69e-4219-b479-3f2f6cdbee7b-openstack-config\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.594817 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1de069-c69e-4219-b479-3f2f6cdbee7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.628283 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwzp\" (UniqueName: \"kubernetes.io/projected/da1de069-c69e-4219-b479-3f2f6cdbee7b-kube-api-access-dfwzp\") pod \"openstackclient\" (UID: \"da1de069-c69e-4219-b479-3f2f6cdbee7b\") " pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.665041 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.678928 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fd780f4f-4e27-490a-9b63-fd0861679b16" podUID="da1de069-c69e-4219-b479-3f2f6cdbee7b"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.690387 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jkb\" (UniqueName: \"kubernetes.io/projected/37664671-b80f-472e-bd20-b23186d6e808-kube-api-access-j9jkb\") pod \"kube-state-metrics-0\" (UID: \"37664671-b80f-472e-bd20-b23186d6e808\") " pod="openstack/kube-state-metrics-0"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.716272 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.729704 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fd780f4f-4e27-490a-9b63-fd0861679b16" podUID="da1de069-c69e-4219-b479-3f2f6cdbee7b"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.733208 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jkb\" (UniqueName: \"kubernetes.io/projected/37664671-b80f-472e-bd20-b23186d6e808-kube-api-access-j9jkb\") pod \"kube-state-metrics-0\" (UID: \"37664671-b80f-472e-bd20-b23186d6e808\") " pod="openstack/kube-state-metrics-0"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.754246 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.793433 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.794292 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config-secret\") pod \"fd780f4f-4e27-490a-9b63-fd0861679b16\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") "
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.794375 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config\") pod \"fd780f4f-4e27-490a-9b63-fd0861679b16\" (UID: \"fd780f4f-4e27-490a-9b63-fd0861679b16\") "
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.794987 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4h87\" (UniqueName: \"kubernetes.io/projected/fd780f4f-4e27-490a-9b63-fd0861679b16-kube-api-access-x4h87\") on node \"crc\" DevicePath \"\""
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.795390 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fd780f4f-4e27-490a-9b63-fd0861679b16" (UID: "fd780f4f-4e27-490a-9b63-fd0861679b16"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.807629 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fd780f4f-4e27-490a-9b63-fd0861679b16" (UID: "fd780f4f-4e27-490a-9b63-fd0861679b16"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.898534 4939 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 18 17:21:38 crc kubenswrapper[4939]: I0318 17:21:38.898799 4939 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd780f4f-4e27-490a-9b63-fd0861679b16-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.304110 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.333070 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.339254 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-52nch"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.339476 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.339678 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.339852 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.339991 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.357006 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zb4ns" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" probeResult="failure" output=<
Mar 18 17:21:39 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s
Mar 18 17:21:39 crc kubenswrapper[4939]: >
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.398996 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.412916 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.413007 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a8852984-80af-496e-b8c0-95374efe6bff-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.413047 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg96k\" (UniqueName: \"kubernetes.io/projected/a8852984-80af-496e-b8c0-95374efe6bff-kube-api-access-bg96k\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.413068 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8852984-80af-496e-b8c0-95374efe6bff-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.413092 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.413107 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8852984-80af-496e-b8c0-95374efe6bff-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.413123 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.516185 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg96k\" (UniqueName: \"kubernetes.io/projected/a8852984-80af-496e-b8c0-95374efe6bff-kube-api-access-bg96k\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.516241 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8852984-80af-496e-b8c0-95374efe6bff-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.516279 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.516301 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8852984-80af-496e-b8c0-95374efe6bff-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.516324 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.516465 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.516587 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a8852984-80af-496e-b8c0-95374efe6bff-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.517227 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a8852984-80af-496e-b8c0-95374efe6bff-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.535477 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8852984-80af-496e-b8c0-95374efe6bff-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.536131 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.537042 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.542039 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8852984-80af-496e-b8c0-95374efe6bff-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.558399 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8852984-80af-496e-b8c0-95374efe6bff-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.593732 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg96k\" (UniqueName: \"kubernetes.io/projected/a8852984-80af-496e-b8c0-95374efe6bff-kube-api-access-bg96k\") pod \"alertmanager-metric-storage-0\" (UID: \"a8852984-80af-496e-b8c0-95374efe6bff\") " pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.674620 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.695956 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fd780f4f-4e27-490a-9b63-fd0861679b16" podUID="da1de069-c69e-4219-b479-3f2f6cdbee7b"
Mar 18 17:21:39 crc kubenswrapper[4939]: I0318 17:21:39.701315 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.147931 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd780f4f-4e27-490a-9b63-fd0861679b16" path="/var/lib/kubelet/pods/fd780f4f-4e27-490a-9b63-fd0861679b16/volumes"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.315723 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.322151 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.362985 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.367789 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.367922 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.363166 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.363226 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-vjdmm"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.363415 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.363489 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.387766 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.436329 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.453420 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.473617 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.474619 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.474744 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0993e3cb-7df8-4774-a056-425d5d6a5f35-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.474834 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.474904 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0993e3cb-7df8-4774-a056-425d5d6a5f35-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.475007 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.475102 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.475220 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-config\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.475299 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.475364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.475440 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjc4n\" (UniqueName: \"kubernetes.io/projected/0993e3cb-7df8-4774-a056-425d5d6a5f35-kube-api-access-zjc4n\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.527047 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.576857 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.576919 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0993e3cb-7df8-4774-a056-425d5d6a5f35-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.576943 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.576962 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0993e3cb-7df8-4774-a056-425d5d6a5f35-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.577006 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.577034 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.577061 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-config\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.577111 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.577134 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.577156 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjc4n\" (UniqueName: \"kubernetes.io/projected/0993e3cb-7df8-4774-a056-425d5d6a5f35-kube-api-access-zjc4n\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.578176 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.579299 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.579959 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0993e3cb-7df8-4774-a056-425d5d6a5f35-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.588395 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0993e3cb-7df8-4774-a056-425d5d6a5f35-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.589265 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.595159 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.595492 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0993e3cb-7df8-4774-a056-425d5d6a5f35-config\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.597101 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0993e3cb-7df8-4774-a056-425d5d6a5f35-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.597607 4939 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.597689 4939 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bde518d469ddd806fdd0952f16a590ce2cb64d1096de43f5e3059d1be2b4db96/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.614386 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjc4n\" (UniqueName: \"kubernetes.io/projected/0993e3cb-7df8-4774-a056-425d5d6a5f35-kube-api-access-zjc4n\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.695906 4939 generic.go:334] "Generic (PLEG): container finished" podID="53b9939b-2d68-4c16-9e00-bed4c66cd226" containerID="75dafdb295f06a9a5f3e40734c13a7b852f093af6935ef326d534be4eb5dac68" exitCode=137
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.697925 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"37664671-b80f-472e-bd20-b23186d6e808","Type":"ContainerStarted","Data":"f0423d4f36697c874a97ef9045f38c5212c8e0ae442a3f089700cd9f31b0005d"}
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.699147 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da1de069-c69e-4219-b479-3f2f6cdbee7b","Type":"ContainerStarted","Data":"1b78c689bd3939c416111601caf619d05842bd434213e0d99a25346ff4b12142"}
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.700659 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8852984-80af-496e-b8c0-95374efe6bff","Type":"ContainerStarted","Data":"b931905bdc83f7e89fec0372381c0607169299c347b1e6f7afe64f4f36d58bdf"}
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.750656 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1f36b94-13fb-4f9c-8a7b-36160fda3ada\") pod \"prometheus-metric-storage-0\" (UID: \"0993e3cb-7df8-4774-a056-425d5d6a5f35\") " pod="openstack/prometheus-metric-storage-0"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.774877 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.893042 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config-secret\") pod \"53b9939b-2d68-4c16-9e00-bed4c66cd226\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") "
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.893106 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr4dt\" (UniqueName: \"kubernetes.io/projected/53b9939b-2d68-4c16-9e00-bed4c66cd226-kube-api-access-fr4dt\") pod \"53b9939b-2d68-4c16-9e00-bed4c66cd226\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") "
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.893196 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config\") pod \"53b9939b-2d68-4c16-9e00-bed4c66cd226\" (UID: \"53b9939b-2d68-4c16-9e00-bed4c66cd226\") "
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.913015 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b9939b-2d68-4c16-9e00-bed4c66cd226-kube-api-access-fr4dt" (OuterVolumeSpecName: "kube-api-access-fr4dt") pod "53b9939b-2d68-4c16-9e00-bed4c66cd226" (UID: "53b9939b-2d68-4c16-9e00-bed4c66cd226"). InnerVolumeSpecName "kube-api-access-fr4dt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.942899 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "53b9939b-2d68-4c16-9e00-bed4c66cd226" (UID: "53b9939b-2d68-4c16-9e00-bed4c66cd226"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.973230 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "53b9939b-2d68-4c16-9e00-bed4c66cd226" (UID: "53b9939b-2d68-4c16-9e00-bed4c66cd226"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.986990 4939 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.995103 4939 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.995142 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr4dt\" (UniqueName: \"kubernetes.io/projected/53b9939b-2d68-4c16-9e00-bed4c66cd226-kube-api-access-fr4dt\") on node \"crc\" DevicePath \"\"" Mar 18 17:21:40 crc kubenswrapper[4939]: I0318 17:21:40.995158 4939 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/53b9939b-2d68-4c16-9e00-bed4c66cd226-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.630819 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.716876 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0993e3cb-7df8-4774-a056-425d5d6a5f35","Type":"ContainerStarted","Data":"cf5d032a2f843dd5c877504d5dd8a0e79910261cf4e015d164c88f39d803980d"} Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.718774 4939 scope.go:117] "RemoveContainer" containerID="75dafdb295f06a9a5f3e40734c13a7b852f093af6935ef326d534be4eb5dac68" Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.718817 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.720163 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"37664671-b80f-472e-bd20-b23186d6e808","Type":"ContainerStarted","Data":"f587266e65baad42912e7d0e29f15ed83334b932a2dd50211e155330cb4e4619"} Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.720217 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.725752 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da1de069-c69e-4219-b479-3f2f6cdbee7b","Type":"ContainerStarted","Data":"cb9e3e6fb31e4cb77888157bf81377b4565f4be7ee85c2bbf70ebcd20a6fbc10"} Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.751472 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.084282723 podStartE2EDuration="3.751449139s" podCreationTimestamp="2026-03-18 17:21:38 +0000 UTC" firstStartedPulling="2026-03-18 17:21:40.399678194 +0000 UTC m=+6264.998865815" lastFinishedPulling="2026-03-18 17:21:41.06684461 +0000 UTC m=+6265.666032231" observedRunningTime="2026-03-18 17:21:41.748123785 +0000 UTC m=+6266.347311416" watchObservedRunningTime="2026-03-18 17:21:41.751449139 +0000 UTC m=+6266.350636760" Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.776304 4939 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="53b9939b-2d68-4c16-9e00-bed4c66cd226" podUID="da1de069-c69e-4219-b479-3f2f6cdbee7b" Mar 18 17:21:41 crc kubenswrapper[4939]: I0318 17:21:41.782751 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstackclient" podStartSLOduration=3.782725737 podStartE2EDuration="3.782725737s" podCreationTimestamp="2026-03-18 17:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:21:41.772863597 +0000 UTC m=+6266.372051218" watchObservedRunningTime="2026-03-18 17:21:41.782725737 +0000 UTC m=+6266.381913358" Mar 18 17:21:42 crc kubenswrapper[4939]: I0318 17:21:42.162905 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53b9939b-2d68-4c16-9e00-bed4c66cd226" path="/var/lib/kubelet/pods/53b9939b-2d68-4c16-9e00-bed4c66cd226/volumes" Mar 18 17:21:47 crc kubenswrapper[4939]: I0318 17:21:47.783148 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0993e3cb-7df8-4774-a056-425d5d6a5f35","Type":"ContainerStarted","Data":"8870570e90f92510de8892585bc86fef524f0cf31abb3bd659b57135c6e52836"} Mar 18 17:21:48 crc kubenswrapper[4939]: I0318 17:21:48.792798 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8852984-80af-496e-b8c0-95374efe6bff","Type":"ContainerStarted","Data":"e50ccc482739c8deacf0fcfd38eada81da79c0162d5ade3d4926eaf06e99c569"} Mar 18 17:21:48 crc kubenswrapper[4939]: I0318 17:21:48.798195 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 17:21:49 crc kubenswrapper[4939]: I0318 17:21:49.317677 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zb4ns" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" probeResult="failure" output=< Mar 18 17:21:49 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:21:49 crc kubenswrapper[4939]: > Mar 18 17:21:51 crc kubenswrapper[4939]: I0318 17:21:51.044933 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wgltd"] Mar 18 17:21:51 crc kubenswrapper[4939]: I0318 17:21:51.065175 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wgltd"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.047276 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ab44-account-create-update-p6cph"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.060307 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ab44-account-create-update-p6cph"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.075004 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9glh5"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.083373 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6a0e-account-create-update-v88nc"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.093582 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-2gnt7"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.102158 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1773-account-create-update-7b2m6"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.112775 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9glh5"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.121002 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-6a0e-account-create-update-v88nc"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.129647 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-2gnt7"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.157053 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ab6999-cfdd-4db2-8c1b-73f3e29d38eb" path="/var/lib/kubelet/pods/11ab6999-cfdd-4db2-8c1b-73f3e29d38eb/volumes" Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.157952 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a95dc7-b051-4cb1-986f-7f85876e0168" path="/var/lib/kubelet/pods/38a95dc7-b051-4cb1-986f-7f85876e0168/volumes" Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.159477 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91381de6-d500-4d85-b3de-4665b1b2e023" path="/var/lib/kubelet/pods/91381de6-d500-4d85-b3de-4665b1b2e023/volumes" Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.160963 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ef70d1-94b3-4941-adda-70d5d62b25a5" path="/var/lib/kubelet/pods/97ef70d1-94b3-4941-adda-70d5d62b25a5/volumes" Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.162635 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd432bf0-b938-41fe-93d2-58613baa56aa" path="/var/lib/kubelet/pods/dd432bf0-b938-41fe-93d2-58613baa56aa/volumes" Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.163394 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1773-account-create-update-7b2m6"] Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.834029 4939 generic.go:334] "Generic (PLEG): container finished" podID="0993e3cb-7df8-4774-a056-425d5d6a5f35" containerID="8870570e90f92510de8892585bc86fef524f0cf31abb3bd659b57135c6e52836" exitCode=0 Mar 18 17:21:52 crc kubenswrapper[4939]: I0318 17:21:52.834071 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0993e3cb-7df8-4774-a056-425d5d6a5f35","Type":"ContainerDied","Data":"8870570e90f92510de8892585bc86fef524f0cf31abb3bd659b57135c6e52836"} Mar 18 17:21:53 crc kubenswrapper[4939]: I0318 17:21:53.844487 4939 generic.go:334] "Generic (PLEG): container finished" podID="a8852984-80af-496e-b8c0-95374efe6bff" containerID="e50ccc482739c8deacf0fcfd38eada81da79c0162d5ade3d4926eaf06e99c569" exitCode=0 Mar 18 17:21:53 crc kubenswrapper[4939]: I0318 17:21:53.844541 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8852984-80af-496e-b8c0-95374efe6bff","Type":"ContainerDied","Data":"e50ccc482739c8deacf0fcfd38eada81da79c0162d5ade3d4926eaf06e99c569"} Mar 18 17:21:54 crc kubenswrapper[4939]: I0318 17:21:54.147257 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d3b859-107b-4311-bb6a-74236113699d" path="/var/lib/kubelet/pods/b8d3b859-107b-4311-bb6a-74236113699d/volumes" Mar 18 17:21:56 crc kubenswrapper[4939]: I0318 17:21:56.872969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8852984-80af-496e-b8c0-95374efe6bff","Type":"ContainerStarted","Data":"bbb8248b3a49a777fc68906e61144ac40bd65c7eaea86da0c7fe75871f5be1bd"} Mar 18 17:21:58 crc kubenswrapper[4939]: I0318 17:21:58.322779 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 
17:21:58 crc kubenswrapper[4939]: I0318 17:21:58.389838 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:21:58 crc kubenswrapper[4939]: I0318 17:21:58.564490 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zb4ns"] Mar 18 17:21:59 crc kubenswrapper[4939]: I0318 17:21:59.896877 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zb4ns" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" containerID="cri-o://9c6550d30d9f5f4688d76686020cb0dc50ce7b31bfb030da35c26b2b94f8fcbd" gracePeriod=2 Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.245170 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564242-s5t2d"] Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.246761 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.259763 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.260058 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.260070 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.263918 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564242-s5t2d"] Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.409626 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbtrx\" (UniqueName: \"kubernetes.io/projected/5368c541-c217-48f1-9c31-bf094896cad0-kube-api-access-kbtrx\") pod \"auto-csr-approver-29564242-s5t2d\" (UID: \"5368c541-c217-48f1-9c31-bf094896cad0\") " pod="openshift-infra/auto-csr-approver-29564242-s5t2d" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.512394 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbtrx\" (UniqueName: \"kubernetes.io/projected/5368c541-c217-48f1-9c31-bf094896cad0-kube-api-access-kbtrx\") pod \"auto-csr-approver-29564242-s5t2d\" (UID: \"5368c541-c217-48f1-9c31-bf094896cad0\") " pod="openshift-infra/auto-csr-approver-29564242-s5t2d" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.532351 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbtrx\" (UniqueName: \"kubernetes.io/projected/5368c541-c217-48f1-9c31-bf094896cad0-kube-api-access-kbtrx\") pod \"auto-csr-approver-29564242-s5t2d\" (UID: \"5368c541-c217-48f1-9c31-bf094896cad0\") " pod="openshift-infra/auto-csr-approver-29564242-s5t2d" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.618275 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.908472 4939 generic.go:334] "Generic (PLEG): container finished" podID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerID="9c6550d30d9f5f4688d76686020cb0dc50ce7b31bfb030da35c26b2b94f8fcbd" exitCode=0 Mar 18 17:22:00 crc kubenswrapper[4939]: I0318 17:22:00.908544 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zb4ns" event={"ID":"b85b2527-0783-44ee-9eae-f9b4601d1c9d","Type":"ContainerDied","Data":"9c6550d30d9f5f4688d76686020cb0dc50ce7b31bfb030da35c26b2b94f8fcbd"} Mar 18 17:22:01 crc kubenswrapper[4939]: I0318 17:22:01.937983 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:22:01 crc kubenswrapper[4939]: I0318 17:22:01.963552 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0993e3cb-7df8-4774-a056-425d5d6a5f35","Type":"ContainerStarted","Data":"95cd0c91ee62aaefc917265ebce1cd0994bfcd7fa7ba4e10a5f7e82401908b89"} Mar 18 17:22:01 crc kubenswrapper[4939]: I0318 17:22:01.989124 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zb4ns" event={"ID":"b85b2527-0783-44ee-9eae-f9b4601d1c9d","Type":"ContainerDied","Data":"08452f235146c6116e2a920ae00b802965f34ed26aa77674710a5e4d4cdb69e0"} Mar 18 17:22:01 crc kubenswrapper[4939]: I0318 17:22:01.989176 4939 scope.go:117] "RemoveContainer" containerID="9c6550d30d9f5f4688d76686020cb0dc50ce7b31bfb030da35c26b2b94f8fcbd" Mar 18 17:22:01 crc kubenswrapper[4939]: I0318 17:22:01.989142 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zb4ns" Mar 18 17:22:01 crc kubenswrapper[4939]: I0318 17:22:01.999685 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8852984-80af-496e-b8c0-95374efe6bff","Type":"ContainerStarted","Data":"a2ed68316145c54281d220eabac0a8523127557e1bbaac84ffbe2b743dd80321"} Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.000570 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.005579 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.028533 4939 scope.go:117] "RemoveContainer" containerID="a8ad4d48dfb625214f7c07a166efa918eabbdca6eb7cd0dc4e53edc9eb79cb04" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.048079 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.125284522 podStartE2EDuration="23.048027343s" podCreationTimestamp="2026-03-18 17:21:39 +0000 UTC" firstStartedPulling="2026-03-18 17:21:40.470449844 +0000 UTC m=+6265.069637465" lastFinishedPulling="2026-03-18 17:21:56.393192665 +0000 UTC m=+6280.992380286" observedRunningTime="2026-03-18 17:22:02.040051707 +0000 UTC m=+6286.639239328" watchObservedRunningTime="2026-03-18 17:22:02.048027343 +0000 UTC m=+6286.647214964" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.049333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-catalog-content\") pod \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.049687 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-utilities\") pod \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.049919 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6tqq\" (UniqueName: \"kubernetes.io/projected/b85b2527-0783-44ee-9eae-f9b4601d1c9d-kube-api-access-t6tqq\") pod \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\" (UID: \"b85b2527-0783-44ee-9eae-f9b4601d1c9d\") " Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.050497 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-utilities" (OuterVolumeSpecName: "utilities") pod "b85b2527-0783-44ee-9eae-f9b4601d1c9d" (UID: "b85b2527-0783-44ee-9eae-f9b4601d1c9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.050651 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.061005 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85b2527-0783-44ee-9eae-f9b4601d1c9d-kube-api-access-t6tqq" (OuterVolumeSpecName: "kube-api-access-t6tqq") pod "b85b2527-0783-44ee-9eae-f9b4601d1c9d" (UID: "b85b2527-0783-44ee-9eae-f9b4601d1c9d"). InnerVolumeSpecName "kube-api-access-t6tqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.063473 4939 scope.go:117] "RemoveContainer" containerID="950a5b29fef364ab64f2c3af4a3676bae64aba8ef76ad1e2c293d6848c05fe42" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.067463 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564242-s5t2d"] Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.152883 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6tqq\" (UniqueName: \"kubernetes.io/projected/b85b2527-0783-44ee-9eae-f9b4601d1c9d-kube-api-access-t6tqq\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.226009 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b85b2527-0783-44ee-9eae-f9b4601d1c9d" (UID: "b85b2527-0783-44ee-9eae-f9b4601d1c9d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.255335 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2527-0783-44ee-9eae-f9b4601d1c9d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.322636 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zb4ns"] Mar 18 17:22:02 crc kubenswrapper[4939]: I0318 17:22:02.332280 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zb4ns"] Mar 18 17:22:03 crc kubenswrapper[4939]: I0318 17:22:03.032001 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" event={"ID":"5368c541-c217-48f1-9c31-bf094896cad0","Type":"ContainerStarted","Data":"bcc91b7522e2de9f02068a9d1f8d5cd301004b620e96ef3393c2816a9137df9c"} Mar 18 17:22:03 crc kubenswrapper[4939]: I0318 17:22:03.057698 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xcwnv"] Mar 18 17:22:03 crc kubenswrapper[4939]: I0318 17:22:03.068320 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xcwnv"] Mar 18 17:22:04 crc kubenswrapper[4939]: I0318 17:22:04.046634 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" event={"ID":"5368c541-c217-48f1-9c31-bf094896cad0","Type":"ContainerStarted","Data":"3226fdbc8d348f29f0abe28cf3d71a1f2fdadea5fc4c479171a79f8f617f2a3e"} Mar 18 17:22:04 crc kubenswrapper[4939]: I0318 17:22:04.067003 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" podStartSLOduration=3.208203679 podStartE2EDuration="4.066986074s" podCreationTimestamp="2026-03-18 17:22:00 +0000 UTC" firstStartedPulling="2026-03-18 17:22:02.082133482 +0000 UTC m=+6286.681321103" lastFinishedPulling="2026-03-18 17:22:02.940915877 +0000 UTC m=+6287.540103498" observedRunningTime="2026-03-18 17:22:04.059247504 +0000 UTC m=+6288.658435125" watchObservedRunningTime="2026-03-18 17:22:04.066986074 +0000 UTC m=+6288.666173695" Mar 18 17:22:04 crc kubenswrapper[4939]: I0318 17:22:04.145860 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d8eac24-43dc-491c-9aed-367fef1ff6fb" path="/var/lib/kubelet/pods/3d8eac24-43dc-491c-9aed-367fef1ff6fb/volumes" Mar 18 17:22:04 crc kubenswrapper[4939]: I0318 17:22:04.146543 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" path="/var/lib/kubelet/pods/b85b2527-0783-44ee-9eae-f9b4601d1c9d/volumes" Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.057612 4939 generic.go:334] "Generic (PLEG): container finished" podID="5368c541-c217-48f1-9c31-bf094896cad0" containerID="3226fdbc8d348f29f0abe28cf3d71a1f2fdadea5fc4c479171a79f8f617f2a3e" exitCode=0 Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.057659 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" event={"ID":"5368c541-c217-48f1-9c31-bf094896cad0","Type":"ContainerDied","Data":"3226fdbc8d348f29f0abe28cf3d71a1f2fdadea5fc4c479171a79f8f617f2a3e"} Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.689762 4939 scope.go:117] "RemoveContainer" containerID="294e5fd4db2158939becc2e22eefd5648a11a2c5e3ffbf936c2c399728fe1742" Mar 
18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.717995 4939 scope.go:117] "RemoveContainer" containerID="f0180b7f6dc28fab6e07761df562d3971873b3d9f705f062ae3c63114e47d6b8" Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.794651 4939 scope.go:117] "RemoveContainer" containerID="df8c8c9e8368314f778878c496e6a0e3fe614565bab2ab15edad28da6bc457f2" Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.828262 4939 scope.go:117] "RemoveContainer" containerID="6d3abacc1c70322bfc4090aac1971eb71c8c5ed082d3c8fd1d66a345225707f4" Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.868331 4939 scope.go:117] "RemoveContainer" containerID="3461ac15b77094dc51b83b42b0177e246bfa10279c9f5b9fa849d77057122716" Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.911977 4939 scope.go:117] "RemoveContainer" containerID="4d988e6cc5115d69e8c98ce8898c4469c663c841f82cf558fbe09b53ccdff6d3" Mar 18 17:22:05 crc kubenswrapper[4939]: I0318 17:22:05.954033 4939 scope.go:117] "RemoveContainer" containerID="79251cf53ec5a26cb34df02d6aaeda87a39ae4ba6a8e28b9654b2674a34bda64" Mar 18 17:22:06 crc kubenswrapper[4939]: I0318 17:22:06.076175 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0993e3cb-7df8-4774-a056-425d5d6a5f35","Type":"ContainerStarted","Data":"7471399067b6aa05f0e5b6d6ed2f81429a88c731ec824f0056a44f798f9bcc94"} Mar 18 17:22:06 crc kubenswrapper[4939]: I0318 17:22:06.385969 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" Mar 18 17:22:06 crc kubenswrapper[4939]: I0318 17:22:06.544056 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbtrx\" (UniqueName: \"kubernetes.io/projected/5368c541-c217-48f1-9c31-bf094896cad0-kube-api-access-kbtrx\") pod \"5368c541-c217-48f1-9c31-bf094896cad0\" (UID: \"5368c541-c217-48f1-9c31-bf094896cad0\") " Mar 18 17:22:06 crc kubenswrapper[4939]: I0318 17:22:06.552621 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5368c541-c217-48f1-9c31-bf094896cad0-kube-api-access-kbtrx" (OuterVolumeSpecName: "kube-api-access-kbtrx") pod "5368c541-c217-48f1-9c31-bf094896cad0" (UID: "5368c541-c217-48f1-9c31-bf094896cad0"). InnerVolumeSpecName "kube-api-access-kbtrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:22:06 crc kubenswrapper[4939]: I0318 17:22:06.646426 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbtrx\" (UniqueName: \"kubernetes.io/projected/5368c541-c217-48f1-9c31-bf094896cad0-kube-api-access-kbtrx\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:07 crc kubenswrapper[4939]: I0318 17:22:07.096925 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" event={"ID":"5368c541-c217-48f1-9c31-bf094896cad0","Type":"ContainerDied","Data":"bcc91b7522e2de9f02068a9d1f8d5cd301004b620e96ef3393c2816a9137df9c"} Mar 18 17:22:07 crc kubenswrapper[4939]: I0318 17:22:07.096967 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcc91b7522e2de9f02068a9d1f8d5cd301004b620e96ef3393c2816a9137df9c" Mar 18 17:22:07 crc kubenswrapper[4939]: I0318 17:22:07.097026 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-s5t2d" Mar 18 17:22:07 crc kubenswrapper[4939]: I0318 17:22:07.113579 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-6zhpw"] Mar 18 17:22:07 crc kubenswrapper[4939]: I0318 17:22:07.122535 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-6zhpw"] Mar 18 17:22:08 crc kubenswrapper[4939]: I0318 17:22:08.154535 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60071088-a0a4-4fda-8f28-7b0f894191fe" path="/var/lib/kubelet/pods/60071088-a0a4-4fda-8f28-7b0f894191fe/volumes" Mar 18 17:22:09 crc kubenswrapper[4939]: I0318 17:22:09.116234 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0993e3cb-7df8-4774-a056-425d5d6a5f35","Type":"ContainerStarted","Data":"b7361e0cef98c23130ef70855f78e6c75f4dbf0279c67942b8b230e133deac9e"} Mar 18 17:22:09 crc kubenswrapper[4939]: I0318 17:22:09.156675 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.546232069 podStartE2EDuration="30.156656294s" podCreationTimestamp="2026-03-18 17:21:39 +0000 UTC" firstStartedPulling="2026-03-18 17:21:41.642027482 +0000 UTC m=+6266.241215103" lastFinishedPulling="2026-03-18 17:22:08.252451707 +0000 UTC m=+6292.851639328" observedRunningTime="2026-03-18 17:22:09.14137426 +0000 UTC m=+6293.740561881" watchObservedRunningTime="2026-03-18 17:22:09.156656294 +0000 UTC m=+6293.755843915" Mar 18 17:22:10 crc kubenswrapper[4939]: I0318 17:22:10.988070 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 18 17:22:10 crc kubenswrapper[4939]: I0318 17:22:10.988420 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 17:22:10 crc kubenswrapper[4939]: I0318 17:22:10.991156 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 17:22:11 crc kubenswrapper[4939]: I0318 17:22:11.136729 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.315994 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:12 crc kubenswrapper[4939]: E0318 17:22:12.316463 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5368c541-c217-48f1-9c31-bf094896cad0" containerName="oc" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.316489 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5368c541-c217-48f1-9c31-bf094896cad0" containerName="oc" Mar 18 17:22:12 crc kubenswrapper[4939]: E0318 17:22:12.316517 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="extract-content" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.316558 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="extract-content" Mar 18 17:22:12 crc kubenswrapper[4939]: E0318 17:22:12.316600 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.316605 4939 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" Mar 18 17:22:12 crc kubenswrapper[4939]: E0318 17:22:12.316618 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="extract-utilities" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.316624 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="extract-utilities" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.316810 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5368c541-c217-48f1-9c31-bf094896cad0" containerName="oc" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.316819 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85b2527-0783-44ee-9eae-f9b4601d1c9d" containerName="registry-server" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.318611 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.321621 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.321808 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.328647 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.466618 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.466672 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz4m5\" (UniqueName: \"kubernetes.io/projected/d0f03694-5c86-48d7-9e0b-3c37caa83041-kube-api-access-zz4m5\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.466714 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-config-data\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.466738 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-scripts\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.466823 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.466850 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-run-httpd\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.466872 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-log-httpd\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.568838 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.568912 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-run-httpd\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.568939 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-log-httpd\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.568993 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.569019 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz4m5\" (UniqueName: \"kubernetes.io/projected/d0f03694-5c86-48d7-9e0b-3c37caa83041-kube-api-access-zz4m5\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.569061 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-config-data\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.569082 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-scripts\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.569604 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-run-httpd\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.569647 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-log-httpd\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.579685 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-config-data\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.580230 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-scripts\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.580989 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.582855 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.589369 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz4m5\" (UniqueName: \"kubernetes.io/projected/d0f03694-5c86-48d7-9e0b-3c37caa83041-kube-api-access-zz4m5\") pod \"ceilometer-0\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " pod="openstack/ceilometer-0" Mar 18 17:22:12 crc kubenswrapper[4939]: I0318 17:22:12.639860 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 17:22:13 crc kubenswrapper[4939]: I0318 17:22:13.127472 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:13 crc kubenswrapper[4939]: W0318 17:22:13.127921 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f03694_5c86_48d7_9e0b_3c37caa83041.slice/crio-27cab990d5f54899f81502994e06e10ff3b9048aea5760825511ad31be3ab56f WatchSource:0}: Error finding container 27cab990d5f54899f81502994e06e10ff3b9048aea5760825511ad31be3ab56f: Status 404 returned error can't find the container with id 27cab990d5f54899f81502994e06e10ff3b9048aea5760825511ad31be3ab56f Mar 18 17:22:13 crc kubenswrapper[4939]: I0318 17:22:13.131344 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:22:13 crc kubenswrapper[4939]: I0318 17:22:13.154964 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerStarted","Data":"27cab990d5f54899f81502994e06e10ff3b9048aea5760825511ad31be3ab56f"} Mar 18 17:22:14 crc kubenswrapper[4939]: I0318 17:22:14.165714 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerStarted","Data":"e19c50e14be8de5a2b8fbe16341ef17af124aa88c0b95075f9d4409aeb58724f"} Mar 18 17:22:15 crc kubenswrapper[4939]: I0318 17:22:15.174731 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerStarted","Data":"c35b4e24ae32849f96e38a23c91b0d868cc1570b5c6b9a4b49b741f5356163a1"} Mar 18 17:22:16 crc kubenswrapper[4939]: I0318 17:22:16.245444 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerStarted","Data":"bb5f889e8ea6e524a10ae11df989e5f7084673177087b0ed111d5ed4d6dbcb22"} Mar 18 17:22:18 crc kubenswrapper[4939]: I0318 17:22:18.270448 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerStarted","Data":"f3b24225123852245e9fc843c729d2c6f0a523965c85219bd12b92cdee13e4ce"} Mar 18 17:22:18 crc kubenswrapper[4939]: I0318 17:22:18.271057 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 17:22:18 crc kubenswrapper[4939]: I0318 17:22:18.297417 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.031171624 podStartE2EDuration="6.297395559s" podCreationTimestamp="2026-03-18 17:22:12 +0000 UTC" firstStartedPulling="2026-03-18 17:22:13.131109245 +0000 UTC m=+6297.730296876" lastFinishedPulling="2026-03-18 17:22:17.39733316 +0000 UTC m=+6301.996520811" observedRunningTime="2026-03-18 17:22:18.294701322 +0000 UTC m=+6302.893888943" watchObservedRunningTime="2026-03-18 17:22:18.297395559 +0000 UTC m=+6302.896583180" Mar 18 17:22:21 crc kubenswrapper[4939]: I0318 17:22:21.041807 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5cpf4"] Mar 18 17:22:21 crc kubenswrapper[4939]: I0318 17:22:21.053356 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5cpf4"] Mar 18 17:22:22 crc kubenswrapper[4939]: 
I0318 17:22:22.034510 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-26vcj"] Mar 18 17:22:22 crc kubenswrapper[4939]: I0318 17:22:22.045317 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-26vcj"] Mar 18 17:22:22 crc kubenswrapper[4939]: I0318 17:22:22.152513 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000a84e0-4410-4303-9bea-3ad28e8c9c99" path="/var/lib/kubelet/pods/000a84e0-4410-4303-9bea-3ad28e8c9c99/volumes" Mar 18 17:22:22 crc kubenswrapper[4939]: I0318 17:22:22.155123 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1195c909-085a-458c-8366-46ef24721def" path="/var/lib/kubelet/pods/1195c909-085a-458c-8366-46ef24721def/volumes" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.770452 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-fgrm6"] Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.772102 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.786147 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-fgrm6"] Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.825383 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssfn7\" (UniqueName: \"kubernetes.io/projected/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-kube-api-access-ssfn7\") pod \"aodh-db-create-fgrm6\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.825521 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-operator-scripts\") pod \"aodh-db-create-fgrm6\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.876481 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-36a9-account-create-update-7xbnx"] Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.877890 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.883182 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.893066 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-36a9-account-create-update-7xbnx"] Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.947091 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a70d715-2f49-4a60-9d49-5ee1808f5d89-operator-scripts\") pod \"aodh-36a9-account-create-update-7xbnx\" (UID: \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.949581 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssfn7\" (UniqueName: \"kubernetes.io/projected/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-kube-api-access-ssfn7\") pod \"aodh-db-create-fgrm6\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.949648 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct9w5\" (UniqueName: \"kubernetes.io/projected/1a70d715-2f49-4a60-9d49-5ee1808f5d89-kube-api-access-ct9w5\") pod \"aodh-36a9-account-create-update-7xbnx\" (UID: \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.949754 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-operator-scripts\") pod \"aodh-db-create-fgrm6\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.950417 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-operator-scripts\") pod \"aodh-db-create-fgrm6\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:23 crc kubenswrapper[4939]: I0318 17:22:23.973057 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssfn7\" (UniqueName: \"kubernetes.io/projected/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-kube-api-access-ssfn7\") pod \"aodh-db-create-fgrm6\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.051026 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a70d715-2f49-4a60-9d49-5ee1808f5d89-operator-scripts\") pod \"aodh-36a9-account-create-update-7xbnx\" (UID: \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.051161 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct9w5\" (UniqueName: \"kubernetes.io/projected/1a70d715-2f49-4a60-9d49-5ee1808f5d89-kube-api-access-ct9w5\") pod \"aodh-36a9-account-create-update-7xbnx\" (UID: 
\"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.051886 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a70d715-2f49-4a60-9d49-5ee1808f5d89-operator-scripts\") pod \"aodh-36a9-account-create-update-7xbnx\" (UID: \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.072610 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct9w5\" (UniqueName: \"kubernetes.io/projected/1a70d715-2f49-4a60-9d49-5ee1808f5d89-kube-api-access-ct9w5\") pod \"aodh-36a9-account-create-update-7xbnx\" (UID: \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.096862 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.253582 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.559732 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-fgrm6"] Mar 18 17:22:24 crc kubenswrapper[4939]: W0318 17:22:24.566144 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb07a0b_8d95_4512_bf1a_ab64f702f8e7.slice/crio-bb1540be9b2df726bcfe4ac2b9bd991d29b611c1daddea639210d6efccb1a5c9 WatchSource:0}: Error finding container bb1540be9b2df726bcfe4ac2b9bd991d29b611c1daddea639210d6efccb1a5c9: Status 404 returned error can't find the container with id bb1540be9b2df726bcfe4ac2b9bd991d29b611c1daddea639210d6efccb1a5c9 Mar 18 17:22:24 crc kubenswrapper[4939]: I0318 17:22:24.768041 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-36a9-account-create-update-7xbnx"] Mar 18 17:22:24 crc kubenswrapper[4939]: W0318 17:22:24.768997 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a70d715_2f49_4a60_9d49_5ee1808f5d89.slice/crio-475670466ead38f7be003c1f8478f97a3969bb19c789329a667cd675a6194041 WatchSource:0}: Error finding container 475670466ead38f7be003c1f8478f97a3969bb19c789329a667cd675a6194041: Status 404 returned error can't find the container with id 475670466ead38f7be003c1f8478f97a3969bb19c789329a667cd675a6194041 Mar 18 17:22:25 crc kubenswrapper[4939]: I0318 17:22:25.388471 4939 generic.go:334] "Generic (PLEG): container finished" podID="3eb07a0b-8d95-4512-bf1a-ab64f702f8e7" containerID="e0f84934e34c1a2e41bc5c44c5776b1c44c7c245eb806e004ae50fa573502fe7" exitCode=0 Mar 18 17:22:25 crc kubenswrapper[4939]: I0318 17:22:25.388676 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fgrm6" event={"ID":"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7","Type":"ContainerDied","Data":"e0f84934e34c1a2e41bc5c44c5776b1c44c7c245eb806e004ae50fa573502fe7"} Mar 18 17:22:25 crc kubenswrapper[4939]: I0318 17:22:25.389495 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fgrm6" 
event={"ID":"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7","Type":"ContainerStarted","Data":"bb1540be9b2df726bcfe4ac2b9bd991d29b611c1daddea639210d6efccb1a5c9"} Mar 18 17:22:25 crc kubenswrapper[4939]: I0318 17:22:25.393156 4939 generic.go:334] "Generic (PLEG): container finished" podID="1a70d715-2f49-4a60-9d49-5ee1808f5d89" containerID="3c9ee260eada23c451002c6c6b412b1d665e989d923a42b56867f893fb0ff438" exitCode=0 Mar 18 17:22:25 crc kubenswrapper[4939]: I0318 17:22:25.393184 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-36a9-account-create-update-7xbnx" event={"ID":"1a70d715-2f49-4a60-9d49-5ee1808f5d89","Type":"ContainerDied","Data":"3c9ee260eada23c451002c6c6b412b1d665e989d923a42b56867f893fb0ff438"} Mar 18 17:22:25 crc kubenswrapper[4939]: I0318 17:22:25.393201 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-36a9-account-create-update-7xbnx" event={"ID":"1a70d715-2f49-4a60-9d49-5ee1808f5d89","Type":"ContainerStarted","Data":"475670466ead38f7be003c1f8478f97a3969bb19c789329a667cd675a6194041"} Mar 18 17:22:26 crc kubenswrapper[4939]: I0318 17:22:26.991765 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:26 crc kubenswrapper[4939]: I0318 17:22:26.999654 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.121675 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a70d715-2f49-4a60-9d49-5ee1808f5d89-operator-scripts\") pod \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\" (UID: \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.121755 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct9w5\" (UniqueName: \"kubernetes.io/projected/1a70d715-2f49-4a60-9d49-5ee1808f5d89-kube-api-access-ct9w5\") pod \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\" (UID: \"1a70d715-2f49-4a60-9d49-5ee1808f5d89\") " Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.121822 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssfn7\" (UniqueName: \"kubernetes.io/projected/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-kube-api-access-ssfn7\") pod \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.122020 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-operator-scripts\") pod \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\" (UID: \"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7\") " Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.122548 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a70d715-2f49-4a60-9d49-5ee1808f5d89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a70d715-2f49-4a60-9d49-5ee1808f5d89" (UID: "1a70d715-2f49-4a60-9d49-5ee1808f5d89"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.122721 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3eb07a0b-8d95-4512-bf1a-ab64f702f8e7" (UID: "3eb07a0b-8d95-4512-bf1a-ab64f702f8e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.129037 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-kube-api-access-ssfn7" (OuterVolumeSpecName: "kube-api-access-ssfn7") pod "3eb07a0b-8d95-4512-bf1a-ab64f702f8e7" (UID: "3eb07a0b-8d95-4512-bf1a-ab64f702f8e7"). InnerVolumeSpecName "kube-api-access-ssfn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.129726 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a70d715-2f49-4a60-9d49-5ee1808f5d89-kube-api-access-ct9w5" (OuterVolumeSpecName: "kube-api-access-ct9w5") pod "1a70d715-2f49-4a60-9d49-5ee1808f5d89" (UID: "1a70d715-2f49-4a60-9d49-5ee1808f5d89"). InnerVolumeSpecName "kube-api-access-ct9w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.224462 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.224492 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a70d715-2f49-4a60-9d49-5ee1808f5d89-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.224504 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct9w5\" (UniqueName: \"kubernetes.io/projected/1a70d715-2f49-4a60-9d49-5ee1808f5d89-kube-api-access-ct9w5\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.224525 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssfn7\" (UniqueName: \"kubernetes.io/projected/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7-kube-api-access-ssfn7\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.424580 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fgrm6" event={"ID":"3eb07a0b-8d95-4512-bf1a-ab64f702f8e7","Type":"ContainerDied","Data":"bb1540be9b2df726bcfe4ac2b9bd991d29b611c1daddea639210d6efccb1a5c9"} Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.424617 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fgrm6" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.424619 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1540be9b2df726bcfe4ac2b9bd991d29b611c1daddea639210d6efccb1a5c9" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.426264 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-36a9-account-create-update-7xbnx" Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.426220 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-36a9-account-create-update-7xbnx" event={"ID":"1a70d715-2f49-4a60-9d49-5ee1808f5d89","Type":"ContainerDied","Data":"475670466ead38f7be003c1f8478f97a3969bb19c789329a667cd675a6194041"} Mar 18 17:22:27 crc kubenswrapper[4939]: I0318 17:22:27.426556 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475670466ead38f7be003c1f8478f97a3969bb19c789329a667cd675a6194041" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.303200 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-f7rb6"] Mar 18 17:22:29 crc kubenswrapper[4939]: E0318 17:22:29.303916 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a70d715-2f49-4a60-9d49-5ee1808f5d89" containerName="mariadb-account-create-update" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.303930 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a70d715-2f49-4a60-9d49-5ee1808f5d89" containerName="mariadb-account-create-update" Mar 18 17:22:29 crc kubenswrapper[4939]: E0318 17:22:29.303958 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb07a0b-8d95-4512-bf1a-ab64f702f8e7" containerName="mariadb-database-create" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.303964 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb07a0b-8d95-4512-bf1a-ab64f702f8e7" containerName="mariadb-database-create" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.304138 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb07a0b-8d95-4512-bf1a-ab64f702f8e7" containerName="mariadb-database-create" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.304156 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a70d715-2f49-4a60-9d49-5ee1808f5d89" containerName="mariadb-account-create-update" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.304845 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.306917 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.307383 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.307760 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5r5hh" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.307951 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.328016 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-f7rb6"] Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.372440 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-combined-ca-bundle\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.372573 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-scripts\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.372623 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-config-data\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.372648 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkscr\" (UniqueName: \"kubernetes.io/projected/a068ef28-b3de-4a14-a8d2-3cf38f55864d-kube-api-access-zkscr\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.479050 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-combined-ca-bundle\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.479136 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-scripts\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.479190 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-config-data\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 
17:22:29.479219 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkscr\" (UniqueName: \"kubernetes.io/projected/a068ef28-b3de-4a14-a8d2-3cf38f55864d-kube-api-access-zkscr\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.486985 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-scripts\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.487468 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-config-data\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.498928 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-combined-ca-bundle\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.502995 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkscr\" (UniqueName: \"kubernetes.io/projected/a068ef28-b3de-4a14-a8d2-3cf38f55864d-kube-api-access-zkscr\") pod \"aodh-db-sync-f7rb6\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:29 crc kubenswrapper[4939]: I0318 17:22:29.624539 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:30 crc kubenswrapper[4939]: I0318 17:22:30.128867 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-f7rb6"] Mar 18 17:22:30 crc kubenswrapper[4939]: I0318 17:22:30.449487 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f7rb6" event={"ID":"a068ef28-b3de-4a14-a8d2-3cf38f55864d","Type":"ContainerStarted","Data":"0bb2b9999f20953a3a37efccbf0f52e63e28cf03ff8f5df1b5c847c68c9f0ed2"} Mar 18 17:22:35 crc kubenswrapper[4939]: I0318 17:22:35.518616 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f7rb6" event={"ID":"a068ef28-b3de-4a14-a8d2-3cf38f55864d","Type":"ContainerStarted","Data":"9c2a599b733f3b5cb6a83cf83d76dc3970eee464e07d0bd322d2008b2036d9d0"} Mar 18 17:22:35 crc kubenswrapper[4939]: I0318 17:22:35.542814 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-f7rb6" podStartSLOduration=2.3020881380000002 podStartE2EDuration="6.542795369s" podCreationTimestamp="2026-03-18 17:22:29 +0000 UTC" firstStartedPulling="2026-03-18 17:22:30.138600729 +0000 UTC m=+6314.737788350" lastFinishedPulling="2026-03-18 17:22:34.37930796 +0000 UTC m=+6318.978495581" observedRunningTime="2026-03-18 17:22:35.540199205 +0000 UTC m=+6320.139386846" watchObservedRunningTime="2026-03-18 17:22:35.542795369 +0000 UTC m=+6320.141982990" Mar 18 17:22:37 crc kubenswrapper[4939]: I0318 17:22:37.549136 4939 generic.go:334] "Generic (PLEG): container finished" podID="a068ef28-b3de-4a14-a8d2-3cf38f55864d" containerID="9c2a599b733f3b5cb6a83cf83d76dc3970eee464e07d0bd322d2008b2036d9d0" exitCode=0 Mar 18 17:22:37 crc kubenswrapper[4939]: I0318 17:22:37.549266 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f7rb6" event={"ID":"a068ef28-b3de-4a14-a8d2-3cf38f55864d","Type":"ContainerDied","Data":"9c2a599b733f3b5cb6a83cf83d76dc3970eee464e07d0bd322d2008b2036d9d0"} Mar 18 17:22:38 crc kubenswrapper[4939]: I0318 17:22:38.971728 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.108910 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-combined-ca-bundle\") pod \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.108973 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkscr\" (UniqueName: \"kubernetes.io/projected/a068ef28-b3de-4a14-a8d2-3cf38f55864d-kube-api-access-zkscr\") pod \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.109018 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-config-data\") pod \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.109100 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-scripts\") pod \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\" (UID: \"a068ef28-b3de-4a14-a8d2-3cf38f55864d\") " Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.115123 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a068ef28-b3de-4a14-a8d2-3cf38f55864d-kube-api-access-zkscr" (OuterVolumeSpecName: "kube-api-access-zkscr") pod "a068ef28-b3de-4a14-a8d2-3cf38f55864d" (UID: "a068ef28-b3de-4a14-a8d2-3cf38f55864d"). InnerVolumeSpecName "kube-api-access-zkscr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.115449 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-scripts" (OuterVolumeSpecName: "scripts") pod "a068ef28-b3de-4a14-a8d2-3cf38f55864d" (UID: "a068ef28-b3de-4a14-a8d2-3cf38f55864d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.141198 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a068ef28-b3de-4a14-a8d2-3cf38f55864d" (UID: "a068ef28-b3de-4a14-a8d2-3cf38f55864d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.143057 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-config-data" (OuterVolumeSpecName: "config-data") pod "a068ef28-b3de-4a14-a8d2-3cf38f55864d" (UID: "a068ef28-b3de-4a14-a8d2-3cf38f55864d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.211598 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.211634 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkscr\" (UniqueName: \"kubernetes.io/projected/a068ef28-b3de-4a14-a8d2-3cf38f55864d-kube-api-access-zkscr\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.211649 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.211660 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a068ef28-b3de-4a14-a8d2-3cf38f55864d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.573747 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f7rb6" event={"ID":"a068ef28-b3de-4a14-a8d2-3cf38f55864d","Type":"ContainerDied","Data":"0bb2b9999f20953a3a37efccbf0f52e63e28cf03ff8f5df1b5c847c68c9f0ed2"} Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.573793 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb2b9999f20953a3a37efccbf0f52e63e28cf03ff8f5df1b5c847c68c9f0ed2" Mar 18 17:22:39 crc kubenswrapper[4939]: I0318 17:22:39.573795 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-f7rb6" Mar 18 17:22:40 crc kubenswrapper[4939]: I0318 17:22:40.061726 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx4x4"] Mar 18 17:22:40 crc kubenswrapper[4939]: I0318 17:22:40.094500 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dx4x4"] Mar 18 17:22:40 crc kubenswrapper[4939]: I0318 17:22:40.145585 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2666dc14-55b4-4030-b812-73719a709183" path="/var/lib/kubelet/pods/2666dc14-55b4-4030-b812-73719a709183/volumes" Mar 18 17:22:42 crc kubenswrapper[4939]: I0318 17:22:42.647898 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.771080 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 17:22:43 crc kubenswrapper[4939]: E0318 17:22:43.771872 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a068ef28-b3de-4a14-a8d2-3cf38f55864d" containerName="aodh-db-sync" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.771888 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a068ef28-b3de-4a14-a8d2-3cf38f55864d" containerName="aodh-db-sync" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.772105 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a068ef28-b3de-4a14-a8d2-3cf38f55864d" containerName="aodh-db-sync" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.774094 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.777736 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.777960 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.784824 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5r5hh" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.797232 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.912192 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-config-data\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.912347 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzzs2\" (UniqueName: \"kubernetes.io/projected/d425e944-b4cd-44e3-89e6-da39e5c92773-kube-api-access-hzzs2\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.912417 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:43 crc kubenswrapper[4939]: I0318 17:22:43.912549 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-scripts\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.014424 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.014579 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-scripts\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.014627 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-config-data\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.014737 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzzs2\" (UniqueName: \"kubernetes.io/projected/d425e944-b4cd-44e3-89e6-da39e5c92773-kube-api-access-hzzs2\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: 
I0318 17:22:44.034947 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.035343 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-config-data\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.035972 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d425e944-b4cd-44e3-89e6-da39e5c92773-scripts\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.037544 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzzs2\" (UniqueName: \"kubernetes.io/projected/d425e944-b4cd-44e3-89e6-da39e5c92773-kube-api-access-hzzs2\") pod \"aodh-0\" (UID: \"d425e944-b4cd-44e3-89e6-da39e5c92773\") " pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.142693 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 17:22:44 crc kubenswrapper[4939]: I0318 17:22:44.697955 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 17:22:45 crc kubenswrapper[4939]: I0318 17:22:45.609206 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:45 crc kubenswrapper[4939]: I0318 17:22:45.610894 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-central-agent" containerID="cri-o://e19c50e14be8de5a2b8fbe16341ef17af124aa88c0b95075f9d4409aeb58724f" gracePeriod=30 Mar 18 17:22:45 crc kubenswrapper[4939]: I0318 17:22:45.610953 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-notification-agent" containerID="cri-o://c35b4e24ae32849f96e38a23c91b0d868cc1570b5c6b9a4b49b741f5356163a1" gracePeriod=30 Mar 18 17:22:45 crc kubenswrapper[4939]: I0318 17:22:45.610962 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="sg-core" containerID="cri-o://bb5f889e8ea6e524a10ae11df989e5f7084673177087b0ed111d5ed4d6dbcb22" gracePeriod=30 Mar 18 17:22:45 crc kubenswrapper[4939]: I0318 17:22:45.610970 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="proxy-httpd" containerID="cri-o://f3b24225123852245e9fc843c729d2c6f0a523965c85219bd12b92cdee13e4ce" gracePeriod=30 Mar 18 17:22:45 crc kubenswrapper[4939]: I0318 17:22:45.632722 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d425e944-b4cd-44e3-89e6-da39e5c92773","Type":"ContainerStarted","Data":"27348f82b60fe342fb08aab1640411f2b9cc140551e8a10cbc2db71d0d23ba0c"} Mar 18 17:22:45 crc kubenswrapper[4939]: I0318 17:22:45.632764 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d425e944-b4cd-44e3-89e6-da39e5c92773","Type":"ContainerStarted","Data":"500ed842a96a86e522eb0f95376a4c60577abbaf486cb997e1f2cb9d07855004"}
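Note: the DELETE of ceilometer-0 above fans out into one "Killing container with a grace period" entry per container, each with gracePeriod=30 (the pod's termination grace period in seconds). The runtime delivers SIGTERM first and only escalates to SIGKILL if the container is still alive when the grace period expires; the exit codes in the next entries show proxy-httpd and both ceilometer agents exiting 0 on SIGTERM while sg-core exits 2. A minimal model of the two-phase stop (illustrative; the kubelet drives this through the CRI StopContainer call):

```go
package main

import (
	"fmt"
	"time"
)

// Minimal model of the two-phase container stop implied by
// "Killing container with a grace period": request termination first,
// force-kill only if the grace period elapses. Illustrative, not the
// kubelet's actual implementation.
func stopWithGrace(term, kill func(), exited <-chan struct{}, grace time.Duration) {
	term() // SIGTERM via the runtime
	select {
	case <-exited:
		// container shut down in time (exit code 0 for the httpd and agents)
	case <-time.After(grace):
		kill() // SIGKILL once the 30s grace period is up
	}
}

func main() {
	exited := make(chan struct{})
	close(exited) // pretend the container exits promptly on SIGTERM
	stopWithGrace(
		func() { fmt.Println("SIGTERM sent") },
		func() { fmt.Println("SIGKILL sent") },
		exited, 30*time.Second,
	)
}
```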
for pod" pod="openstack/aodh-0" event={"ID":"d425e944-b4cd-44e3-89e6-da39e5c92773","Type":"ContainerStarted","Data":"500ed842a96a86e522eb0f95376a4c60577abbaf486cb997e1f2cb9d07855004"} Mar 18 17:22:46 crc kubenswrapper[4939]: E0318 17:22:46.317356 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f03694_5c86_48d7_9e0b_3c37caa83041.slice/crio-conmon-c35b4e24ae32849f96e38a23c91b0d868cc1570b5c6b9a4b49b741f5356163a1.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653164 4939 generic.go:334] "Generic (PLEG): container finished" podID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerID="f3b24225123852245e9fc843c729d2c6f0a523965c85219bd12b92cdee13e4ce" exitCode=0 Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653479 4939 generic.go:334] "Generic (PLEG): container finished" podID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerID="bb5f889e8ea6e524a10ae11df989e5f7084673177087b0ed111d5ed4d6dbcb22" exitCode=2 Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653487 4939 generic.go:334] "Generic (PLEG): container finished" podID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerID="c35b4e24ae32849f96e38a23c91b0d868cc1570b5c6b9a4b49b741f5356163a1" exitCode=0 Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653495 4939 generic.go:334] "Generic (PLEG): container finished" podID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerID="e19c50e14be8de5a2b8fbe16341ef17af124aa88c0b95075f9d4409aeb58724f" exitCode=0 Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653363 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerDied","Data":"f3b24225123852245e9fc843c729d2c6f0a523965c85219bd12b92cdee13e4ce"} Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653593 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerDied","Data":"bb5f889e8ea6e524a10ae11df989e5f7084673177087b0ed111d5ed4d6dbcb22"} Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653608 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerDied","Data":"c35b4e24ae32849f96e38a23c91b0d868cc1570b5c6b9a4b49b741f5356163a1"} Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.653618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerDied","Data":"e19c50e14be8de5a2b8fbe16341ef17af124aa88c0b95075f9d4409aeb58724f"} Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.870546 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.975097 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-config-data\") pod \"d0f03694-5c86-48d7-9e0b-3c37caa83041\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.975957 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz4m5\" (UniqueName: \"kubernetes.io/projected/d0f03694-5c86-48d7-9e0b-3c37caa83041-kube-api-access-zz4m5\") pod \"d0f03694-5c86-48d7-9e0b-3c37caa83041\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.976128 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-scripts\") pod \"d0f03694-5c86-48d7-9e0b-3c37caa83041\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.976166 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-sg-core-conf-yaml\") pod \"d0f03694-5c86-48d7-9e0b-3c37caa83041\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.976241 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-run-httpd\") pod \"d0f03694-5c86-48d7-9e0b-3c37caa83041\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.976261 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-log-httpd\") pod \"d0f03694-5c86-48d7-9e0b-3c37caa83041\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.976286 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-combined-ca-bundle\") pod \"d0f03694-5c86-48d7-9e0b-3c37caa83041\" (UID: \"d0f03694-5c86-48d7-9e0b-3c37caa83041\") " Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.977290 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d0f03694-5c86-48d7-9e0b-3c37caa83041" (UID: "d0f03694-5c86-48d7-9e0b-3c37caa83041"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.977388 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d0f03694-5c86-48d7-9e0b-3c37caa83041" (UID: "d0f03694-5c86-48d7-9e0b-3c37caa83041"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.980716 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-scripts" (OuterVolumeSpecName: "scripts") pod "d0f03694-5c86-48d7-9e0b-3c37caa83041" (UID: "d0f03694-5c86-48d7-9e0b-3c37caa83041"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:22:46 crc kubenswrapper[4939]: I0318 17:22:46.980772 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f03694-5c86-48d7-9e0b-3c37caa83041-kube-api-access-zz4m5" (OuterVolumeSpecName: "kube-api-access-zz4m5") pod "d0f03694-5c86-48d7-9e0b-3c37caa83041" (UID: "d0f03694-5c86-48d7-9e0b-3c37caa83041"). InnerVolumeSpecName "kube-api-access-zz4m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.003919 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d0f03694-5c86-48d7-9e0b-3c37caa83041" (UID: "d0f03694-5c86-48d7-9e0b-3c37caa83041"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.051860 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0f03694-5c86-48d7-9e0b-3c37caa83041" (UID: "d0f03694-5c86-48d7-9e0b-3c37caa83041"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.079020 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.079198 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.079254 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.079306 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f03694-5c86-48d7-9e0b-3c37caa83041-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.079396 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.079458 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz4m5\" (UniqueName: \"kubernetes.io/projected/d0f03694-5c86-48d7-9e0b-3c37caa83041-kube-api-access-zz4m5\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.089524 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-config-data" (OuterVolumeSpecName: "config-data") pod "d0f03694-5c86-48d7-9e0b-3c37caa83041" (UID: "d0f03694-5c86-48d7-9e0b-3c37caa83041"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.181677 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f03694-5c86-48d7-9e0b-3c37caa83041-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.667167 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d425e944-b4cd-44e3-89e6-da39e5c92773","Type":"ContainerStarted","Data":"9af29fc0f2efb1311a358dce291dc41f44410954f2897d77d0e7292290881057"} Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.672722 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0f03694-5c86-48d7-9e0b-3c37caa83041","Type":"ContainerDied","Data":"27cab990d5f54899f81502994e06e10ff3b9048aea5760825511ad31be3ab56f"} Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.672786 4939 scope.go:117] "RemoveContainer" containerID="f3b24225123852245e9fc843c729d2c6f0a523965c85219bd12b92cdee13e4ce" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.672852 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.736408 4939 scope.go:117] "RemoveContainer" containerID="bb5f889e8ea6e524a10ae11df989e5f7084673177087b0ed111d5ed4d6dbcb22" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.738337 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.755740 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.783432 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:47 crc kubenswrapper[4939]: E0318 17:22:47.784193 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="proxy-httpd" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.784326 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="proxy-httpd" Mar 18 17:22:47 crc kubenswrapper[4939]: E0318 17:22:47.784396 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-notification-agent" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.787030 4939 scope.go:117] "RemoveContainer" containerID="c35b4e24ae32849f96e38a23c91b0d868cc1570b5c6b9a4b49b741f5356163a1" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.787094 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-notification-agent" Mar 18 17:22:47 crc kubenswrapper[4939]: E0318 17:22:47.787400 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="sg-core" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.787412 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="sg-core" Mar 18 17:22:47 crc kubenswrapper[4939]: E0318 17:22:47.787590 
4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-central-agent" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.787604 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-central-agent" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.789711 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="sg-core" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.789773 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-central-agent" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.789787 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="ceilometer-notification-agent" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.789797 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" containerName="proxy-httpd" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.792269 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.795827 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.797628 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.797713 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.828265 4939 scope.go:117] "RemoveContainer" containerID="e19c50e14be8de5a2b8fbe16341ef17af124aa88c0b95075f9d4409aeb58724f" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.897005 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.897302 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.897352 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-log-httpd\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.897444 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4258\" (UniqueName: \"kubernetes.io/projected/4acc0d59-609a-41f1-a182-861ac1f13197-kube-api-access-z4258\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc 
kubenswrapper[4939]: I0318 17:22:47.897632 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-run-httpd\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.897768 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-scripts\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.898008 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-config-data\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.999599 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.999648 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-log-httpd\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.999694 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4258\" (UniqueName: \"kubernetes.io/projected/4acc0d59-609a-41f1-a182-861ac1f13197-kube-api-access-z4258\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.999749 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-run-httpd\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.999784 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-scripts\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.999840 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-config-data\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:47 crc kubenswrapper[4939]: I0318 17:22:47.999860 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 
17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.000360 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-run-httpd\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.001008 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-log-httpd\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.004682 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-scripts\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.005255 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-config-data\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.018187 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.018898 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.021252 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4258\" (UniqueName: \"kubernetes.io/projected/4acc0d59-609a-41f1-a182-861ac1f13197-kube-api-access-z4258\") pod \"ceilometer-0\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.125250 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.143952 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f03694-5c86-48d7-9e0b-3c37caa83041" path="/var/lib/kubelet/pods/d0f03694-5c86-48d7-9e0b-3c37caa83041/volumes" Mar 18 17:22:48 crc kubenswrapper[4939]: W0318 17:22:48.622009 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4acc0d59_609a_41f1_a182_861ac1f13197.slice/crio-a2577130478fbab275e67ce85d319d7e6b6af984ab676f778b12624bd43af09f WatchSource:0}: Error finding container a2577130478fbab275e67ce85d319d7e6b6af984ab676f778b12624bd43af09f: Status 404 returned error can't find the container with id a2577130478fbab275e67ce85d319d7e6b6af984ab676f778b12624bd43af09f Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.632059 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:22:48 crc kubenswrapper[4939]: I0318 17:22:48.690296 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerStarted","Data":"a2577130478fbab275e67ce85d319d7e6b6af984ab676f778b12624bd43af09f"} Mar 18 17:22:49 crc kubenswrapper[4939]: I0318 17:22:49.705112 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerStarted","Data":"e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989"} Mar 18 17:22:49 crc kubenswrapper[4939]: I0318 17:22:49.707540 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d425e944-b4cd-44e3-89e6-da39e5c92773","Type":"ContainerStarted","Data":"9c1b7cbc37e8f9650592c72eb3da5a70195480c2478718eef8f3a554ada8cde2"} Mar 18 17:22:51 crc kubenswrapper[4939]: I0318 17:22:51.730331 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerStarted","Data":"725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8"} Mar 18 17:22:51 crc kubenswrapper[4939]: I0318 17:22:51.734050 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d425e944-b4cd-44e3-89e6-da39e5c92773","Type":"ContainerStarted","Data":"d4a8844cd6c5b25f48bba1f61e656daea4f1407c1ef267af1106034f0bcb0fb3"} Mar 18 17:22:51 crc kubenswrapper[4939]: I0318 17:22:51.752063 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.349119425 podStartE2EDuration="8.752040396s" podCreationTimestamp="2026-03-18 17:22:43 +0000 UTC" firstStartedPulling="2026-03-18 17:22:44.695069063 +0000 UTC m=+6329.294256684" lastFinishedPulling="2026-03-18 17:22:51.097990034 +0000 UTC m=+6335.697177655" observedRunningTime="2026-03-18 17:22:51.750353619 +0000 UTC m=+6336.349541240" watchObservedRunningTime="2026-03-18 17:22:51.752040396 +0000 UTC m=+6336.351228027" Mar 18 17:22:52 crc kubenswrapper[4939]: I0318 17:22:52.748622 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerStarted","Data":"fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2"} Mar 18 17:22:54 crc kubenswrapper[4939]: I0318 17:22:54.778060 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerStarted","Data":"e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e"} Mar 18 17:22:54 crc kubenswrapper[4939]: I0318 17:22:54.778830 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 17:22:54 crc kubenswrapper[4939]: I0318 17:22:54.811927 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.472141105 podStartE2EDuration="7.811860855s" podCreationTimestamp="2026-03-18 17:22:47 +0000 UTC" firstStartedPulling="2026-03-18 17:22:48.625987217 +0000 UTC m=+6333.225174848" lastFinishedPulling="2026-03-18 17:22:53.965706977 +0000 UTC m=+6338.564894598" observedRunningTime="2026-03-18 17:22:54.809888729 +0000 UTC m=+6339.409076380" watchObservedRunningTime="2026-03-18 17:22:54.811860855 +0000 UTC m=+6339.411048496" Mar 18 17:22:56 crc kubenswrapper[4939]: I0318 17:22:56.881390 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-wtjcn"] Mar 18 17:22:56 crc kubenswrapper[4939]: I0318 17:22:56.883272 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:56 crc kubenswrapper[4939]: I0318 17:22:56.894919 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wtjcn"] Mar 18 17:22:56 crc kubenswrapper[4939]: I0318 17:22:56.922575 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dad903b-41e6-4f83-8e55-a44e1a891716-operator-scripts\") pod \"manila-db-create-wtjcn\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:56 crc kubenswrapper[4939]: I0318 17:22:56.922679 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km54b\" (UniqueName: \"kubernetes.io/projected/2dad903b-41e6-4f83-8e55-a44e1a891716-kube-api-access-km54b\") pod \"manila-db-create-wtjcn\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.024675 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dad903b-41e6-4f83-8e55-a44e1a891716-operator-scripts\") pod \"manila-db-create-wtjcn\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.024799 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km54b\" (UniqueName: \"kubernetes.io/projected/2dad903b-41e6-4f83-8e55-a44e1a891716-kube-api-access-km54b\") pod \"manila-db-create-wtjcn\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.025694 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dad903b-41e6-4f83-8e55-a44e1a891716-operator-scripts\") pod \"manila-db-create-wtjcn\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.046211 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km54b\" (UniqueName: 
\"kubernetes.io/projected/2dad903b-41e6-4f83-8e55-a44e1a891716-kube-api-access-km54b\") pod \"manila-db-create-wtjcn\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.080643 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-96ca-account-create-update-gxftp"] Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.082037 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.084747 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.094068 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-96ca-account-create-update-gxftp"] Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.126383 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fqb\" (UniqueName: \"kubernetes.io/projected/2d97d157-854d-423d-b4b2-bf11a05cf07b-kube-api-access-b5fqb\") pod \"manila-96ca-account-create-update-gxftp\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.126880 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d97d157-854d-423d-b4b2-bf11a05cf07b-operator-scripts\") pod \"manila-96ca-account-create-update-gxftp\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.211129 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wtjcn" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.229413 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fqb\" (UniqueName: \"kubernetes.io/projected/2d97d157-854d-423d-b4b2-bf11a05cf07b-kube-api-access-b5fqb\") pod \"manila-96ca-account-create-update-gxftp\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.231029 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d97d157-854d-423d-b4b2-bf11a05cf07b-operator-scripts\") pod \"manila-96ca-account-create-update-gxftp\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.232447 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d97d157-854d-423d-b4b2-bf11a05cf07b-operator-scripts\") pod \"manila-96ca-account-create-update-gxftp\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.249167 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fqb\" (UniqueName: \"kubernetes.io/projected/2d97d157-854d-423d-b4b2-bf11a05cf07b-kube-api-access-b5fqb\") pod \"manila-96ca-account-create-update-gxftp\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.429171 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.761765 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wtjcn"] Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.811789 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wtjcn" event={"ID":"2dad903b-41e6-4f83-8e55-a44e1a891716","Type":"ContainerStarted","Data":"444c3bd2b31fbdd28b0feadb3bc5c669eb545a2bd75353b6d54fd64c583907f2"} Mar 18 17:22:57 crc kubenswrapper[4939]: I0318 17:22:57.930665 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-96ca-account-create-update-gxftp"] Mar 18 17:22:58 crc kubenswrapper[4939]: I0318 17:22:58.822394 4939 generic.go:334] "Generic (PLEG): container finished" podID="2d97d157-854d-423d-b4b2-bf11a05cf07b" containerID="9404162434f281a43876a4a06d6727b558578c242dca13e25a0eae60ce360e94" exitCode=0 Mar 18 17:22:58 crc kubenswrapper[4939]: I0318 17:22:58.822485 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-96ca-account-create-update-gxftp" event={"ID":"2d97d157-854d-423d-b4b2-bf11a05cf07b","Type":"ContainerDied","Data":"9404162434f281a43876a4a06d6727b558578c242dca13e25a0eae60ce360e94"} Mar 18 17:22:58 crc kubenswrapper[4939]: I0318 17:22:58.822943 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-96ca-account-create-update-gxftp" event={"ID":"2d97d157-854d-423d-b4b2-bf11a05cf07b","Type":"ContainerStarted","Data":"5883d747e2801314860e6f7180c0a22c43088d0d5d167d242ff7d00974d5629c"} Mar 18 17:22:58 crc kubenswrapper[4939]: I0318 17:22:58.825040 4939 generic.go:334] "Generic (PLEG): container finished" podID="2dad903b-41e6-4f83-8e55-a44e1a891716" containerID="1b9758729629f7b84aabbee7890c61d2186e75c3641b70640c0d717165352ce9" exitCode=0 Mar 18 17:22:58 crc kubenswrapper[4939]: I0318 17:22:58.825088 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wtjcn" event={"ID":"2dad903b-41e6-4f83-8e55-a44e1a891716","Type":"ContainerDied","Data":"1b9758729629f7b84aabbee7890c61d2186e75c3641b70640c0d717165352ce9"} Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.301163 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wtjcn" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.308921 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.402613 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5fqb\" (UniqueName: \"kubernetes.io/projected/2d97d157-854d-423d-b4b2-bf11a05cf07b-kube-api-access-b5fqb\") pod \"2d97d157-854d-423d-b4b2-bf11a05cf07b\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.402761 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km54b\" (UniqueName: \"kubernetes.io/projected/2dad903b-41e6-4f83-8e55-a44e1a891716-kube-api-access-km54b\") pod \"2dad903b-41e6-4f83-8e55-a44e1a891716\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.402800 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dad903b-41e6-4f83-8e55-a44e1a891716-operator-scripts\") pod \"2dad903b-41e6-4f83-8e55-a44e1a891716\" (UID: \"2dad903b-41e6-4f83-8e55-a44e1a891716\") " Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.402882 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d97d157-854d-423d-b4b2-bf11a05cf07b-operator-scripts\") pod \"2d97d157-854d-423d-b4b2-bf11a05cf07b\" (UID: \"2d97d157-854d-423d-b4b2-bf11a05cf07b\") " Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.403733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dad903b-41e6-4f83-8e55-a44e1a891716-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dad903b-41e6-4f83-8e55-a44e1a891716" (UID: "2dad903b-41e6-4f83-8e55-a44e1a891716"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.403803 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d97d157-854d-423d-b4b2-bf11a05cf07b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d97d157-854d-423d-b4b2-bf11a05cf07b" (UID: "2d97d157-854d-423d-b4b2-bf11a05cf07b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.408939 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d97d157-854d-423d-b4b2-bf11a05cf07b-kube-api-access-b5fqb" (OuterVolumeSpecName: "kube-api-access-b5fqb") pod "2d97d157-854d-423d-b4b2-bf11a05cf07b" (UID: "2d97d157-854d-423d-b4b2-bf11a05cf07b"). InnerVolumeSpecName "kube-api-access-b5fqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.409923 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dad903b-41e6-4f83-8e55-a44e1a891716-kube-api-access-km54b" (OuterVolumeSpecName: "kube-api-access-km54b") pod "2dad903b-41e6-4f83-8e55-a44e1a891716" (UID: "2dad903b-41e6-4f83-8e55-a44e1a891716"). InnerVolumeSpecName "kube-api-access-km54b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.506250 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5fqb\" (UniqueName: \"kubernetes.io/projected/2d97d157-854d-423d-b4b2-bf11a05cf07b-kube-api-access-b5fqb\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.506290 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km54b\" (UniqueName: \"kubernetes.io/projected/2dad903b-41e6-4f83-8e55-a44e1a891716-kube-api-access-km54b\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.506306 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dad903b-41e6-4f83-8e55-a44e1a891716-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.506318 4939 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d97d157-854d-423d-b4b2-bf11a05cf07b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.862723 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wtjcn" event={"ID":"2dad903b-41e6-4f83-8e55-a44e1a891716","Type":"ContainerDied","Data":"444c3bd2b31fbdd28b0feadb3bc5c669eb545a2bd75353b6d54fd64c583907f2"} Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.862773 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="444c3bd2b31fbdd28b0feadb3bc5c669eb545a2bd75353b6d54fd64c583907f2" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.862783 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wtjcn" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.866320 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-96ca-account-create-update-gxftp" event={"ID":"2d97d157-854d-423d-b4b2-bf11a05cf07b","Type":"ContainerDied","Data":"5883d747e2801314860e6f7180c0a22c43088d0d5d167d242ff7d00974d5629c"} Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.866370 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5883d747e2801314860e6f7180c0a22c43088d0d5d167d242ff7d00974d5629c" Mar 18 17:23:00 crc kubenswrapper[4939]: I0318 17:23:00.866385 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-96ca-account-create-update-gxftp" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.519897 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-cfzqj"] Mar 18 17:23:02 crc kubenswrapper[4939]: E0318 17:23:02.520518 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dad903b-41e6-4f83-8e55-a44e1a891716" containerName="mariadb-database-create" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.520531 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dad903b-41e6-4f83-8e55-a44e1a891716" containerName="mariadb-database-create" Mar 18 17:23:02 crc kubenswrapper[4939]: E0318 17:23:02.520559 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d97d157-854d-423d-b4b2-bf11a05cf07b" containerName="mariadb-account-create-update" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.520565 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d97d157-854d-423d-b4b2-bf11a05cf07b" containerName="mariadb-account-create-update" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.520747 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dad903b-41e6-4f83-8e55-a44e1a891716" containerName="mariadb-database-create" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.520773 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d97d157-854d-423d-b4b2-bf11a05cf07b" containerName="mariadb-account-create-update" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.521480 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.523685 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-jdwlx" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.526738 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.529230 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-cfzqj"] Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.651814 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-config-data\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.651877 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-job-config-data\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.651995 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-combined-ca-bundle\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.652061 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t7ll\" (UniqueName: 
\"kubernetes.io/projected/32d98593-3990-4191-8867-efe09b71260f-kube-api-access-5t7ll\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.754201 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-config-data\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.754528 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-job-config-data\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.754616 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-combined-ca-bundle\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.754668 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t7ll\" (UniqueName: \"kubernetes.io/projected/32d98593-3990-4191-8867-efe09b71260f-kube-api-access-5t7ll\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.760351 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-job-config-data\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.760435 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-combined-ca-bundle\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.760580 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-config-data\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.772897 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t7ll\" (UniqueName: \"kubernetes.io/projected/32d98593-3990-4191-8867-efe09b71260f-kube-api-access-5t7ll\") pod \"manila-db-sync-cfzqj\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:02 crc kubenswrapper[4939]: I0318 17:23:02.839312 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:03 crc kubenswrapper[4939]: I0318 17:23:03.662047 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-cfzqj"] Mar 18 17:23:03 crc kubenswrapper[4939]: W0318 17:23:03.664189 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32d98593_3990_4191_8867_efe09b71260f.slice/crio-1c6e4f57bc3f20da9660f1b52b032d40ad75f7e8be4f258e5c6d2bb8a0f74b03 WatchSource:0}: Error finding container 1c6e4f57bc3f20da9660f1b52b032d40ad75f7e8be4f258e5c6d2bb8a0f74b03: Status 404 returned error can't find the container with id 1c6e4f57bc3f20da9660f1b52b032d40ad75f7e8be4f258e5c6d2bb8a0f74b03 Mar 18 17:23:03 crc kubenswrapper[4939]: I0318 17:23:03.902824 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-cfzqj" event={"ID":"32d98593-3990-4191-8867-efe09b71260f","Type":"ContainerStarted","Data":"1c6e4f57bc3f20da9660f1b52b032d40ad75f7e8be4f258e5c6d2bb8a0f74b03"} Mar 18 17:23:06 crc kubenswrapper[4939]: I0318 17:23:06.140274 4939 scope.go:117] "RemoveContainer" containerID="a336e182c56f4324ff226b86a44102fbe74de992b38d27033ef017d86893abc1" Mar 18 17:23:07 crc kubenswrapper[4939]: I0318 17:23:07.631819 4939 scope.go:117] "RemoveContainer" containerID="0df02c559a9ab54350408f200e1bf646305f5e78160a741519d119f78760c2e0" Mar 18 17:23:07 crc kubenswrapper[4939]: I0318 17:23:07.680097 4939 scope.go:117] "RemoveContainer" containerID="f9dcf6d8169612a82e1c310e93b99decd9b0cac4754a45dc9383ff633825186e" Mar 18 17:23:07 crc kubenswrapper[4939]: I0318 17:23:07.799729 4939 scope.go:117] "RemoveContainer" containerID="3240d6a7a2fc78391cdae9f8fa68e721842316e23132fd163d046b6a5705d877" Mar 18 17:23:07 crc kubenswrapper[4939]: I0318 17:23:07.864865 4939 scope.go:117] "RemoveContainer" containerID="27f7b58c8666127dc76b1b59e6c5d59fbd23b611afad63942f31d9cbb2390a75" Mar 18 17:23:07 crc kubenswrapper[4939]: I0318 17:23:07.910585 4939 scope.go:117] "RemoveContainer" containerID="7e726726a3a493bf666f889f1c9adb50b600d62ed11907cef0fc2078254c172d" Mar 18 17:23:08 crc kubenswrapper[4939]: I0318 17:23:08.971752 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-cfzqj" event={"ID":"32d98593-3990-4191-8867-efe09b71260f","Type":"ContainerStarted","Data":"ba315b549a98ce5b0a248e4759218c94d1b62b221e80b94be11c15c1a0c65bd8"} Mar 18 17:23:08 crc kubenswrapper[4939]: I0318 17:23:08.999647 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-cfzqj" podStartSLOduration=2.985018217 podStartE2EDuration="6.999629699s" podCreationTimestamp="2026-03-18 17:23:02 +0000 UTC" firstStartedPulling="2026-03-18 17:23:03.666981979 +0000 UTC m=+6348.266169600" lastFinishedPulling="2026-03-18 17:23:07.681593461 +0000 UTC m=+6352.280781082" observedRunningTime="2026-03-18 17:23:08.995228894 +0000 UTC m=+6353.594416515" watchObservedRunningTime="2026-03-18 17:23:08.999629699 +0000 UTC m=+6353.598817320" Mar 18 17:23:10 crc kubenswrapper[4939]: I0318 17:23:10.988757 4939 generic.go:334] "Generic (PLEG): container finished" podID="32d98593-3990-4191-8867-efe09b71260f" containerID="ba315b549a98ce5b0a248e4759218c94d1b62b221e80b94be11c15c1a0c65bd8" exitCode=0 Mar 18 17:23:10 crc kubenswrapper[4939]: I0318 17:23:10.988880 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-cfzqj" 
event={"ID":"32d98593-3990-4191-8867-efe09b71260f","Type":"ContainerDied","Data":"ba315b549a98ce5b0a248e4759218c94d1b62b221e80b94be11c15c1a0c65bd8"} Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.533824 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.599787 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-job-config-data\") pod \"32d98593-3990-4191-8867-efe09b71260f\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.599996 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t7ll\" (UniqueName: \"kubernetes.io/projected/32d98593-3990-4191-8867-efe09b71260f-kube-api-access-5t7ll\") pod \"32d98593-3990-4191-8867-efe09b71260f\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.600387 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-config-data\") pod \"32d98593-3990-4191-8867-efe09b71260f\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.600573 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-combined-ca-bundle\") pod \"32d98593-3990-4191-8867-efe09b71260f\" (UID: \"32d98593-3990-4191-8867-efe09b71260f\") " Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.605963 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d98593-3990-4191-8867-efe09b71260f-kube-api-access-5t7ll" (OuterVolumeSpecName: "kube-api-access-5t7ll") pod "32d98593-3990-4191-8867-efe09b71260f" (UID: "32d98593-3990-4191-8867-efe09b71260f"). InnerVolumeSpecName "kube-api-access-5t7ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.606635 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "32d98593-3990-4191-8867-efe09b71260f" (UID: "32d98593-3990-4191-8867-efe09b71260f"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.609292 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-config-data" (OuterVolumeSpecName: "config-data") pod "32d98593-3990-4191-8867-efe09b71260f" (UID: "32d98593-3990-4191-8867-efe09b71260f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.628726 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32d98593-3990-4191-8867-efe09b71260f" (UID: "32d98593-3990-4191-8867-efe09b71260f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.703300 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.703348 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.703367 4939 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32d98593-3990-4191-8867-efe09b71260f-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:12 crc kubenswrapper[4939]: I0318 17:23:12.703388 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t7ll\" (UniqueName: \"kubernetes.io/projected/32d98593-3990-4191-8867-efe09b71260f-kube-api-access-5t7ll\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.010835 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-cfzqj" event={"ID":"32d98593-3990-4191-8867-efe09b71260f","Type":"ContainerDied","Data":"1c6e4f57bc3f20da9660f1b52b032d40ad75f7e8be4f258e5c6d2bb8a0f74b03"} Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.011063 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c6e4f57bc3f20da9660f1b52b032d40ad75f7e8be4f258e5c6d2bb8a0f74b03" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.010990 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-cfzqj" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.359737 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 17:23:13 crc kubenswrapper[4939]: E0318 17:23:13.360220 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d98593-3990-4191-8867-efe09b71260f" containerName="manila-db-sync" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.360238 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d98593-3990-4191-8867-efe09b71260f" containerName="manila-db-sync" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.360455 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d98593-3990-4191-8867-efe09b71260f" containerName="manila-db-sync" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.365141 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.371750 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-jdwlx" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.371815 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.372013 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.372024 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.382685 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.404125 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.404240 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.410867 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418463 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/655b8ea9-b797-4292-a872-7c1db39b5dac-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418542 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-scripts\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418623 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0d03e659-67b2-44e9-aebd-4fdaac7f806b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418651 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418716 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418770 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/0d03e659-67b2-44e9-aebd-4fdaac7f806b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418827 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6gc\" (UniqueName: \"kubernetes.io/projected/0d03e659-67b2-44e9-aebd-4fdaac7f806b-kube-api-access-8v6gc\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418870 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-scripts\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418890 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-config-data\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418943 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hjm5\" (UniqueName: \"kubernetes.io/projected/655b8ea9-b797-4292-a872-7c1db39b5dac-kube-api-access-8hjm5\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.418977 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.419047 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-config-data\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.419095 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d03e659-67b2-44e9-aebd-4fdaac7f806b-ceph\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.419114 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.429589 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.456328 4939 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-676f74989c-ckpzv"] Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.459635 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.491974 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-ckpzv"] Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521033 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d03e659-67b2-44e9-aebd-4fdaac7f806b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521102 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6gc\" (UniqueName: \"kubernetes.io/projected/0d03e659-67b2-44e9-aebd-4fdaac7f806b-kube-api-access-8v6gc\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521139 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-scripts\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521160 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-config-data\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521194 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d03e659-67b2-44e9-aebd-4fdaac7f806b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521205 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hjm5\" (UniqueName: \"kubernetes.io/projected/655b8ea9-b797-4292-a872-7c1db39b5dac-kube-api-access-8hjm5\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521483 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521528 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-nb\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521610 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-sb\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521655 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-config-data\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521690 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-dns-svc\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521746 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d03e659-67b2-44e9-aebd-4fdaac7f806b-ceph\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521767 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521837 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/655b8ea9-b797-4292-a872-7c1db39b5dac-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521862 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521878 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-scripts\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521925 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0d03e659-67b2-44e9-aebd-4fdaac7f806b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.521942 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.522014 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8v9\" (UniqueName: \"kubernetes.io/projected/bc2f2aa7-6094-47fb-9504-08beffece194-kube-api-access-9v8v9\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.522039 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.524633 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/655b8ea9-b797-4292-a872-7c1db39b5dac-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.524817 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0d03e659-67b2-44e9-aebd-4fdaac7f806b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.528798 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-config-data\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.528805 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-config-data\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.529037 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.530004 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d03e659-67b2-44e9-aebd-4fdaac7f806b-ceph\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.530862 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-scripts\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 
17:23:13.530912 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.533276 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-scripts\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.533767 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655b8ea9-b797-4292-a872-7c1db39b5dac-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.540199 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6gc\" (UniqueName: \"kubernetes.io/projected/0d03e659-67b2-44e9-aebd-4fdaac7f806b-kube-api-access-8v6gc\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.543954 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hjm5\" (UniqueName: \"kubernetes.io/projected/655b8ea9-b797-4292-a872-7c1db39b5dac-kube-api-access-8hjm5\") pod \"manila-scheduler-0\" (UID: \"655b8ea9-b797-4292-a872-7c1db39b5dac\") " pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.547032 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d03e659-67b2-44e9-aebd-4fdaac7f806b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0d03e659-67b2-44e9-aebd-4fdaac7f806b\") " pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.624873 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-nb\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.625467 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-sb\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.625545 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-dns-svc\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.625636 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.625713 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8v9\" (UniqueName: \"kubernetes.io/projected/bc2f2aa7-6094-47fb-9504-08beffece194-kube-api-access-9v8v9\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.627405 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-nb\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.630020 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-dns-svc\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.630661 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.631575 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-sb\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.657580 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.659200 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.661983 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.666214 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8v9\" (UniqueName: \"kubernetes.io/projected/bc2f2aa7-6094-47fb-9504-08beffece194-kube-api-access-9v8v9\") pod \"dnsmasq-dns-676f74989c-ckpzv\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.685837 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.711753 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.727848 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf99p\" (UniqueName: \"kubernetes.io/projected/ae01d133-287c-44ba-bdc8-76484241c92b-kube-api-access-xf99p\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.727910 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae01d133-287c-44ba-bdc8-76484241c92b-logs\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.728025 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-scripts\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.728061 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae01d133-287c-44ba-bdc8-76484241c92b-etc-machine-id\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.728078 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-config-data\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.728132 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-config-data-custom\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.728156 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.740642 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.798426 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.829865 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-config-data-custom\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.829905 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.829967 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf99p\" (UniqueName: \"kubernetes.io/projected/ae01d133-287c-44ba-bdc8-76484241c92b-kube-api-access-xf99p\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.830003 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae01d133-287c-44ba-bdc8-76484241c92b-logs\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.830059 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-scripts\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.830209 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae01d133-287c-44ba-bdc8-76484241c92b-etc-machine-id\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.830228 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-config-data\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.832290 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae01d133-287c-44ba-bdc8-76484241c92b-logs\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.832445 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae01d133-287c-44ba-bdc8-76484241c92b-etc-machine-id\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.835010 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-config-data-custom\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " 
pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.836724 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.836891 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-config-data\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.842114 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae01d133-287c-44ba-bdc8-76484241c92b-scripts\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:13 crc kubenswrapper[4939]: I0318 17:23:13.857908 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf99p\" (UniqueName: \"kubernetes.io/projected/ae01d133-287c-44ba-bdc8-76484241c92b-kube-api-access-xf99p\") pod \"manila-api-0\" (UID: \"ae01d133-287c-44ba-bdc8-76484241c92b\") " pod="openstack/manila-api-0" Mar 18 17:23:14 crc kubenswrapper[4939]: I0318 17:23:14.012034 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 17:23:14 crc kubenswrapper[4939]: I0318 17:23:14.474213 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-ckpzv"] Mar 18 17:23:14 crc kubenswrapper[4939]: I0318 17:23:14.491847 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 17:23:14 crc kubenswrapper[4939]: I0318 17:23:14.819079 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 17:23:14 crc kubenswrapper[4939]: W0318 17:23:14.840116 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d03e659_67b2_44e9_aebd_4fdaac7f806b.slice/crio-2307d9934ba27a37165ee38def618d0fb153e78d4eee66abbc282951c58d7ac0 WatchSource:0}: Error finding container 2307d9934ba27a37165ee38def618d0fb153e78d4eee66abbc282951c58d7ac0: Status 404 returned error can't find the container with id 2307d9934ba27a37165ee38def618d0fb153e78d4eee66abbc282951c58d7ac0 Mar 18 17:23:15 crc kubenswrapper[4939]: I0318 17:23:15.018876 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 17:23:15 crc kubenswrapper[4939]: I0318 17:23:15.038321 4939 generic.go:334] "Generic (PLEG): container finished" podID="bc2f2aa7-6094-47fb-9504-08beffece194" containerID="dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121" exitCode=0 Mar 18 17:23:15 crc kubenswrapper[4939]: I0318 17:23:15.038402 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" event={"ID":"bc2f2aa7-6094-47fb-9504-08beffece194","Type":"ContainerDied","Data":"dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121"} Mar 18 17:23:15 crc kubenswrapper[4939]: I0318 17:23:15.038699 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" 
event={"ID":"bc2f2aa7-6094-47fb-9504-08beffece194","Type":"ContainerStarted","Data":"1904099cfe9ad3f87c0951ce5e9ffb17795b9ba77c56d0ebca9f6f7d56eaeb01"} Mar 18 17:23:15 crc kubenswrapper[4939]: I0318 17:23:15.044921 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"655b8ea9-b797-4292-a872-7c1db39b5dac","Type":"ContainerStarted","Data":"c1a22028bb24122c771b2e4c01293717c90c88056152b2cfbc3a20d1579de3db"} Mar 18 17:23:15 crc kubenswrapper[4939]: I0318 17:23:15.047952 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0d03e659-67b2-44e9-aebd-4fdaac7f806b","Type":"ContainerStarted","Data":"2307d9934ba27a37165ee38def618d0fb153e78d4eee66abbc282951c58d7ac0"} Mar 18 17:23:16 crc kubenswrapper[4939]: I0318 17:23:16.061911 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ae01d133-287c-44ba-bdc8-76484241c92b","Type":"ContainerStarted","Data":"98804583ec2857f0f79374f640717c218111896d9439aede52cbd8f185b0d30e"} Mar 18 17:23:16 crc kubenswrapper[4939]: I0318 17:23:16.062313 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ae01d133-287c-44ba-bdc8-76484241c92b","Type":"ContainerStarted","Data":"63e900284441e1c832be0ed2f9fe58fe1285829c32e675c4de6a95d9c07f6022"} Mar 18 17:23:16 crc kubenswrapper[4939]: I0318 17:23:16.067463 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" event={"ID":"bc2f2aa7-6094-47fb-9504-08beffece194","Type":"ContainerStarted","Data":"c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f"} Mar 18 17:23:16 crc kubenswrapper[4939]: I0318 17:23:16.067731 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:16 crc kubenswrapper[4939]: I0318 17:23:16.072133 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"655b8ea9-b797-4292-a872-7c1db39b5dac","Type":"ContainerStarted","Data":"24c2b4b6fb89f39ff23157553f9d0614ac5d7243ed7f1093d252351fa4caea17"} Mar 18 17:23:16 crc kubenswrapper[4939]: I0318 17:23:16.093109 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" podStartSLOduration=3.093092149 podStartE2EDuration="3.093092149s" podCreationTimestamp="2026-03-18 17:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:23:16.089361273 +0000 UTC m=+6360.688548894" watchObservedRunningTime="2026-03-18 17:23:16.093092149 +0000 UTC m=+6360.692279760" Mar 18 17:23:17 crc kubenswrapper[4939]: I0318 17:23:17.097962 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ae01d133-287c-44ba-bdc8-76484241c92b","Type":"ContainerStarted","Data":"0c6346409a8b6dae935cb7b6b768c6ae9a3c09cec59b0792f049f9046ca1a7dc"} Mar 18 17:23:17 crc kubenswrapper[4939]: I0318 17:23:17.098683 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 18 17:23:17 crc kubenswrapper[4939]: I0318 17:23:17.101761 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"655b8ea9-b797-4292-a872-7c1db39b5dac","Type":"ContainerStarted","Data":"c53ea1f75e96eb7c293056f01abbdd5f23d4195819f96f782acd8589b4ea9f1e"} Mar 18 17:23:17 crc kubenswrapper[4939]: I0318 
Mar 18 17:23:17 crc kubenswrapper[4939]: I0318 17:23:17.204077 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.567520082 podStartE2EDuration="4.204043777s" podCreationTimestamp="2026-03-18 17:23:13 +0000 UTC" firstStartedPulling="2026-03-18 17:23:14.504069766 +0000 UTC m=+6359.103257387" lastFinishedPulling="2026-03-18 17:23:15.140593461 +0000 UTC m=+6359.739781082" observedRunningTime="2026-03-18 17:23:17.176945947 +0000 UTC m=+6361.776133578" watchObservedRunningTime="2026-03-18 17:23:17.204043777 +0000 UTC m=+6361.803231408"
Mar 18 17:23:18 crc kubenswrapper[4939]: I0318 17:23:18.146665 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 17:23:22 crc kubenswrapper[4939]: I0318 17:23:22.156744 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0d03e659-67b2-44e9-aebd-4fdaac7f806b","Type":"ContainerStarted","Data":"83b7cbf20988c87a7fc405f977b27582b8de80fd0da3cea6414cc821fb63ff4b"}
Mar 18 17:23:22 crc kubenswrapper[4939]: I0318 17:23:22.157222 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0d03e659-67b2-44e9-aebd-4fdaac7f806b","Type":"ContainerStarted","Data":"7799b9905e6213b5cdf5b14682e0680883a921dcc8995a5026ff5a0bc793f61d"}
Mar 18 17:23:22 crc kubenswrapper[4939]: I0318 17:23:22.193250 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.110464661 podStartE2EDuration="9.193224821s" podCreationTimestamp="2026-03-18 17:23:13 +0000 UTC" firstStartedPulling="2026-03-18 17:23:14.847672983 +0000 UTC m=+6359.446860604" lastFinishedPulling="2026-03-18 17:23:20.930433133 +0000 UTC m=+6365.529620764" observedRunningTime="2026-03-18 17:23:22.179192913 +0000 UTC m=+6366.778380534" watchObservedRunningTime="2026-03-18 17:23:22.193224821 +0000 UTC m=+6366.792412442"
Mar 18 17:23:23 crc kubenswrapper[4939]: I0318 17:23:23.687154 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 17:23:23 crc kubenswrapper[4939]: I0318 17:23:23.687454 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 17:23:23 crc kubenswrapper[4939]: I0318 17:23:23.711981 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Mar 18 17:23:23 crc kubenswrapper[4939]: I0318 17:23:23.741919 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 18 17:23:23 crc kubenswrapper[4939]: I0318 17:23:23.800725 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:23:23 crc kubenswrapper[4939]: I0318 17:23:23.894191 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"] Mar 18 17:23:23 crc kubenswrapper[4939]: I0318 17:23:23.894401 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" podUID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerName="dnsmasq-dns" containerID="cri-o://90970a2b79bba1907a80f9ebf7afc08ed943ddc14e5ac38ae20895156e686ea8" gracePeriod=10 Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.050530 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xtc45"] Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.062868 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3146-account-create-update-4r9mf"] Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.071619 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xtc45"] Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.086032 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3146-account-create-update-4r9mf"] Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.156352 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7d0043-4f48-4076-a59b-eaaa2bf6006e" path="/var/lib/kubelet/pods/ba7d0043-4f48-4076-a59b-eaaa2bf6006e/volumes" Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.158912 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68a3156-e5ae-4a0f-94b3-93c16bde7cca" path="/var/lib/kubelet/pods/f68a3156-e5ae-4a0f-94b3-93c16bde7cca/volumes" Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.180329 4939 generic.go:334] "Generic (PLEG): container finished" podID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerID="90970a2b79bba1907a80f9ebf7afc08ed943ddc14e5ac38ae20895156e686ea8" exitCode=0 Mar 18 17:23:24 crc kubenswrapper[4939]: I0318 17:23:24.180687 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" event={"ID":"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430","Type":"ContainerDied","Data":"90970a2b79bba1907a80f9ebf7afc08ed943ddc14e5ac38ae20895156e686ea8"} Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.078172 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.192775 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-bgw4m" event={"ID":"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430","Type":"ContainerDied","Data":"94e6d1935c84bfda1a36aecca5de8566c5389a6029bc43206500c56d35e56f73"} Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.193041 4939 scope.go:117] "RemoveContainer" containerID="90970a2b79bba1907a80f9ebf7afc08ed943ddc14e5ac38ae20895156e686ea8" Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.192852 4939 util.go:48] "No ready sandbox for pod can be found. 
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.244122 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-nb\") pod \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") "
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.244201 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ddfm\" (UniqueName: \"kubernetes.io/projected/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-kube-api-access-6ddfm\") pod \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") "
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.244342 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-sb\") pod \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") "
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.244379 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-dns-svc\") pod \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") "
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.244433 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config\") pod \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") "
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.345186 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-kube-api-access-6ddfm" (OuterVolumeSpecName: "kube-api-access-6ddfm") pod "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" (UID: "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430"). InnerVolumeSpecName "kube-api-access-6ddfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.361070 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ddfm\" (UniqueName: \"kubernetes.io/projected/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-kube-api-access-6ddfm\") on node \"crc\" DevicePath \"\""
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.366922 4939 scope.go:117] "RemoveContainer" containerID="e99f588261f57c808a19354a45cb0b37dc480d3b7d3686ca738acd776f7504a1"
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.391171 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" (UID: "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.404738 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" (UID: "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:23:25 crc kubenswrapper[4939]: E0318 17:23:25.409809 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config podName:ca60e1d5-7c89-4ffa-8a7b-1e86847f5430 nodeName:}" failed. No retries permitted until 2026-03-18 17:23:25.90977692 +0000 UTC m=+6370.508964541 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config") pod "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" (UID: "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430") : error deleting /var/lib/kubelet/pods/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430/volume-subpaths: remove /var/lib/kubelet/pods/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430/volume-subpaths: no such file or directory Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.410068 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" (UID: "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.462897 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.462925 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.462934 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.972831 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config\") pod \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\" (UID: \"ca60e1d5-7c89-4ffa-8a7b-1e86847f5430\") " Mar 18 17:23:25 crc kubenswrapper[4939]: I0318 17:23:25.976921 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config" (OuterVolumeSpecName: "config") pod "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" (UID: "ca60e1d5-7c89-4ffa-8a7b-1e86847f5430"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.078562 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.126407 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"] Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.149989 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-bgw4m"] Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.770456 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.770801 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-central-agent" containerID="cri-o://e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989" gracePeriod=30 Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.770861 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="proxy-httpd" containerID="cri-o://e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e" gracePeriod=30 Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.770940 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-notification-agent" containerID="cri-o://725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8" gracePeriod=30 Mar 18 17:23:26 crc kubenswrapper[4939]: I0318 17:23:26.770916 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="sg-core" containerID="cri-o://fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2" gracePeriod=30 Mar 18 17:23:27 crc kubenswrapper[4939]: I0318 17:23:27.227202 4939 generic.go:334] "Generic (PLEG): container finished" podID="4acc0d59-609a-41f1-a182-861ac1f13197" containerID="e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e" exitCode=0 Mar 18 17:23:27 crc kubenswrapper[4939]: I0318 17:23:27.227492 4939 generic.go:334] "Generic (PLEG): container finished" podID="4acc0d59-609a-41f1-a182-861ac1f13197" containerID="fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2" exitCode=2 Mar 18 17:23:27 crc kubenswrapper[4939]: I0318 17:23:27.227517 4939 generic.go:334] "Generic (PLEG): container finished" podID="4acc0d59-609a-41f1-a182-861ac1f13197" containerID="e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989" exitCode=0 Mar 18 17:23:27 crc kubenswrapper[4939]: I0318 17:23:27.227551 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerDied","Data":"e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e"} Mar 18 17:23:27 crc kubenswrapper[4939]: I0318 17:23:27.227575 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerDied","Data":"fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2"} Mar 18 
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.114416 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.146144 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" path="/var/lib/kubelet/pods/ca60e1d5-7c89-4ffa-8a7b-1e86847f5430/volumes"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.219067 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-combined-ca-bundle\") pod \"4acc0d59-609a-41f1-a182-861ac1f13197\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") "
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.219133 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-run-httpd\") pod \"4acc0d59-609a-41f1-a182-861ac1f13197\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") "
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.219212 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-scripts\") pod \"4acc0d59-609a-41f1-a182-861ac1f13197\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") "
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.219231 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-log-httpd\") pod \"4acc0d59-609a-41f1-a182-861ac1f13197\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") "
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.219279 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-sg-core-conf-yaml\") pod \"4acc0d59-609a-41f1-a182-861ac1f13197\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") "
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.219348 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-config-data\") pod \"4acc0d59-609a-41f1-a182-861ac1f13197\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") "
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.219605 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4acc0d59-609a-41f1-a182-861ac1f13197" (UID: "4acc0d59-609a-41f1-a182-861ac1f13197"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.220110 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4258\" (UniqueName: \"kubernetes.io/projected/4acc0d59-609a-41f1-a182-861ac1f13197-kube-api-access-z4258\") pod \"4acc0d59-609a-41f1-a182-861ac1f13197\" (UID: \"4acc0d59-609a-41f1-a182-861ac1f13197\") " Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.220120 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4acc0d59-609a-41f1-a182-861ac1f13197" (UID: "4acc0d59-609a-41f1-a182-861ac1f13197"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.220689 4939 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.220708 4939 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4acc0d59-609a-41f1-a182-861ac1f13197-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.226655 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-scripts" (OuterVolumeSpecName: "scripts") pod "4acc0d59-609a-41f1-a182-861ac1f13197" (UID: "4acc0d59-609a-41f1-a182-861ac1f13197"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.233850 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4acc0d59-609a-41f1-a182-861ac1f13197-kube-api-access-z4258" (OuterVolumeSpecName: "kube-api-access-z4258") pod "4acc0d59-609a-41f1-a182-861ac1f13197" (UID: "4acc0d59-609a-41f1-a182-861ac1f13197"). InnerVolumeSpecName "kube-api-access-z4258". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.250448 4939 generic.go:334] "Generic (PLEG): container finished" podID="4acc0d59-609a-41f1-a182-861ac1f13197" containerID="725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8" exitCode=0 Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.250586 4939 util.go:48] "No ready sandbox for pod can be found. 
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.250556 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerDied","Data":"725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8"}
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.250936 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4acc0d59-609a-41f1-a182-861ac1f13197","Type":"ContainerDied","Data":"a2577130478fbab275e67ce85d319d7e6b6af984ab676f778b12624bd43af09f"}
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.250965 4939 scope.go:117] "RemoveContainer" containerID="e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.274690 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4acc0d59-609a-41f1-a182-861ac1f13197" (UID: "4acc0d59-609a-41f1-a182-861ac1f13197"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.316713 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4acc0d59-609a-41f1-a182-861ac1f13197" (UID: "4acc0d59-609a-41f1-a182-861ac1f13197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.322686 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4258\" (UniqueName: \"kubernetes.io/projected/4acc0d59-609a-41f1-a182-861ac1f13197-kube-api-access-z4258\") on node \"crc\" DevicePath \"\""
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.325039 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.325068 4939 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.325078 4939 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.344360 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-config-data" (OuterVolumeSpecName: "config-data") pod "4acc0d59-609a-41f1-a182-861ac1f13197" (UID: "4acc0d59-609a-41f1-a182-861ac1f13197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.383753 4939 scope.go:117] "RemoveContainer" containerID="fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.406106 4939 scope.go:117] "RemoveContainer" containerID="725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.423969 4939 scope.go:117] "RemoveContainer" containerID="e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.425728 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4acc0d59-609a-41f1-a182-861ac1f13197-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.451712 4939 scope.go:117] "RemoveContainer" containerID="e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e" Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.452905 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e\": container with ID starting with e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e not found: ID does not exist" containerID="e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.452961 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e"} err="failed to get container status \"e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e\": rpc error: code = NotFound desc = could not find container \"e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e\": container with ID starting with e6593f66fbc8fa9612baf0123e645dfbc9c070a36321a198fceaf687064e632e not found: ID does not exist" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.452993 4939 scope.go:117] "RemoveContainer" containerID="fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2" Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.453468 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2\": container with ID starting with fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2 not found: ID does not exist" containerID="fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.453498 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2"} err="failed to get container status \"fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2\": rpc error: code = NotFound desc = could not find container \"fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2\": container with ID starting with fda7254e70e377c37dca4979a87a93d74531d2d03aa438c5f4a3b88e1c0e13a2 not found: ID does not exist" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.453592 4939 scope.go:117] "RemoveContainer" containerID="725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8" Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.454165 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8"} err="failed to get container status \"725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8\": rpc error: code = NotFound desc = could not find container \"725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8\": container with ID starting with 725422e6d401a8c9e67f2ed095915c9b0357c02cffc43fe6615270b22ecb11e8 not found: ID does not exist"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.454192 4939 scope.go:117] "RemoveContainer" containerID="e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989"
Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.454443 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989\": container with ID starting with e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989 not found: ID does not exist" containerID="e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.454475 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989"} err="failed to get container status \"e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989\": rpc error: code = NotFound desc = could not find container \"e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989\": container with ID starting with e726b7a5323432e491a93e8036820af368000f5a566ec231c131d83987109989 not found: ID does not exist"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.584482 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.596986 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.627683 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.628291 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-notification-agent"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628319 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-notification-agent"
Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.628378 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerName="dnsmasq-dns"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628389 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerName="dnsmasq-dns"
Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.628404 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="sg-core"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628412 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="sg-core"
Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.628431 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="proxy-httpd"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628441 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="proxy-httpd"
Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.628456 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerName="init"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628464 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerName="init"
Mar 18 17:23:28 crc kubenswrapper[4939]: E0318 17:23:28.628496 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-central-agent"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628524 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-central-agent"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628777 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="proxy-httpd"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628798 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-central-agent"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628809 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="sg-core"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628836 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" containerName="ceilometer-notification-agent"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.628850 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca60e1d5-7c89-4ffa-8a7b-1e86847f5430" containerName="dnsmasq-dns"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.631306 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.634226 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.634234 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.649196 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.837653 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-scripts\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.837753 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-config-data\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.838761 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-log-httpd\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.838922 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.839031 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-run-httpd\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.839142 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.839293 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p627h\" (UniqueName: \"kubernetes.io/projected/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-kube-api-access-p627h\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.942069 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p627h\" (UniqueName: \"kubernetes.io/projected/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-kube-api-access-p627h\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.942543 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-scripts\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.942688 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-config-data\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.942800 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-log-httpd\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.942910 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.942999 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-run-httpd\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.943093 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.943893 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-run-httpd\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.943934 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-log-httpd\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.947719 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.947878 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-scripts\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.948111 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-config-data\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0"
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-config-data\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.948584 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0" Mar 18 17:23:28 crc kubenswrapper[4939]: I0318 17:23:28.965774 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p627h\" (UniqueName: \"kubernetes.io/projected/1ca0fea8-0244-4bff-b58f-6073f0ab19cc-kube-api-access-p627h\") pod \"ceilometer-0\" (UID: \"1ca0fea8-0244-4bff-b58f-6073f0ab19cc\") " pod="openstack/ceilometer-0" Mar 18 17:23:29 crc kubenswrapper[4939]: I0318 17:23:29.262226 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 17:23:29 crc kubenswrapper[4939]: W0318 17:23:29.775018 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca0fea8_0244_4bff_b58f_6073f0ab19cc.slice/crio-8143f1b502239adb692da2e3f7233565553db141459263d8479102d66c7898bf WatchSource:0}: Error finding container 8143f1b502239adb692da2e3f7233565553db141459263d8479102d66c7898bf: Status 404 returned error can't find the container with id 8143f1b502239adb692da2e3f7233565553db141459263d8479102d66c7898bf Mar 18 17:23:29 crc kubenswrapper[4939]: I0318 17:23:29.775919 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 17:23:30 crc kubenswrapper[4939]: I0318 17:23:30.144481 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4acc0d59-609a-41f1-a182-861ac1f13197" path="/var/lib/kubelet/pods/4acc0d59-609a-41f1-a182-861ac1f13197/volumes" Mar 18 17:23:30 crc kubenswrapper[4939]: I0318 17:23:30.271726 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca0fea8-0244-4bff-b58f-6073f0ab19cc","Type":"ContainerStarted","Data":"8143f1b502239adb692da2e3f7233565553db141459263d8479102d66c7898bf"} Mar 18 17:23:32 crc kubenswrapper[4939]: I0318 17:23:32.032312 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-q6dxn"] Mar 18 17:23:32 crc kubenswrapper[4939]: I0318 17:23:32.047235 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-q6dxn"] Mar 18 17:23:32 crc kubenswrapper[4939]: I0318 17:23:32.146099 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f97cae-5340-48f6-9a5c-663d16f1746b" path="/var/lib/kubelet/pods/e7f97cae-5340-48f6-9a5c-663d16f1746b/volumes" Mar 18 17:23:32 crc kubenswrapper[4939]: I0318 17:23:32.290821 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca0fea8-0244-4bff-b58f-6073f0ab19cc","Type":"ContainerStarted","Data":"759fa98102bb05479d950603db35750f2383e3f2aa0c0870a329a51fb1a804fc"} Mar 18 17:23:33 crc kubenswrapper[4939]: I0318 17:23:33.304587 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca0fea8-0244-4bff-b58f-6073f0ab19cc","Type":"ContainerStarted","Data":"40b817491496bbbfb74c584e91354b3f9be13d996f58f78a327902ed50f64d02"} Mar 18 17:23:34 crc kubenswrapper[4939]: I0318 
Mar 18 17:23:35 crc kubenswrapper[4939]: I0318 17:23:35.413673 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Mar 18 17:23:35 crc kubenswrapper[4939]: I0318 17:23:35.552457 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Mar 18 17:23:35 crc kubenswrapper[4939]: I0318 17:23:35.733440 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Mar 18 17:23:37 crc kubenswrapper[4939]: I0318 17:23:37.382816 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca0fea8-0244-4bff-b58f-6073f0ab19cc","Type":"ContainerStarted","Data":"dcee2283ed52a4b23b06da2d46cb102666804a5508b00c6a5790e0e9df2530e9"}
Mar 18 17:23:37 crc kubenswrapper[4939]: I0318 17:23:37.383329 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 17:23:37 crc kubenswrapper[4939]: I0318 17:23:37.402906 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.623190494 podStartE2EDuration="9.402888324s" podCreationTimestamp="2026-03-18 17:23:28 +0000 UTC" firstStartedPulling="2026-03-18 17:23:29.778635151 +0000 UTC m=+6374.377822762" lastFinishedPulling="2026-03-18 17:23:36.558332971 +0000 UTC m=+6381.157520592" observedRunningTime="2026-03-18 17:23:37.401255667 +0000 UTC m=+6382.000443288" watchObservedRunningTime="2026-03-18 17:23:37.402888324 +0000 UTC m=+6382.002075945"
Mar 18 17:23:53 crc kubenswrapper[4939]: I0318 17:23:53.687467 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 17:23:53 crc kubenswrapper[4939]: I0318 17:23:53.688355 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 17:23:59 crc kubenswrapper[4939]: I0318 17:23:59.267643 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.147084 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564244-kgbhm"]
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.148866 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-kgbhm"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.150634 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.151098 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.151682 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.155791 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564244-kgbhm"]
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.164967 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjr6\" (UniqueName: \"kubernetes.io/projected/89c090d7-9f74-40fa-8ec5-e0364fce49ae-kube-api-access-ctjr6\") pod \"auto-csr-approver-29564244-kgbhm\" (UID: \"89c090d7-9f74-40fa-8ec5-e0364fce49ae\") " pod="openshift-infra/auto-csr-approver-29564244-kgbhm"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.266716 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjr6\" (UniqueName: \"kubernetes.io/projected/89c090d7-9f74-40fa-8ec5-e0364fce49ae-kube-api-access-ctjr6\") pod \"auto-csr-approver-29564244-kgbhm\" (UID: \"89c090d7-9f74-40fa-8ec5-e0364fce49ae\") " pod="openshift-infra/auto-csr-approver-29564244-kgbhm"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.302406 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjr6\" (UniqueName: \"kubernetes.io/projected/89c090d7-9f74-40fa-8ec5-e0364fce49ae-kube-api-access-ctjr6\") pod \"auto-csr-approver-29564244-kgbhm\" (UID: \"89c090d7-9f74-40fa-8ec5-e0364fce49ae\") " pod="openshift-infra/auto-csr-approver-29564244-kgbhm"
Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.474663 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-kgbhm"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" Mar 18 17:24:00 crc kubenswrapper[4939]: I0318 17:24:00.966397 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564244-kgbhm"] Mar 18 17:24:01 crc kubenswrapper[4939]: I0318 17:24:01.632285 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" event={"ID":"89c090d7-9f74-40fa-8ec5-e0364fce49ae","Type":"ContainerStarted","Data":"90df7f90d7d3c45658f8c5f245937c1dfa8c5926bd7a6b1f572662f0e4b937dc"} Mar 18 17:24:02 crc kubenswrapper[4939]: I0318 17:24:02.641906 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" event={"ID":"89c090d7-9f74-40fa-8ec5-e0364fce49ae","Type":"ContainerStarted","Data":"fa0c1678f09a5f9b844e59b9283456e7a6118df5773990a4ec5f89f1859508d0"} Mar 18 17:24:02 crc kubenswrapper[4939]: I0318 17:24:02.658517 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" podStartSLOduration=1.631550206 podStartE2EDuration="2.658490568s" podCreationTimestamp="2026-03-18 17:24:00 +0000 UTC" firstStartedPulling="2026-03-18 17:24:00.965168253 +0000 UTC m=+6405.564355874" lastFinishedPulling="2026-03-18 17:24:01.992108615 +0000 UTC m=+6406.591296236" observedRunningTime="2026-03-18 17:24:02.654123454 +0000 UTC m=+6407.253311095" watchObservedRunningTime="2026-03-18 17:24:02.658490568 +0000 UTC m=+6407.257678189" Mar 18 17:24:04 crc kubenswrapper[4939]: I0318 17:24:04.663805 4939 generic.go:334] "Generic (PLEG): container finished" podID="89c090d7-9f74-40fa-8ec5-e0364fce49ae" containerID="fa0c1678f09a5f9b844e59b9283456e7a6118df5773990a4ec5f89f1859508d0" exitCode=0 Mar 18 17:24:04 crc kubenswrapper[4939]: I0318 17:24:04.663933 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" event={"ID":"89c090d7-9f74-40fa-8ec5-e0364fce49ae","Type":"ContainerDied","Data":"fa0c1678f09a5f9b844e59b9283456e7a6118df5773990a4ec5f89f1859508d0"} Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.046348 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.061635 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctjr6\" (UniqueName: \"kubernetes.io/projected/89c090d7-9f74-40fa-8ec5-e0364fce49ae-kube-api-access-ctjr6\") pod \"89c090d7-9f74-40fa-8ec5-e0364fce49ae\" (UID: \"89c090d7-9f74-40fa-8ec5-e0364fce49ae\") " Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.347032 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c090d7-9f74-40fa-8ec5-e0364fce49ae-kube-api-access-ctjr6" (OuterVolumeSpecName: "kube-api-access-ctjr6") pod "89c090d7-9f74-40fa-8ec5-e0364fce49ae" (UID: "89c090d7-9f74-40fa-8ec5-e0364fce49ae"). InnerVolumeSpecName "kube-api-access-ctjr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.350101 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctjr6\" (UniqueName: \"kubernetes.io/projected/89c090d7-9f74-40fa-8ec5-e0364fce49ae-kube-api-access-ctjr6\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.695628 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" event={"ID":"89c090d7-9f74-40fa-8ec5-e0364fce49ae","Type":"ContainerDied","Data":"90df7f90d7d3c45658f8c5f245937c1dfa8c5926bd7a6b1f572662f0e4b937dc"} Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.695987 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90df7f90d7d3c45658f8c5f245937c1dfa8c5926bd7a6b1f572662f0e4b937dc" Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.695739 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-kgbhm" Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.744013 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-5t7v4"] Mar 18 17:24:06 crc kubenswrapper[4939]: I0318 17:24:06.753589 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-5t7v4"] Mar 18 17:24:08 crc kubenswrapper[4939]: I0318 17:24:08.085243 4939 scope.go:117] "RemoveContainer" containerID="1bfb17ad01275f9d0be2fbc08c33ce556e0d5c10b5c83ec2f38e1350cefc5665" Mar 18 17:24:08 crc kubenswrapper[4939]: I0318 17:24:08.113724 4939 scope.go:117] "RemoveContainer" containerID="ae6ad071f58af9dd704f774ce0a135cfeeda0778afb73956f6d111665323604e" Mar 18 17:24:08 crc kubenswrapper[4939]: I0318 17:24:08.146632 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a51ae8-f644-40f8-9c42-75150f336923" path="/var/lib/kubelet/pods/d4a51ae8-f644-40f8-9c42-75150f336923/volumes" Mar 18 17:24:08 crc kubenswrapper[4939]: I0318 17:24:08.165656 4939 scope.go:117] "RemoveContainer" containerID="7923719ff9f60dcf746a168fb84140e39f7215ef92dc9c77fafc58aac43b54a9" Mar 18 17:24:08 crc kubenswrapper[4939]: I0318 17:24:08.249772 4939 scope.go:117] "RemoveContainer" containerID="a8acbdce59ccfaf1cbfb2e990b6c05577de7978407cc275a74502a1afb8379a3" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.702042 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-k7jrt"] Mar 18 17:24:15 crc kubenswrapper[4939]: E0318 17:24:15.703026 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c090d7-9f74-40fa-8ec5-e0364fce49ae" containerName="oc" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.703044 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c090d7-9f74-40fa-8ec5-e0364fce49ae" containerName="oc" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.703270 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c090d7-9f74-40fa-8ec5-e0364fce49ae" containerName="oc" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.704442 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.707978 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.713600 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-k7jrt"] Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.845663 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-dns-svc\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.845701 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-openstack-cell1\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.846134 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-745gc\" (UniqueName: \"kubernetes.io/projected/ccbdd278-7129-4cfb-8e8a-319670502a60-kube-api-access-745gc\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.846224 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-config\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.846307 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-nb\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.846360 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-sb\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.948491 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-sb\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.948647 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-dns-svc\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " 
pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.948678 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-openstack-cell1\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.948779 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-745gc\" (UniqueName: \"kubernetes.io/projected/ccbdd278-7129-4cfb-8e8a-319670502a60-kube-api-access-745gc\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.948803 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-config\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.948825 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-nb\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.949567 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-sb\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.949608 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-openstack-cell1\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.949784 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-nb\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.949912 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-dns-svc\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.949912 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-config\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:15 crc kubenswrapper[4939]: I0318 17:24:15.967586 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-745gc\" (UniqueName: \"kubernetes.io/projected/ccbdd278-7129-4cfb-8e8a-319670502a60-kube-api-access-745gc\") pod \"dnsmasq-dns-7467c5dd9-k7jrt\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:16 crc kubenswrapper[4939]: I0318 17:24:16.023592 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:16 crc kubenswrapper[4939]: W0318 17:24:16.560909 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccbdd278_7129_4cfb_8e8a_319670502a60.slice/crio-f51bea5ab9543bd5519d61b0001cde220a09658ae9bd975419f424330a1d0ac9 WatchSource:0}: Error finding container f51bea5ab9543bd5519d61b0001cde220a09658ae9bd975419f424330a1d0ac9: Status 404 returned error can't find the container with id f51bea5ab9543bd5519d61b0001cde220a09658ae9bd975419f424330a1d0ac9 Mar 18 17:24:16 crc kubenswrapper[4939]: I0318 17:24:16.564449 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-k7jrt"] Mar 18 17:24:16 crc kubenswrapper[4939]: I0318 17:24:16.823591 4939 generic.go:334] "Generic (PLEG): container finished" podID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerID="8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707" exitCode=0 Mar 18 17:24:16 crc kubenswrapper[4939]: I0318 17:24:16.823705 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" event={"ID":"ccbdd278-7129-4cfb-8e8a-319670502a60","Type":"ContainerDied","Data":"8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707"} Mar 18 17:24:16 crc kubenswrapper[4939]: I0318 17:24:16.824648 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" event={"ID":"ccbdd278-7129-4cfb-8e8a-319670502a60","Type":"ContainerStarted","Data":"f51bea5ab9543bd5519d61b0001cde220a09658ae9bd975419f424330a1d0ac9"} Mar 18 17:24:17 crc kubenswrapper[4939]: I0318 17:24:17.844309 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" event={"ID":"ccbdd278-7129-4cfb-8e8a-319670502a60","Type":"ContainerStarted","Data":"5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59"} Mar 18 17:24:17 crc kubenswrapper[4939]: I0318 17:24:17.845629 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:17 crc kubenswrapper[4939]: I0318 17:24:17.869278 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" podStartSLOduration=2.869258692 podStartE2EDuration="2.869258692s" podCreationTimestamp="2026-03-18 17:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:24:17.865157276 +0000 UTC m=+6422.464344907" watchObservedRunningTime="2026-03-18 17:24:17.869258692 +0000 UTC m=+6422.468446313" Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.687595 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.688103 4939 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.688151 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.689055 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57a41c1bc14cd97b3da450bf17c52dcdc4709996b2a45bd26d504a3261acd9e3"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.689139 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://57a41c1bc14cd97b3da450bf17c52dcdc4709996b2a45bd26d504a3261acd9e3" gracePeriod=600 Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.904013 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="57a41c1bc14cd97b3da450bf17c52dcdc4709996b2a45bd26d504a3261acd9e3" exitCode=0 Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.904109 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"57a41c1bc14cd97b3da450bf17c52dcdc4709996b2a45bd26d504a3261acd9e3"} Mar 18 17:24:23 crc kubenswrapper[4939]: I0318 17:24:23.904390 4939 scope.go:117] "RemoveContainer" containerID="5f0d466308dd942f53e7201cd4d211cae7e61d08eb4599e0dd7c1594e2a1ce4e" Mar 18 17:24:24 crc kubenswrapper[4939]: I0318 17:24:24.917207 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115"} Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.025710 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.099728 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-ckpzv"] Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.099960 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" podUID="bc2f2aa7-6094-47fb-9504-08beffece194" containerName="dnsmasq-dns" containerID="cri-o://c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f" gracePeriod=10 Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.283184 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cdb84b55c-4pml7"] Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.285217 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.357276 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdb84b55c-4pml7"] Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.396235 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.396631 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.396702 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-config\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.396726 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-openstack-cell1\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.396797 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248bf\" (UniqueName: \"kubernetes.io/projected/6071a760-6039-4785-b7a9-dc01f0558392-kube-api-access-248bf\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.396829 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-dns-svc\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.499129 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-config\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.499188 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-openstack-cell1\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.499274 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-248bf\" (UniqueName: \"kubernetes.io/projected/6071a760-6039-4785-b7a9-dc01f0558392-kube-api-access-248bf\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.499310 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-dns-svc\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.499384 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.499421 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.500245 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-config\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.500289 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-openstack-cell1\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.500365 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.500863 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.501348 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6071a760-6039-4785-b7a9-dc01f0558392-dns-svc\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.539426 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248bf\" (UniqueName: 
\"kubernetes.io/projected/6071a760-6039-4785-b7a9-dc01f0558392-kube-api-access-248bf\") pod \"dnsmasq-dns-5cdb84b55c-4pml7\" (UID: \"6071a760-6039-4785-b7a9-dc01f0558392\") " pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.620105 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.777624 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.907096 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-dns-svc\") pod \"bc2f2aa7-6094-47fb-9504-08beffece194\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.907179 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8v9\" (UniqueName: \"kubernetes.io/projected/bc2f2aa7-6094-47fb-9504-08beffece194-kube-api-access-9v8v9\") pod \"bc2f2aa7-6094-47fb-9504-08beffece194\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.907239 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-nb\") pod \"bc2f2aa7-6094-47fb-9504-08beffece194\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.907398 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-sb\") pod \"bc2f2aa7-6094-47fb-9504-08beffece194\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.907449 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config\") pod \"bc2f2aa7-6094-47fb-9504-08beffece194\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.920107 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2f2aa7-6094-47fb-9504-08beffece194-kube-api-access-9v8v9" (OuterVolumeSpecName: "kube-api-access-9v8v9") pod "bc2f2aa7-6094-47fb-9504-08beffece194" (UID: "bc2f2aa7-6094-47fb-9504-08beffece194"). InnerVolumeSpecName "kube-api-access-9v8v9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.941009 4939 generic.go:334] "Generic (PLEG): container finished" podID="bc2f2aa7-6094-47fb-9504-08beffece194" containerID="c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f" exitCode=0 Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.941091 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" event={"ID":"bc2f2aa7-6094-47fb-9504-08beffece194","Type":"ContainerDied","Data":"c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f"} Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.941126 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" event={"ID":"bc2f2aa7-6094-47fb-9504-08beffece194","Type":"ContainerDied","Data":"1904099cfe9ad3f87c0951ce5e9ffb17795b9ba77c56d0ebca9f6f7d56eaeb01"} Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.941162 4939 scope.go:117] "RemoveContainer" containerID="c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.941284 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-ckpzv" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.975729 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc2f2aa7-6094-47fb-9504-08beffece194" (UID: "bc2f2aa7-6094-47fb-9504-08beffece194"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.981317 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc2f2aa7-6094-47fb-9504-08beffece194" (UID: "bc2f2aa7-6094-47fb-9504-08beffece194"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:26 crc kubenswrapper[4939]: I0318 17:24:26.991711 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc2f2aa7-6094-47fb-9504-08beffece194" (UID: "bc2f2aa7-6094-47fb-9504-08beffece194"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.000668 4939 scope.go:117] "RemoveContainer" containerID="dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.010736 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config" (OuterVolumeSpecName: "config") pod "bc2f2aa7-6094-47fb-9504-08beffece194" (UID: "bc2f2aa7-6094-47fb-9504-08beffece194"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.012157 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config\") pod \"bc2f2aa7-6094-47fb-9504-08beffece194\" (UID: \"bc2f2aa7-6094-47fb-9504-08beffece194\") " Mar 18 17:24:27 crc kubenswrapper[4939]: W0318 17:24:27.012347 4939 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bc2f2aa7-6094-47fb-9504-08beffece194/volumes/kubernetes.io~configmap/config Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.012375 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config" (OuterVolumeSpecName: "config") pod "bc2f2aa7-6094-47fb-9504-08beffece194" (UID: "bc2f2aa7-6094-47fb-9504-08beffece194"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.012913 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.012932 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.012943 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.012951 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc2f2aa7-6094-47fb-9504-08beffece194-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.012961 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8v9\" (UniqueName: \"kubernetes.io/projected/bc2f2aa7-6094-47fb-9504-08beffece194-kube-api-access-9v8v9\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.032090 4939 scope.go:117] "RemoveContainer" containerID="c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f" Mar 18 17:24:27 crc kubenswrapper[4939]: E0318 17:24:27.032539 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f\": container with ID starting with c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f not found: ID does not exist" containerID="c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.032591 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f"} err="failed to get container status \"c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f\": rpc error: code = NotFound desc = could not find container \"c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f\": container with ID starting with 
c9873271991c41bb4dbf82200cf77a7e599d0599f22c46e7944d7732d347263f not found: ID does not exist" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.032622 4939 scope.go:117] "RemoveContainer" containerID="dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121" Mar 18 17:24:27 crc kubenswrapper[4939]: E0318 17:24:27.032898 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121\": container with ID starting with dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121 not found: ID does not exist" containerID="dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.032950 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121"} err="failed to get container status \"dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121\": rpc error: code = NotFound desc = could not find container \"dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121\": container with ID starting with dab84ee42cff207a4265fc214bdb2c5dbc6cd4055364e70aedbaae8a62594121 not found: ID does not exist" Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.190036 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdb84b55c-4pml7"] Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.375148 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-ckpzv"] Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.384173 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-ckpzv"] Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.952742 4939 generic.go:334] "Generic (PLEG): container finished" podID="6071a760-6039-4785-b7a9-dc01f0558392" containerID="27f52bd033a1a3fc8aec13d77510db15232a78c62d7f08154b0600f5485d821e" exitCode=0 Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.952821 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" event={"ID":"6071a760-6039-4785-b7a9-dc01f0558392","Type":"ContainerDied","Data":"27f52bd033a1a3fc8aec13d77510db15232a78c62d7f08154b0600f5485d821e"} Mar 18 17:24:27 crc kubenswrapper[4939]: I0318 17:24:27.953174 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" event={"ID":"6071a760-6039-4785-b7a9-dc01f0558392","Type":"ContainerStarted","Data":"0c5d553dc1d348c0408705b4e52a8b9b87561f96c05ff1f533c29fd4edcf6ed4"} Mar 18 17:24:28 crc kubenswrapper[4939]: I0318 17:24:28.150973 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2f2aa7-6094-47fb-9504-08beffece194" path="/var/lib/kubelet/pods/bc2f2aa7-6094-47fb-9504-08beffece194/volumes" Mar 18 17:24:28 crc kubenswrapper[4939]: I0318 17:24:28.966383 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" event={"ID":"6071a760-6039-4785-b7a9-dc01f0558392","Type":"ContainerStarted","Data":"ba30d199540852bba566d14407d0f0e0975fb5375964db12e797f895183a37c7"} Mar 18 17:24:28 crc kubenswrapper[4939]: I0318 17:24:28.966982 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:28 crc kubenswrapper[4939]: I0318 17:24:28.987276 4939 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" podStartSLOduration=2.9872586759999997 podStartE2EDuration="2.987258676s" podCreationTimestamp="2026-03-18 17:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:24:28.983218892 +0000 UTC m=+6433.582406523" watchObservedRunningTime="2026-03-18 17:24:28.987258676 +0000 UTC m=+6433.586446297" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.438057 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz"] Mar 18 17:24:32 crc kubenswrapper[4939]: E0318 17:24:32.438952 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2f2aa7-6094-47fb-9504-08beffece194" containerName="dnsmasq-dns" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.438965 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2f2aa7-6094-47fb-9504-08beffece194" containerName="dnsmasq-dns" Mar 18 17:24:32 crc kubenswrapper[4939]: E0318 17:24:32.438995 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2f2aa7-6094-47fb-9504-08beffece194" containerName="init" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.439001 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2f2aa7-6094-47fb-9504-08beffece194" containerName="init" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.439194 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2f2aa7-6094-47fb-9504-08beffece194" containerName="dnsmasq-dns" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.440035 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.447009 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.447166 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.447250 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.447350 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.452379 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz"] Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.543718 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.543759 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkr9d\" (UniqueName: \"kubernetes.io/projected/0aca3e76-93f3-4cee-9e08-63e1953f7e90-kube-api-access-tkr9d\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.543805 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.544341 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.544490 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.646743 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.646794 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkr9d\" (UniqueName: \"kubernetes.io/projected/0aca3e76-93f3-4cee-9e08-63e1953f7e90-kube-api-access-tkr9d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.646847 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.646974 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 
17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.647029 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.653287 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.654312 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.670232 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.670551 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.674813 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkr9d\" (UniqueName: \"kubernetes.io/projected/0aca3e76-93f3-4cee-9e08-63e1953f7e90-kube-api-access-tkr9d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:32 crc kubenswrapper[4939]: I0318 17:24:32.765319 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:24:33 crc kubenswrapper[4939]: I0318 17:24:33.316491 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz"] Mar 18 17:24:33 crc kubenswrapper[4939]: W0318 17:24:33.341276 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aca3e76_93f3_4cee_9e08_63e1953f7e90.slice/crio-74171b1ed48a6bbad8669697bf7702bb8b944e9ef8b369cab21b76f01d1a01cc WatchSource:0}: Error finding container 74171b1ed48a6bbad8669697bf7702bb8b944e9ef8b369cab21b76f01d1a01cc: Status 404 returned error can't find the container with id 74171b1ed48a6bbad8669697bf7702bb8b944e9ef8b369cab21b76f01d1a01cc Mar 18 17:24:34 crc kubenswrapper[4939]: I0318 17:24:34.011289 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" event={"ID":"0aca3e76-93f3-4cee-9e08-63e1953f7e90","Type":"ContainerStarted","Data":"74171b1ed48a6bbad8669697bf7702bb8b944e9ef8b369cab21b76f01d1a01cc"} Mar 18 17:24:36 crc kubenswrapper[4939]: I0318 17:24:36.622714 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cdb84b55c-4pml7" Mar 18 17:24:36 crc kubenswrapper[4939]: I0318 17:24:36.684867 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-k7jrt"] Mar 18 17:24:36 crc kubenswrapper[4939]: I0318 17:24:36.685149 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" podUID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerName="dnsmasq-dns" containerID="cri-o://5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59" gracePeriod=10 Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.765034 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.878320 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-openstack-cell1\") pod \"ccbdd278-7129-4cfb-8e8a-319670502a60\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.878356 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-config\") pod \"ccbdd278-7129-4cfb-8e8a-319670502a60\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.878417 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-dns-svc\") pod \"ccbdd278-7129-4cfb-8e8a-319670502a60\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.878439 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-sb\") pod \"ccbdd278-7129-4cfb-8e8a-319670502a60\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.879218 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-745gc\" (UniqueName: \"kubernetes.io/projected/ccbdd278-7129-4cfb-8e8a-319670502a60-kube-api-access-745gc\") pod \"ccbdd278-7129-4cfb-8e8a-319670502a60\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.879311 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-nb\") pod \"ccbdd278-7129-4cfb-8e8a-319670502a60\" (UID: \"ccbdd278-7129-4cfb-8e8a-319670502a60\") " Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.887000 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbdd278-7129-4cfb-8e8a-319670502a60-kube-api-access-745gc" (OuterVolumeSpecName: "kube-api-access-745gc") pod "ccbdd278-7129-4cfb-8e8a-319670502a60" (UID: "ccbdd278-7129-4cfb-8e8a-319670502a60"). InnerVolumeSpecName "kube-api-access-745gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.934610 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-config" (OuterVolumeSpecName: "config") pod "ccbdd278-7129-4cfb-8e8a-319670502a60" (UID: "ccbdd278-7129-4cfb-8e8a-319670502a60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.936535 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "ccbdd278-7129-4cfb-8e8a-319670502a60" (UID: "ccbdd278-7129-4cfb-8e8a-319670502a60"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.939048 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ccbdd278-7129-4cfb-8e8a-319670502a60" (UID: "ccbdd278-7129-4cfb-8e8a-319670502a60"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.954422 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ccbdd278-7129-4cfb-8e8a-319670502a60" (UID: "ccbdd278-7129-4cfb-8e8a-319670502a60"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.968689 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ccbdd278-7129-4cfb-8e8a-319670502a60" (UID: "ccbdd278-7129-4cfb-8e8a-319670502a60"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.982070 4939 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.982104 4939 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-config\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.982118 4939 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.982130 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.982168 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-745gc\" (UniqueName: \"kubernetes.io/projected/ccbdd278-7129-4cfb-8e8a-319670502a60-kube-api-access-745gc\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:37 crc kubenswrapper[4939]: I0318 17:24:37.982180 4939 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccbdd278-7129-4cfb-8e8a-319670502a60-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.067025 4939 generic.go:334] "Generic (PLEG): container finished" podID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerID="5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59" exitCode=0 Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.067069 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" event={"ID":"ccbdd278-7129-4cfb-8e8a-319670502a60","Type":"ContainerDied","Data":"5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59"} Mar 18 17:24:38 crc 
Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.067098 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-k7jrt" event={"ID":"ccbdd278-7129-4cfb-8e8a-319670502a60","Type":"ContainerDied","Data":"f51bea5ab9543bd5519d61b0001cde220a09658ae9bd975419f424330a1d0ac9"} Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.067122 4939 scope.go:117] "RemoveContainer" containerID="5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59" Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.117070 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-k7jrt"] Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.122544 4939 scope.go:117] "RemoveContainer" containerID="8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707" Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.129098 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-k7jrt"] Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.150365 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbdd278-7129-4cfb-8e8a-319670502a60" path="/var/lib/kubelet/pods/ccbdd278-7129-4cfb-8e8a-319670502a60/volumes" Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.152888 4939 scope.go:117] "RemoveContainer" containerID="5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59" Mar 18 17:24:38 crc kubenswrapper[4939]: E0318 17:24:38.153655 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59\": container with ID starting with 5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59 not found: ID does not exist" containerID="5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59" Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.153710 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59"} err="failed to get container status \"5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59\": rpc error: code = NotFound desc = could not find container \"5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59\": container with ID starting with 5ddb61db35c56a95abeb888d00224c1365d8f4cc054377c2a1673042ae849a59 not found: ID does not exist" Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.153742 4939 scope.go:117] "RemoveContainer" containerID="8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707" Mar 18 17:24:38 crc kubenswrapper[4939]: E0318 17:24:38.154439 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707\": container with ID starting with 8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707 not found: ID does not exist" containerID="8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707" Mar 18 17:24:38 crc kubenswrapper[4939]: I0318 17:24:38.154473 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707"} err="failed to get container
status \"8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707\": rpc error: code = NotFound desc = could not find container \"8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707\": container with ID starting with 8815f8fdcd62ea94af9d404fe48de86ce2e49e69f3af6c3f8878b96a283c9707 not found: ID does not exist" Mar 18 17:24:50 crc kubenswrapper[4939]: I0318 17:24:50.209217 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" event={"ID":"0aca3e76-93f3-4cee-9e08-63e1953f7e90","Type":"ContainerStarted","Data":"65b44ad95374dcdc8119126addcc6efc7c110f9d5aa6e003bce7ed0d0365db0c"} Mar 18 17:24:51 crc kubenswrapper[4939]: I0318 17:24:51.279213 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" podStartSLOduration=2.872351381 podStartE2EDuration="19.279197629s" podCreationTimestamp="2026-03-18 17:24:32 +0000 UTC" firstStartedPulling="2026-03-18 17:24:33.344404164 +0000 UTC m=+6437.943591785" lastFinishedPulling="2026-03-18 17:24:49.751250412 +0000 UTC m=+6454.350438033" observedRunningTime="2026-03-18 17:24:51.269832653 +0000 UTC m=+6455.869020274" watchObservedRunningTime="2026-03-18 17:24:51.279197629 +0000 UTC m=+6455.878385250" Mar 18 17:24:58 crc kubenswrapper[4939]: I0318 17:24:58.219922 4939 patch_prober.go:28] interesting pod/route-controller-manager-57f7fc8d4c-jsn6b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 17:24:58 crc kubenswrapper[4939]: I0318 17:24:58.220628 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" podUID="b474ab64-fb2c-4a3c-be42-0a2c888e60e5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 17:24:58 crc kubenswrapper[4939]: I0318 17:24:58.223124 4939 patch_prober.go:28] interesting pod/route-controller-manager-57f7fc8d4c-jsn6b container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded" start-of-body= Mar 18 17:24:58 crc kubenswrapper[4939]: I0318 17:24:58.223189 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-57f7fc8d4c-jsn6b" podUID="b474ab64-fb2c-4a3c-be42-0a2c888e60e5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded" Mar 18 17:25:08 crc kubenswrapper[4939]: I0318 17:25:08.411097 4939 generic.go:334] "Generic (PLEG): container finished" podID="0aca3e76-93f3-4cee-9e08-63e1953f7e90" containerID="65b44ad95374dcdc8119126addcc6efc7c110f9d5aa6e003bce7ed0d0365db0c" exitCode=0 Mar 18 17:25:08 crc kubenswrapper[4939]: I0318 17:25:08.411166 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" 
event={"ID":"0aca3e76-93f3-4cee-9e08-63e1953f7e90","Type":"ContainerDied","Data":"65b44ad95374dcdc8119126addcc6efc7c110f9d5aa6e003bce7ed0d0365db0c"} Mar 18 17:25:09 crc kubenswrapper[4939]: I0318 17:25:09.865179 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:09.999957 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ssh-key-openstack-cell1\") pod \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.000054 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ceph\") pod \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.000225 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-inventory\") pod \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.000316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-pre-adoption-validation-combined-ca-bundle\") pod \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.000375 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkr9d\" (UniqueName: \"kubernetes.io/projected/0aca3e76-93f3-4cee-9e08-63e1953f7e90-kube-api-access-tkr9d\") pod \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\" (UID: \"0aca3e76-93f3-4cee-9e08-63e1953f7e90\") " Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.007449 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "0aca3e76-93f3-4cee-9e08-63e1953f7e90" (UID: "0aca3e76-93f3-4cee-9e08-63e1953f7e90"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.007583 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aca3e76-93f3-4cee-9e08-63e1953f7e90-kube-api-access-tkr9d" (OuterVolumeSpecName: "kube-api-access-tkr9d") pod "0aca3e76-93f3-4cee-9e08-63e1953f7e90" (UID: "0aca3e76-93f3-4cee-9e08-63e1953f7e90"). InnerVolumeSpecName "kube-api-access-tkr9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.009684 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ceph" (OuterVolumeSpecName: "ceph") pod "0aca3e76-93f3-4cee-9e08-63e1953f7e90" (UID: "0aca3e76-93f3-4cee-9e08-63e1953f7e90"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.044538 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0aca3e76-93f3-4cee-9e08-63e1953f7e90" (UID: "0aca3e76-93f3-4cee-9e08-63e1953f7e90"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.063989 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-inventory" (OuterVolumeSpecName: "inventory") pod "0aca3e76-93f3-4cee-9e08-63e1953f7e90" (UID: "0aca3e76-93f3-4cee-9e08-63e1953f7e90"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.103909 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.103956 4939 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.103978 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkr9d\" (UniqueName: \"kubernetes.io/projected/0aca3e76-93f3-4cee-9e08-63e1953f7e90-kube-api-access-tkr9d\") on node \"crc\" DevicePath \"\"" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.103998 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.104015 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aca3e76-93f3-4cee-9e08-63e1953f7e90-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.433890 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" event={"ID":"0aca3e76-93f3-4cee-9e08-63e1953f7e90","Type":"ContainerDied","Data":"74171b1ed48a6bbad8669697bf7702bb8b944e9ef8b369cab21b76f01d1a01cc"} Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.434219 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74171b1ed48a6bbad8669697bf7702bb8b944e9ef8b369cab21b76f01d1a01cc" Mar 18 17:25:10 crc kubenswrapper[4939]: I0318 17:25:10.433947 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.543318 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5"] Mar 18 17:25:22 crc kubenswrapper[4939]: E0318 17:25:22.544403 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aca3e76-93f3-4cee-9e08-63e1953f7e90" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.544422 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aca3e76-93f3-4cee-9e08-63e1953f7e90" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 17:25:22 crc kubenswrapper[4939]: E0318 17:25:22.544438 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerName="dnsmasq-dns" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.544446 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerName="dnsmasq-dns" Mar 18 17:25:22 crc kubenswrapper[4939]: E0318 17:25:22.544462 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerName="init" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.544470 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerName="init" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.544746 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbdd278-7129-4cfb-8e8a-319670502a60" containerName="dnsmasq-dns" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.544777 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aca3e76-93f3-4cee-9e08-63e1953f7e90" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.545790 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.547789 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.548245 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.548456 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.548790 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.569717 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5"] Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.575799 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwvv4\" (UniqueName: \"kubernetes.io/projected/e8465fa8-f589-4135-b6bf-f278436d5326-kube-api-access-vwvv4\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.575842 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.575862 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.576020 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.576288 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.677680 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.677756 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwvv4\" (UniqueName: \"kubernetes.io/projected/e8465fa8-f589-4135-b6bf-f278436d5326-kube-api-access-vwvv4\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.677784 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.677802 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.677888 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.685302 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.685489 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.685686 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.686085 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.702846 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwvv4\" (UniqueName: \"kubernetes.io/projected/e8465fa8-f589-4135-b6bf-f278436d5326-kube-api-access-vwvv4\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:22 crc kubenswrapper[4939]: I0318 17:25:22.871735 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:25:23 crc kubenswrapper[4939]: I0318 17:25:23.425405 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5"] Mar 18 17:25:23 crc kubenswrapper[4939]: I0318 17:25:23.561946 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" event={"ID":"e8465fa8-f589-4135-b6bf-f278436d5326","Type":"ContainerStarted","Data":"f885f9f25ed34c95c98179feb793375d3536bb1b55bb99bd8dff33a21f4f36fb"} Mar 18 17:25:24 crc kubenswrapper[4939]: I0318 17:25:24.573560 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" event={"ID":"e8465fa8-f589-4135-b6bf-f278436d5326","Type":"ContainerStarted","Data":"7f2ddba3e031ebc985b7a6b065a6688162bf3c2f0dded49bf2fd55469dec0e1f"} Mar 18 17:25:24 crc kubenswrapper[4939]: I0318 17:25:24.593346 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" podStartSLOduration=2.135030854 podStartE2EDuration="2.593326428s" podCreationTimestamp="2026-03-18 17:25:22 +0000 UTC" firstStartedPulling="2026-03-18 17:25:23.413987958 +0000 UTC m=+6488.013175579" lastFinishedPulling="2026-03-18 17:25:23.872283532 +0000 UTC m=+6488.471471153" observedRunningTime="2026-03-18 17:25:24.590052965 +0000 UTC m=+6489.189240586" watchObservedRunningTime="2026-03-18 17:25:24.593326428 +0000 UTC m=+6489.192514069" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.150699 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564246-jrjv8"] Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.155137 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564246-jrjv8"] Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.155289 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.157762 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.157893 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.162279 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.357700 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zks4p\" (UniqueName: \"kubernetes.io/projected/6b1aa391-1e5e-4b89-a2ec-85a986c66474-kube-api-access-zks4p\") pod \"auto-csr-approver-29564246-jrjv8\" (UID: \"6b1aa391-1e5e-4b89-a2ec-85a986c66474\") " pod="openshift-infra/auto-csr-approver-29564246-jrjv8" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.459549 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zks4p\" (UniqueName: \"kubernetes.io/projected/6b1aa391-1e5e-4b89-a2ec-85a986c66474-kube-api-access-zks4p\") pod \"auto-csr-approver-29564246-jrjv8\" (UID: \"6b1aa391-1e5e-4b89-a2ec-85a986c66474\") " pod="openshift-infra/auto-csr-approver-29564246-jrjv8" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.488926 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zks4p\" (UniqueName: \"kubernetes.io/projected/6b1aa391-1e5e-4b89-a2ec-85a986c66474-kube-api-access-zks4p\") pod \"auto-csr-approver-29564246-jrjv8\" (UID: \"6b1aa391-1e5e-4b89-a2ec-85a986c66474\") " pod="openshift-infra/auto-csr-approver-29564246-jrjv8" Mar 18 17:26:00 crc kubenswrapper[4939]: I0318 17:26:00.789425 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" Mar 18 17:26:01 crc kubenswrapper[4939]: I0318 17:26:01.276734 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564246-jrjv8"] Mar 18 17:26:01 crc kubenswrapper[4939]: I0318 17:26:01.301434 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" event={"ID":"6b1aa391-1e5e-4b89-a2ec-85a986c66474","Type":"ContainerStarted","Data":"9fd999e563fd2f45d0dd419760a679ec0145dd3b1a8db7cf2cbdb216c18f1245"} Mar 18 17:26:03 crc kubenswrapper[4939]: I0318 17:26:03.319737 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" event={"ID":"6b1aa391-1e5e-4b89-a2ec-85a986c66474","Type":"ContainerStarted","Data":"4a3da4c6bd09504c037f462a528942cc5c5f73b7f29be700adf4cbbb06794a93"} Mar 18 17:26:03 crc kubenswrapper[4939]: I0318 17:26:03.339810 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" podStartSLOduration=2.078845497 podStartE2EDuration="3.339793414s" podCreationTimestamp="2026-03-18 17:26:00 +0000 UTC" firstStartedPulling="2026-03-18 17:26:01.28198079 +0000 UTC m=+6525.881168411" lastFinishedPulling="2026-03-18 17:26:02.542928707 +0000 UTC m=+6527.142116328" observedRunningTime="2026-03-18 17:26:03.332896999 +0000 UTC m=+6527.932084640" watchObservedRunningTime="2026-03-18 17:26:03.339793414 +0000 UTC m=+6527.938981035" Mar 18 17:26:04 crc kubenswrapper[4939]: I0318 17:26:04.329001 4939 generic.go:334] "Generic (PLEG): container finished" podID="6b1aa391-1e5e-4b89-a2ec-85a986c66474" containerID="4a3da4c6bd09504c037f462a528942cc5c5f73b7f29be700adf4cbbb06794a93" exitCode=0 Mar 18 17:26:04 crc kubenswrapper[4939]: I0318 17:26:04.329097 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" event={"ID":"6b1aa391-1e5e-4b89-a2ec-85a986c66474","Type":"ContainerDied","Data":"4a3da4c6bd09504c037f462a528942cc5c5f73b7f29be700adf4cbbb06794a93"} Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.799616 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.901609 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mwr7c"] Mar 18 17:26:05 crc kubenswrapper[4939]: E0318 17:26:05.902074 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1aa391-1e5e-4b89-a2ec-85a986c66474" containerName="oc" Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.902092 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1aa391-1e5e-4b89-a2ec-85a986c66474" containerName="oc" Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.902596 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1aa391-1e5e-4b89-a2ec-85a986c66474" containerName="oc" Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.904159 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.911191 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwr7c"] Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.991250 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zks4p\" (UniqueName: \"kubernetes.io/projected/6b1aa391-1e5e-4b89-a2ec-85a986c66474-kube-api-access-zks4p\") pod \"6b1aa391-1e5e-4b89-a2ec-85a986c66474\" (UID: \"6b1aa391-1e5e-4b89-a2ec-85a986c66474\") " Mar 18 17:26:05 crc kubenswrapper[4939]: I0318 17:26:05.997788 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1aa391-1e5e-4b89-a2ec-85a986c66474-kube-api-access-zks4p" (OuterVolumeSpecName: "kube-api-access-zks4p") pod "6b1aa391-1e5e-4b89-a2ec-85a986c66474" (UID: "6b1aa391-1e5e-4b89-a2ec-85a986c66474"). InnerVolumeSpecName "kube-api-access-zks4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.094651 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-utilities\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.094708 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-catalog-content\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.094750 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd5rc\" (UniqueName: \"kubernetes.io/projected/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-kube-api-access-pd5rc\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.094829 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zks4p\" (UniqueName: \"kubernetes.io/projected/6b1aa391-1e5e-4b89-a2ec-85a986c66474-kube-api-access-zks4p\") on node \"crc\" DevicePath \"\"" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.196546 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-utilities\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.196611 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-catalog-content\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.196773 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd5rc\" 
(UniqueName: \"kubernetes.io/projected/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-kube-api-access-pd5rc\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.197269 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-utilities\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.197288 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-catalog-content\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.213781 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd5rc\" (UniqueName: \"kubernetes.io/projected/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-kube-api-access-pd5rc\") pod \"redhat-marketplace-mwr7c\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.240013 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.381950 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" event={"ID":"6b1aa391-1e5e-4b89-a2ec-85a986c66474","Type":"ContainerDied","Data":"9fd999e563fd2f45d0dd419760a679ec0145dd3b1a8db7cf2cbdb216c18f1245"} Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.382258 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd999e563fd2f45d0dd419760a679ec0145dd3b1a8db7cf2cbdb216c18f1245" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.382475 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564246-jrjv8" Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.446003 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564240-vwblv"] Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.459050 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564240-vwblv"] Mar 18 17:26:06 crc kubenswrapper[4939]: W0318 17:26:06.748529 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a61ce10_ef9d_4472_8c6c_587a0f7df0d8.slice/crio-31bf28575721522459416ee4b7675acda36aaeee7cce0a447f0e5c39f54bc505 WatchSource:0}: Error finding container 31bf28575721522459416ee4b7675acda36aaeee7cce0a447f0e5c39f54bc505: Status 404 returned error can't find the container with id 31bf28575721522459416ee4b7675acda36aaeee7cce0a447f0e5c39f54bc505 Mar 18 17:26:06 crc kubenswrapper[4939]: I0318 17:26:06.753982 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwr7c"] Mar 18 17:26:07 crc kubenswrapper[4939]: I0318 17:26:07.392117 4939 generic.go:334] "Generic (PLEG): container finished" podID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerID="58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a" exitCode=0 Mar 18 17:26:07 crc kubenswrapper[4939]: I0318 17:26:07.392229 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwr7c" event={"ID":"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8","Type":"ContainerDied","Data":"58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a"} Mar 18 17:26:07 crc kubenswrapper[4939]: I0318 17:26:07.392441 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwr7c" event={"ID":"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8","Type":"ContainerStarted","Data":"31bf28575721522459416ee4b7675acda36aaeee7cce0a447f0e5c39f54bc505"} Mar 18 17:26:08 crc kubenswrapper[4939]: I0318 17:26:08.151058 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473055c8-6c3b-4f69-bac4-51661019986c" path="/var/lib/kubelet/pods/473055c8-6c3b-4f69-bac4-51661019986c/volumes" Mar 18 17:26:08 crc kubenswrapper[4939]: I0318 17:26:08.407594 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwr7c" event={"ID":"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8","Type":"ContainerStarted","Data":"c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6"} Mar 18 17:26:08 crc kubenswrapper[4939]: I0318 17:26:08.505590 4939 scope.go:117] "RemoveContainer" containerID="f58d00469d2c3254d3adf80a8137a02f2ae26cfb452fea7f2f7f608af908ab80" Mar 18 17:26:09 crc kubenswrapper[4939]: I0318 17:26:09.418513 4939 generic.go:334] "Generic (PLEG): container finished" podID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerID="c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6" exitCode=0 Mar 18 17:26:09 crc kubenswrapper[4939]: I0318 17:26:09.418634 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwr7c" event={"ID":"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8","Type":"ContainerDied","Data":"c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6"} Mar 18 17:26:10 crc kubenswrapper[4939]: I0318 17:26:10.430928 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwr7c" 
event={"ID":"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8","Type":"ContainerStarted","Data":"75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381"} Mar 18 17:26:10 crc kubenswrapper[4939]: I0318 17:26:10.449396 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mwr7c" podStartSLOduration=2.676826552 podStartE2EDuration="5.449374873s" podCreationTimestamp="2026-03-18 17:26:05 +0000 UTC" firstStartedPulling="2026-03-18 17:26:07.394109094 +0000 UTC m=+6531.993296715" lastFinishedPulling="2026-03-18 17:26:10.166657415 +0000 UTC m=+6534.765845036" observedRunningTime="2026-03-18 17:26:10.448262212 +0000 UTC m=+6535.047449843" watchObservedRunningTime="2026-03-18 17:26:10.449374873 +0000 UTC m=+6535.048562514" Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.036156 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-5qrb9"] Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.049057 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-5qrb9"] Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.149915 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d192b635-1876-423f-83a9-68bc9ab9ba98" path="/var/lib/kubelet/pods/d192b635-1876-423f-83a9-68bc9ab9ba98/volumes" Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.240639 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.241024 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.304443 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.540229 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:16 crc kubenswrapper[4939]: I0318 17:26:16.597137 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwr7c"] Mar 18 17:26:18 crc kubenswrapper[4939]: I0318 17:26:18.039331 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-b5fd-account-create-update-wr4hv"] Mar 18 17:26:18 crc kubenswrapper[4939]: I0318 17:26:18.048497 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-b5fd-account-create-update-wr4hv"] Mar 18 17:26:18 crc kubenswrapper[4939]: I0318 17:26:18.153938 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97a0adc-34fd-4fbc-aa41-56d51e7eb170" path="/var/lib/kubelet/pods/d97a0adc-34fd-4fbc-aa41-56d51e7eb170/volumes" Mar 18 17:26:18 crc kubenswrapper[4939]: I0318 17:26:18.504798 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mwr7c" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="registry-server" containerID="cri-o://75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381" gracePeriod=2 Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.031268 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.179718 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-catalog-content\") pod \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.179849 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-utilities\") pod \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.180019 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd5rc\" (UniqueName: \"kubernetes.io/projected/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-kube-api-access-pd5rc\") pod \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\" (UID: \"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8\") " Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.180366 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-utilities" (OuterVolumeSpecName: "utilities") pod "5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" (UID: "5a61ce10-ef9d-4472-8c6c-587a0f7df0d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.180719 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.188762 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-kube-api-access-pd5rc" (OuterVolumeSpecName: "kube-api-access-pd5rc") pod "5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" (UID: "5a61ce10-ef9d-4472-8c6c-587a0f7df0d8"). InnerVolumeSpecName "kube-api-access-pd5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.207206 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" (UID: "5a61ce10-ef9d-4472-8c6c-587a0f7df0d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.282725 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.282761 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd5rc\" (UniqueName: \"kubernetes.io/projected/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8-kube-api-access-pd5rc\") on node \"crc\" DevicePath \"\"" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.518448 4939 generic.go:334] "Generic (PLEG): container finished" podID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerID="75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381" exitCode=0 Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.518538 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwr7c" event={"ID":"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8","Type":"ContainerDied","Data":"75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381"} Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.518606 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwr7c" event={"ID":"5a61ce10-ef9d-4472-8c6c-587a0f7df0d8","Type":"ContainerDied","Data":"31bf28575721522459416ee4b7675acda36aaeee7cce0a447f0e5c39f54bc505"} Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.518631 4939 scope.go:117] "RemoveContainer" containerID="75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.518778 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwr7c" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.543748 4939 scope.go:117] "RemoveContainer" containerID="c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.571333 4939 scope.go:117] "RemoveContainer" containerID="58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.591036 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwr7c"] Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.603545 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwr7c"] Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.618584 4939 scope.go:117] "RemoveContainer" containerID="75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381" Mar 18 17:26:19 crc kubenswrapper[4939]: E0318 17:26:19.619200 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381\": container with ID starting with 75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381 not found: ID does not exist" containerID="75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.619245 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381"} err="failed to get container status \"75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381\": rpc error: code = NotFound desc = could not find container \"75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381\": container with ID starting with 75029071d76103be171220eef6701ca3d5274a77ce9eb3cdbd508def743a1381 not found: ID does not exist" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.619276 4939 scope.go:117] "RemoveContainer" containerID="c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6" Mar 18 17:26:19 crc kubenswrapper[4939]: E0318 17:26:19.619538 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6\": container with ID starting with c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6 not found: ID does not exist" containerID="c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.619618 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6"} err="failed to get container status \"c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6\": rpc error: code = NotFound desc = could not find container \"c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6\": container with ID starting with c75c22d9166e2a07f6af1b3d2d6aaad803bcb1e63198c508083239d60c27c7e6 not found: ID does not exist" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.619681 4939 scope.go:117] "RemoveContainer" containerID="58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a" Mar 18 17:26:19 crc kubenswrapper[4939]: E0318 17:26:19.620017 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a\": container with ID starting with 58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a not found: ID does not exist" containerID="58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a" Mar 18 17:26:19 crc kubenswrapper[4939]: I0318 17:26:19.620123 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a"} err="failed to get container status \"58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a\": rpc error: code = NotFound desc = could not find container \"58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a\": container with ID starting with 58972ca68ece98e22140c64a8afe0962670e55650264fe8d06c901f791608f1a not found: ID does not exist" Mar 18 17:26:20 crc kubenswrapper[4939]: I0318 17:26:20.148790 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" path="/var/lib/kubelet/pods/5a61ce10-ef9d-4472-8c6c-587a0f7df0d8/volumes" Mar 18 17:26:24 crc kubenswrapper[4939]: I0318 17:26:24.037852 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-dw9cc"] Mar 18 17:26:24 crc kubenswrapper[4939]: I0318 17:26:24.048189 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-dw9cc"] Mar 18 17:26:24 crc kubenswrapper[4939]: I0318 17:26:24.146657 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c056622-aeea-4e82-87d3-e147e22f1fc5" path="/var/lib/kubelet/pods/8c056622-aeea-4e82-87d3-e147e22f1fc5/volumes" Mar 18 17:26:25 crc kubenswrapper[4939]: I0318 17:26:25.037449 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-41f8-account-create-update-7btrr"] Mar 18 17:26:25 crc kubenswrapper[4939]: I0318 17:26:25.052252 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-41f8-account-create-update-7btrr"] Mar 18 17:26:26 crc kubenswrapper[4939]: I0318 17:26:26.148624 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b4007c-f7d3-47ad-9e5d-9b12886416f2" path="/var/lib/kubelet/pods/62b4007c-f7d3-47ad-9e5d-9b12886416f2/volumes" Mar 18 17:26:53 crc kubenswrapper[4939]: I0318 17:26:53.687425 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:26:53 crc kubenswrapper[4939]: I0318 17:26:53.688008 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:26:57 crc kubenswrapper[4939]: I0318 17:26:57.061191 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-nzmc9"] Mar 18 17:26:57 crc kubenswrapper[4939]: I0318 17:26:57.069358 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-nzmc9"] Mar 18 17:26:58 crc kubenswrapper[4939]: I0318 17:26:58.146148 4939 kubelet_volumes.go:163] "Cleaned up 
Mar 18 17:26:59 crc kubenswrapper[4939]: I0318 17:26:59.969225 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d65q2"] Mar 18 17:26:59 crc kubenswrapper[4939]: E0318 17:26:59.971032 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="extract-content" Mar 18 17:26:59 crc kubenswrapper[4939]: I0318 17:26:59.971151 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="extract-content" Mar 18 17:26:59 crc kubenswrapper[4939]: E0318 17:26:59.971260 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="registry-server" Mar 18 17:26:59 crc kubenswrapper[4939]: I0318 17:26:59.971342 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="registry-server" Mar 18 17:26:59 crc kubenswrapper[4939]: E0318 17:26:59.971428 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="extract-utilities" Mar 18 17:26:59 crc kubenswrapper[4939]: I0318 17:26:59.971495 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="extract-utilities" Mar 18 17:26:59 crc kubenswrapper[4939]: I0318 17:26:59.971895 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a61ce10-ef9d-4472-8c6c-587a0f7df0d8" containerName="registry-server" Mar 18 17:26:59 crc kubenswrapper[4939]: I0318 17:26:59.974132 4939 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:26:59 crc kubenswrapper[4939]: I0318 17:26:59.995031 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d65q2"] Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.012008 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-catalog-content\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.012359 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6lw\" (UniqueName: \"kubernetes.io/projected/4baa01e5-70be-4262-8e84-4f7d15e6da4a-kube-api-access-px6lw\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.012575 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-utilities\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.114576 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px6lw\" (UniqueName: \"kubernetes.io/projected/4baa01e5-70be-4262-8e84-4f7d15e6da4a-kube-api-access-px6lw\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.115039 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-utilities\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.115322 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-catalog-content\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.115548 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-utilities\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.115940 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-catalog-content\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.135453 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-px6lw\" (UniqueName: \"kubernetes.io/projected/4baa01e5-70be-4262-8e84-4f7d15e6da4a-kube-api-access-px6lw\") pod \"certified-operators-d65q2\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.305433 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.811653 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d65q2"] Mar 18 17:27:00 crc kubenswrapper[4939]: I0318 17:27:00.974730 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d65q2" event={"ID":"4baa01e5-70be-4262-8e84-4f7d15e6da4a","Type":"ContainerStarted","Data":"7fea455592d584faee5667e76a5ae1e2131f26cbd6e91469f8cf08805a0e7102"} Mar 18 17:27:01 crc kubenswrapper[4939]: I0318 17:27:01.985147 4939 generic.go:334] "Generic (PLEG): container finished" podID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerID="0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06" exitCode=0 Mar 18 17:27:01 crc kubenswrapper[4939]: I0318 17:27:01.985225 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d65q2" event={"ID":"4baa01e5-70be-4262-8e84-4f7d15e6da4a","Type":"ContainerDied","Data":"0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06"} Mar 18 17:27:02 crc kubenswrapper[4939]: I0318 17:27:02.997649 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d65q2" event={"ID":"4baa01e5-70be-4262-8e84-4f7d15e6da4a","Type":"ContainerStarted","Data":"7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae"} Mar 18 17:27:05 crc kubenswrapper[4939]: I0318 17:27:05.034304 4939 generic.go:334] "Generic (PLEG): container finished" podID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerID="7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae" exitCode=0 Mar 18 17:27:05 crc kubenswrapper[4939]: I0318 17:27:05.034409 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d65q2" event={"ID":"4baa01e5-70be-4262-8e84-4f7d15e6da4a","Type":"ContainerDied","Data":"7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae"} Mar 18 17:27:07 crc kubenswrapper[4939]: I0318 17:27:07.064547 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d65q2" event={"ID":"4baa01e5-70be-4262-8e84-4f7d15e6da4a","Type":"ContainerStarted","Data":"5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589"} Mar 18 17:27:07 crc kubenswrapper[4939]: I0318 17:27:07.087854 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d65q2" podStartSLOduration=4.247651544 podStartE2EDuration="8.087830022s" podCreationTimestamp="2026-03-18 17:26:59 +0000 UTC" firstStartedPulling="2026-03-18 17:27:01.987917532 +0000 UTC m=+6586.587105153" lastFinishedPulling="2026-03-18 17:27:05.82809599 +0000 UTC m=+6590.427283631" observedRunningTime="2026-03-18 17:27:07.087224464 +0000 UTC m=+6591.686412085" watchObservedRunningTime="2026-03-18 17:27:07.087830022 +0000 UTC m=+6591.687017643" Mar 18 17:27:08 crc kubenswrapper[4939]: I0318 17:27:08.608416 4939 scope.go:117] "RemoveContainer" 
containerID="df067ebb9b8e271ad4d1c9add53af1f6839408ee92221a122ce31fa46294e217" Mar 18 17:27:08 crc kubenswrapper[4939]: I0318 17:27:08.650229 4939 scope.go:117] "RemoveContainer" containerID="ea4b045645681e62c293bfefbb576073e2fa07c52f079bca371880f1947f65d5" Mar 18 17:27:08 crc kubenswrapper[4939]: I0318 17:27:08.718026 4939 scope.go:117] "RemoveContainer" containerID="8aa4b82fdca6e2f6d0abddeb69aff932ca7c56fdac289db4ed399ace38cfd256" Mar 18 17:27:08 crc kubenswrapper[4939]: I0318 17:27:08.771877 4939 scope.go:117] "RemoveContainer" containerID="5aae94ef3b4d56d980be83958fc352314dcd46ca846bf408e174a5b834da9b25" Mar 18 17:27:08 crc kubenswrapper[4939]: I0318 17:27:08.830704 4939 scope.go:117] "RemoveContainer" containerID="dcb7372718b4fe6090fcec1bb36cdf12aacbbf559cd22f635c2c37194aec8189" Mar 18 17:27:08 crc kubenswrapper[4939]: I0318 17:27:08.873408 4939 scope.go:117] "RemoveContainer" containerID="a16a8cce335f614963bcd245e46a37bbfd3839680f851239b16183abe352880a" Mar 18 17:27:10 crc kubenswrapper[4939]: I0318 17:27:10.306312 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:10 crc kubenswrapper[4939]: I0318 17:27:10.306372 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:10 crc kubenswrapper[4939]: I0318 17:27:10.361675 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:11 crc kubenswrapper[4939]: I0318 17:27:11.173220 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:11 crc kubenswrapper[4939]: I0318 17:27:11.745092 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d65q2"] Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.131937 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d65q2" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="registry-server" containerID="cri-o://5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589" gracePeriod=2 Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.706626 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.744099 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px6lw\" (UniqueName: \"kubernetes.io/projected/4baa01e5-70be-4262-8e84-4f7d15e6da4a-kube-api-access-px6lw\") pod \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.744207 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-utilities\") pod \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.744243 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-catalog-content\") pod \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\" (UID: \"4baa01e5-70be-4262-8e84-4f7d15e6da4a\") " Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.762292 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-utilities" (OuterVolumeSpecName: "utilities") pod "4baa01e5-70be-4262-8e84-4f7d15e6da4a" (UID: "4baa01e5-70be-4262-8e84-4f7d15e6da4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.787740 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4baa01e5-70be-4262-8e84-4f7d15e6da4a-kube-api-access-px6lw" (OuterVolumeSpecName: "kube-api-access-px6lw") pod "4baa01e5-70be-4262-8e84-4f7d15e6da4a" (UID: "4baa01e5-70be-4262-8e84-4f7d15e6da4a"). InnerVolumeSpecName "kube-api-access-px6lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.848972 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px6lw\" (UniqueName: \"kubernetes.io/projected/4baa01e5-70be-4262-8e84-4f7d15e6da4a-kube-api-access-px6lw\") on node \"crc\" DevicePath \"\"" Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.849014 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.876682 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4baa01e5-70be-4262-8e84-4f7d15e6da4a" (UID: "4baa01e5-70be-4262-8e84-4f7d15e6da4a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:27:13 crc kubenswrapper[4939]: I0318 17:27:13.950338 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa01e5-70be-4262-8e84-4f7d15e6da4a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.143861 4939 generic.go:334] "Generic (PLEG): container finished" podID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerID="5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589" exitCode=0 Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.143950 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.146638 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d65q2" event={"ID":"4baa01e5-70be-4262-8e84-4f7d15e6da4a","Type":"ContainerDied","Data":"5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589"} Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.146696 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d65q2" event={"ID":"4baa01e5-70be-4262-8e84-4f7d15e6da4a","Type":"ContainerDied","Data":"7fea455592d584faee5667e76a5ae1e2131f26cbd6e91469f8cf08805a0e7102"} Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.146718 4939 scope.go:117] "RemoveContainer" containerID="5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.188558 4939 scope.go:117] "RemoveContainer" containerID="7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.212354 4939 scope.go:117] "RemoveContainer" containerID="0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.273008 4939 scope.go:117] "RemoveContainer" containerID="5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589" Mar 18 17:27:14 crc kubenswrapper[4939]: E0318 17:27:14.273533 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589\": container with ID starting with 5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589 not found: ID does not exist" containerID="5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.273570 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589"} err="failed to get container status \"5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589\": rpc error: code = NotFound desc = could not find container \"5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589\": container with ID starting with 5e89a21159196a539e22eb4b56f44b06ef5c46e94d8cdeadcff5cb9c2c25d589 not found: ID does not exist" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.273591 4939 scope.go:117] "RemoveContainer" containerID="7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae" Mar 18 17:27:14 crc kubenswrapper[4939]: E0318 17:27:14.273936 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae\": container with ID starting with 7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae not found: ID does not exist" containerID="7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.273963 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae"} err="failed to get container status \"7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae\": rpc error: code = NotFound desc = could not find container \"7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae\": container with ID starting with 7fe1bc17095db29ce612be031978d76c4fcb7d419aa6115cf23557885e2428ae not found: ID does not exist" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.273976 4939 scope.go:117] "RemoveContainer" containerID="0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06" Mar 18 17:27:14 crc kubenswrapper[4939]: E0318 17:27:14.274195 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06\": container with ID starting with 0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06 not found: ID does not exist" containerID="0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06" Mar 18 17:27:14 crc kubenswrapper[4939]: I0318 17:27:14.274226 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06"} err="failed to get container status \"0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06\": rpc error: code = NotFound desc = could not find container \"0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06\": container with ID starting with 0ec4c0366fde326af09a85d01ce7c7e66a250767d85c37efccdab308cee5bc06 not found: ID does not exist" Mar 18 17:27:23 crc kubenswrapper[4939]: I0318 17:27:23.687070 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:27:23 crc kubenswrapper[4939]: I0318 17:27:23.687799 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:27:44 crc kubenswrapper[4939]: I0318 17:27:44.159286 4939 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod4baa01e5-70be-4262-8e84-4f7d15e6da4a"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod4baa01e5-70be-4262-8e84-4f7d15e6da4a] : Timed out while waiting for systemd to remove kubepods-burstable-pod4baa01e5_70be_4262_8e84_4f7d15e6da4a.slice" Mar 18 17:27:44 crc kubenswrapper[4939]: E0318 17:27:44.159784 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable pod4baa01e5-70be-4262-8e84-4f7d15e6da4a] : unable to destroy cgroup paths for 
cgroup [kubepods burstable pod4baa01e5-70be-4262-8e84-4f7d15e6da4a] : Timed out while waiting for systemd to remove kubepods-burstable-pod4baa01e5_70be_4262_8e84_4f7d15e6da4a.slice" pod="openshift-marketplace/certified-operators-d65q2" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" Mar 18 17:27:44 crc kubenswrapper[4939]: I0318 17:27:44.476643 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d65q2" Mar 18 17:27:44 crc kubenswrapper[4939]: I0318 17:27:44.502958 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d65q2"] Mar 18 17:27:44 crc kubenswrapper[4939]: I0318 17:27:44.514848 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d65q2"] Mar 18 17:27:46 crc kubenswrapper[4939]: I0318 17:27:46.145371 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" path="/var/lib/kubelet/pods/4baa01e5-70be-4262-8e84-4f7d15e6da4a/volumes" Mar 18 17:27:53 crc kubenswrapper[4939]: I0318 17:27:53.687589 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:27:53 crc kubenswrapper[4939]: I0318 17:27:53.688068 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:27:53 crc kubenswrapper[4939]: I0318 17:27:53.688112 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:27:53 crc kubenswrapper[4939]: I0318 17:27:53.688964 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:27:53 crc kubenswrapper[4939]: I0318 17:27:53.689025 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" gracePeriod=600 Mar 18 17:27:53 crc kubenswrapper[4939]: E0318 17:27:53.819030 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:27:54 crc kubenswrapper[4939]: I0318 17:27:54.578330 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" 
containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" exitCode=0 Mar 18 17:27:54 crc kubenswrapper[4939]: I0318 17:27:54.578390 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115"} Mar 18 17:27:54 crc kubenswrapper[4939]: I0318 17:27:54.578442 4939 scope.go:117] "RemoveContainer" containerID="57a41c1bc14cd97b3da450bf17c52dcdc4709996b2a45bd26d504a3261acd9e3" Mar 18 17:27:54 crc kubenswrapper[4939]: I0318 17:27:54.579356 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:27:54 crc kubenswrapper[4939]: E0318 17:27:54.579697 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.161367 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564248-q4vf4"] Mar 18 17:28:00 crc kubenswrapper[4939]: E0318 17:28:00.162321 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="extract-content" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.162334 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="extract-content" Mar 18 17:28:00 crc kubenswrapper[4939]: E0318 17:28:00.162371 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="extract-utilities" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.162378 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="extract-utilities" Mar 18 17:28:00 crc kubenswrapper[4939]: E0318 17:28:00.162395 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="registry-server" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.162401 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="registry-server" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.162605 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4baa01e5-70be-4262-8e84-4f7d15e6da4a" containerName="registry-server" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.163433 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564248-q4vf4" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.166453 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.166456 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.171025 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.175785 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564248-q4vf4"] Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.317439 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktx2\" (UniqueName: \"kubernetes.io/projected/17abc635-1d15-48cc-83c6-4d727c0a4f99-kube-api-access-sktx2\") pod \"auto-csr-approver-29564248-q4vf4\" (UID: \"17abc635-1d15-48cc-83c6-4d727c0a4f99\") " pod="openshift-infra/auto-csr-approver-29564248-q4vf4" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.421032 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sktx2\" (UniqueName: \"kubernetes.io/projected/17abc635-1d15-48cc-83c6-4d727c0a4f99-kube-api-access-sktx2\") pod \"auto-csr-approver-29564248-q4vf4\" (UID: \"17abc635-1d15-48cc-83c6-4d727c0a4f99\") " pod="openshift-infra/auto-csr-approver-29564248-q4vf4" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.441891 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktx2\" (UniqueName: \"kubernetes.io/projected/17abc635-1d15-48cc-83c6-4d727c0a4f99-kube-api-access-sktx2\") pod \"auto-csr-approver-29564248-q4vf4\" (UID: \"17abc635-1d15-48cc-83c6-4d727c0a4f99\") " pod="openshift-infra/auto-csr-approver-29564248-q4vf4" Mar 18 17:28:00 crc kubenswrapper[4939]: I0318 17:28:00.505800 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564248-q4vf4" Mar 18 17:28:01 crc kubenswrapper[4939]: I0318 17:28:01.312350 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:28:01 crc kubenswrapper[4939]: I0318 17:28:01.315664 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564248-q4vf4"] Mar 18 17:28:02 crc kubenswrapper[4939]: I0318 17:28:02.183226 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564248-q4vf4" event={"ID":"17abc635-1d15-48cc-83c6-4d727c0a4f99","Type":"ContainerStarted","Data":"22fe5cfaac85cc91b0d9d14a9098ecc39ec0e2477bf50194230117e56b6acc14"} Mar 18 17:28:03 crc kubenswrapper[4939]: I0318 17:28:03.185947 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564248-q4vf4" event={"ID":"17abc635-1d15-48cc-83c6-4d727c0a4f99","Type":"ContainerDied","Data":"db46dec8aec82f73b2a735c56653047770340dfd44d92584a36dc8500d43f111"} Mar 18 17:28:03 crc kubenswrapper[4939]: I0318 17:28:03.186271 4939 generic.go:334] "Generic (PLEG): container finished" podID="17abc635-1d15-48cc-83c6-4d727c0a4f99" containerID="db46dec8aec82f73b2a735c56653047770340dfd44d92584a36dc8500d43f111" exitCode=0 Mar 18 17:28:04 crc kubenswrapper[4939]: I0318 17:28:04.624347 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564248-q4vf4" Mar 18 17:28:04 crc kubenswrapper[4939]: I0318 17:28:04.718479 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sktx2\" (UniqueName: \"kubernetes.io/projected/17abc635-1d15-48cc-83c6-4d727c0a4f99-kube-api-access-sktx2\") pod \"17abc635-1d15-48cc-83c6-4d727c0a4f99\" (UID: \"17abc635-1d15-48cc-83c6-4d727c0a4f99\") " Mar 18 17:28:04 crc kubenswrapper[4939]: I0318 17:28:04.723594 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17abc635-1d15-48cc-83c6-4d727c0a4f99-kube-api-access-sktx2" (OuterVolumeSpecName: "kube-api-access-sktx2") pod "17abc635-1d15-48cc-83c6-4d727c0a4f99" (UID: "17abc635-1d15-48cc-83c6-4d727c0a4f99"). InnerVolumeSpecName "kube-api-access-sktx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:28:04 crc kubenswrapper[4939]: I0318 17:28:04.820392 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sktx2\" (UniqueName: \"kubernetes.io/projected/17abc635-1d15-48cc-83c6-4d727c0a4f99-kube-api-access-sktx2\") on node \"crc\" DevicePath \"\"" Mar 18 17:28:05 crc kubenswrapper[4939]: I0318 17:28:05.209914 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564248-q4vf4" event={"ID":"17abc635-1d15-48cc-83c6-4d727c0a4f99","Type":"ContainerDied","Data":"22fe5cfaac85cc91b0d9d14a9098ecc39ec0e2477bf50194230117e56b6acc14"} Mar 18 17:28:05 crc kubenswrapper[4939]: I0318 17:28:05.210120 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22fe5cfaac85cc91b0d9d14a9098ecc39ec0e2477bf50194230117e56b6acc14" Mar 18 17:28:05 crc kubenswrapper[4939]: I0318 17:28:05.209993 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564248-q4vf4" Mar 18 17:28:05 crc kubenswrapper[4939]: E0318 17:28:05.370305 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17abc635_1d15_48cc_83c6_4d727c0a4f99.slice/crio-22fe5cfaac85cc91b0d9d14a9098ecc39ec0e2477bf50194230117e56b6acc14\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17abc635_1d15_48cc_83c6_4d727c0a4f99.slice\": RecentStats: unable to find data in memory cache]" Mar 18 17:28:05 crc kubenswrapper[4939]: I0318 17:28:05.698266 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564242-s5t2d"] Mar 18 17:28:05 crc kubenswrapper[4939]: I0318 17:28:05.708032 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564242-s5t2d"] Mar 18 17:28:06 crc kubenswrapper[4939]: I0318 17:28:06.153768 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5368c541-c217-48f1-9c31-bf094896cad0" path="/var/lib/kubelet/pods/5368c541-c217-48f1-9c31-bf094896cad0/volumes" Mar 18 17:28:07 crc kubenswrapper[4939]: I0318 17:28:07.132821 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:28:07 crc kubenswrapper[4939]: E0318 17:28:07.133447 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:28:09 crc kubenswrapper[4939]: I0318 17:28:09.019780 4939 scope.go:117] "RemoveContainer" containerID="3226fdbc8d348f29f0abe28cf3d71a1f2fdadea5fc4c479171a79f8f617f2a3e" Mar 18 17:28:20 crc kubenswrapper[4939]: I0318 17:28:20.134729 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:28:20 crc kubenswrapper[4939]: E0318 17:28:20.136833 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:28:31 crc kubenswrapper[4939]: I0318 17:28:31.133552 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:28:31 crc kubenswrapper[4939]: E0318 17:28:31.134411 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:28:42 crc kubenswrapper[4939]: I0318 17:28:42.134881 4939 scope.go:117] "RemoveContainer" 
containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:28:42 crc kubenswrapper[4939]: E0318 17:28:42.135686 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:28:53 crc kubenswrapper[4939]: I0318 17:28:53.134032 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:28:53 crc kubenswrapper[4939]: E0318 17:28:53.134863 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:29:08 crc kubenswrapper[4939]: I0318 17:29:08.134309 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:29:08 crc kubenswrapper[4939]: E0318 17:29:08.135260 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:29:22 crc kubenswrapper[4939]: I0318 17:29:22.133566 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:29:22 crc kubenswrapper[4939]: E0318 17:29:22.134697 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.263781 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-svhq4"] Mar 18 17:29:25 crc kubenswrapper[4939]: E0318 17:29:25.264801 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17abc635-1d15-48cc-83c6-4d727c0a4f99" containerName="oc" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.264816 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="17abc635-1d15-48cc-83c6-4d727c0a4f99" containerName="oc" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.265077 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="17abc635-1d15-48cc-83c6-4d727c0a4f99" containerName="oc" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.267031 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.279268 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svhq4"] Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.372694 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-utilities\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.372748 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-catalog-content\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.372953 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfnj\" (UniqueName: \"kubernetes.io/projected/0e74910c-57a2-4386-aa23-cbd796328cb5-kube-api-access-xwfnj\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.474828 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfnj\" (UniqueName: \"kubernetes.io/projected/0e74910c-57a2-4386-aa23-cbd796328cb5-kube-api-access-xwfnj\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.474977 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-utilities\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.475007 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-catalog-content\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.475603 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-catalog-content\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.475705 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-utilities\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.501724 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xwfnj\" (UniqueName: \"kubernetes.io/projected/0e74910c-57a2-4386-aa23-cbd796328cb5-kube-api-access-xwfnj\") pod \"community-operators-svhq4\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:25 crc kubenswrapper[4939]: I0318 17:29:25.606243 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:26 crc kubenswrapper[4939]: I0318 17:29:26.091429 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svhq4"] Mar 18 17:29:26 crc kubenswrapper[4939]: I0318 17:29:26.690195 4939 generic.go:334] "Generic (PLEG): container finished" podID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerID="d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f" exitCode=0 Mar 18 17:29:26 crc kubenswrapper[4939]: I0318 17:29:26.690264 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svhq4" event={"ID":"0e74910c-57a2-4386-aa23-cbd796328cb5","Type":"ContainerDied","Data":"d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f"} Mar 18 17:29:26 crc kubenswrapper[4939]: I0318 17:29:26.690479 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svhq4" event={"ID":"0e74910c-57a2-4386-aa23-cbd796328cb5","Type":"ContainerStarted","Data":"f1e144be263f7a85245195c660c686b6e01bb118696f40f651589ceb6271205d"} Mar 18 17:29:27 crc kubenswrapper[4939]: I0318 17:29:27.701764 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svhq4" event={"ID":"0e74910c-57a2-4386-aa23-cbd796328cb5","Type":"ContainerStarted","Data":"cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c"} Mar 18 17:29:28 crc kubenswrapper[4939]: I0318 17:29:28.713181 4939 generic.go:334] "Generic (PLEG): container finished" podID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerID="cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c" exitCode=0 Mar 18 17:29:28 crc kubenswrapper[4939]: I0318 17:29:28.713240 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svhq4" event={"ID":"0e74910c-57a2-4386-aa23-cbd796328cb5","Type":"ContainerDied","Data":"cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c"} Mar 18 17:29:29 crc kubenswrapper[4939]: I0318 17:29:29.724273 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svhq4" event={"ID":"0e74910c-57a2-4386-aa23-cbd796328cb5","Type":"ContainerStarted","Data":"3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179"} Mar 18 17:29:29 crc kubenswrapper[4939]: I0318 17:29:29.745267 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-svhq4" podStartSLOduration=2.187074755 podStartE2EDuration="4.745247029s" podCreationTimestamp="2026-03-18 17:29:25 +0000 UTC" firstStartedPulling="2026-03-18 17:29:26.692389407 +0000 UTC m=+6731.291577068" lastFinishedPulling="2026-03-18 17:29:29.250561721 +0000 UTC m=+6733.849749342" observedRunningTime="2026-03-18 17:29:29.742374107 +0000 UTC m=+6734.341561728" watchObservedRunningTime="2026-03-18 17:29:29.745247029 +0000 UTC m=+6734.344434650" Mar 18 17:29:35 crc kubenswrapper[4939]: I0318 17:29:35.606984 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:35 crc kubenswrapper[4939]: I0318 17:29:35.607448 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:35 crc kubenswrapper[4939]: I0318 17:29:35.666029 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:35 crc kubenswrapper[4939]: I0318 17:29:35.831160 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:35 crc kubenswrapper[4939]: I0318 17:29:35.900951 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svhq4"] Mar 18 17:29:36 crc kubenswrapper[4939]: I0318 17:29:36.146956 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:29:36 crc kubenswrapper[4939]: E0318 17:29:36.147629 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:29:37 crc kubenswrapper[4939]: I0318 17:29:37.800904 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-svhq4" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="registry-server" containerID="cri-o://3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179" gracePeriod=2 Mar 18 17:29:37 crc kubenswrapper[4939]: E0318 17:29:37.924117 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e74910c_57a2_4386_aa23_cbd796328cb5.slice/crio-3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.049950 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-hp82l"] Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.075085 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-hp82l"] Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.093610 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-5ee2-account-create-update-xrjf5"] Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.103938 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-5ee2-account-create-update-xrjf5"] Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.144154 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4786eb-9145-45be-bcd2-4771ce87936e" path="/var/lib/kubelet/pods/4b4786eb-9145-45be-bcd2-4771ce87936e/volumes" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.146487 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e24266-e4ab-4f96-8749-8fcc70366102" path="/var/lib/kubelet/pods/d4e24266-e4ab-4f96-8749-8fcc70366102/volumes" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.307267 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.353847 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-utilities\") pod \"0e74910c-57a2-4386-aa23-cbd796328cb5\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.353976 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-catalog-content\") pod \"0e74910c-57a2-4386-aa23-cbd796328cb5\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.354048 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwfnj\" (UniqueName: \"kubernetes.io/projected/0e74910c-57a2-4386-aa23-cbd796328cb5-kube-api-access-xwfnj\") pod \"0e74910c-57a2-4386-aa23-cbd796328cb5\" (UID: \"0e74910c-57a2-4386-aa23-cbd796328cb5\") " Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.354995 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-utilities" (OuterVolumeSpecName: "utilities") pod "0e74910c-57a2-4386-aa23-cbd796328cb5" (UID: "0e74910c-57a2-4386-aa23-cbd796328cb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.362069 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e74910c-57a2-4386-aa23-cbd796328cb5-kube-api-access-xwfnj" (OuterVolumeSpecName: "kube-api-access-xwfnj") pod "0e74910c-57a2-4386-aa23-cbd796328cb5" (UID: "0e74910c-57a2-4386-aa23-cbd796328cb5"). InnerVolumeSpecName "kube-api-access-xwfnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.411104 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e74910c-57a2-4386-aa23-cbd796328cb5" (UID: "0e74910c-57a2-4386-aa23-cbd796328cb5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.456069 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.456110 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwfnj\" (UniqueName: \"kubernetes.io/projected/0e74910c-57a2-4386-aa23-cbd796328cb5-kube-api-access-xwfnj\") on node \"crc\" DevicePath \"\"" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.456120 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74910c-57a2-4386-aa23-cbd796328cb5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.810881 4939 generic.go:334] "Generic (PLEG): container finished" podID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerID="3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179" exitCode=0 Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.810931 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svhq4" event={"ID":"0e74910c-57a2-4386-aa23-cbd796328cb5","Type":"ContainerDied","Data":"3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179"} Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.810996 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svhq4" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.811018 4939 scope.go:117] "RemoveContainer" containerID="3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.811002 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svhq4" event={"ID":"0e74910c-57a2-4386-aa23-cbd796328cb5","Type":"ContainerDied","Data":"f1e144be263f7a85245195c660c686b6e01bb118696f40f651589ceb6271205d"} Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.832700 4939 scope.go:117] "RemoveContainer" containerID="cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.858984 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svhq4"] Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.874337 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-svhq4"] Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.874351 4939 scope.go:117] "RemoveContainer" containerID="d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.922691 4939 scope.go:117] "RemoveContainer" containerID="3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179" Mar 18 17:29:38 crc kubenswrapper[4939]: E0318 17:29:38.923185 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179\": container with ID starting with 3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179 not found: ID does not exist" containerID="3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.923236 
4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179"} err="failed to get container status \"3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179\": rpc error: code = NotFound desc = could not find container \"3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179\": container with ID starting with 3c7dc808ca9d06732572c13df89d4c8d176f244467cf5512dc81eb9f3f7ee179 not found: ID does not exist" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.923266 4939 scope.go:117] "RemoveContainer" containerID="cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c" Mar 18 17:29:38 crc kubenswrapper[4939]: E0318 17:29:38.923610 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c\": container with ID starting with cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c not found: ID does not exist" containerID="cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.923659 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c"} err="failed to get container status \"cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c\": rpc error: code = NotFound desc = could not find container \"cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c\": container with ID starting with cff2391dd1d6ca4771116dd80eb1b6b2fed754774be8ddbfe5c8d5086df1e29c not found: ID does not exist" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.923694 4939 scope.go:117] "RemoveContainer" containerID="d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f" Mar 18 17:29:38 crc kubenswrapper[4939]: E0318 17:29:38.924166 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f\": container with ID starting with d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f not found: ID does not exist" containerID="d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f" Mar 18 17:29:38 crc kubenswrapper[4939]: I0318 17:29:38.924305 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f"} err="failed to get container status \"d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f\": rpc error: code = NotFound desc = could not find container \"d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f\": container with ID starting with d3fd98a536cb2d34bfcd188aea49ba07f816ee76d19818f502f15ef5c1cc4f6f not found: ID does not exist" Mar 18 17:29:40 crc kubenswrapper[4939]: I0318 17:29:40.147621 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" path="/var/lib/kubelet/pods/0e74910c-57a2-4386-aa23-cbd796328cb5/volumes" Mar 18 17:29:51 crc kubenswrapper[4939]: I0318 17:29:51.041357 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-8tp6q"] Mar 18 17:29:51 crc kubenswrapper[4939]: I0318 17:29:51.057409 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/heat-db-sync-8tp6q"] Mar 18 17:29:51 crc kubenswrapper[4939]: I0318 17:29:51.134097 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:29:51 crc kubenswrapper[4939]: E0318 17:29:51.134403 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:29:52 crc kubenswrapper[4939]: I0318 17:29:52.152974 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a209284b-60ea-46b1-844d-e757771e0567" path="/var/lib/kubelet/pods/a209284b-60ea-46b1-844d-e757771e0567/volumes" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.157863 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq"] Mar 18 17:30:00 crc kubenswrapper[4939]: E0318 17:30:00.159192 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="extract-content" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.159223 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="extract-content" Mar 18 17:30:00 crc kubenswrapper[4939]: E0318 17:30:00.159272 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="extract-utilities" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.159285 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="extract-utilities" Mar 18 17:30:00 crc kubenswrapper[4939]: E0318 17:30:00.159328 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="registry-server" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.159341 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="registry-server" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.159738 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e74910c-57a2-4386-aa23-cbd796328cb5" containerName="registry-server" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.161012 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.162830 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.162936 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.170683 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564250-cllmx"] Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.172456 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564250-cllmx" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.174037 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.176807 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.176975 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.189364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-config-volume\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.189471 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vthzl\" (UniqueName: \"kubernetes.io/projected/b36e7d38-a4d7-443b-a5b4-8e18d037476a-kube-api-access-vthzl\") pod \"auto-csr-approver-29564250-cllmx\" (UID: \"b36e7d38-a4d7-443b-a5b4-8e18d037476a\") " pod="openshift-infra/auto-csr-approver-29564250-cllmx" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.189565 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngkf\" (UniqueName: \"kubernetes.io/projected/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-kube-api-access-kngkf\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.189718 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-secret-volume\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.198231 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq"] Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.209685 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564250-cllmx"] Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.290705 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kngkf\" (UniqueName: \"kubernetes.io/projected/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-kube-api-access-kngkf\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.290840 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-secret-volume\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.291035 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-config-volume\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.291092 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vthzl\" (UniqueName: \"kubernetes.io/projected/b36e7d38-a4d7-443b-a5b4-8e18d037476a-kube-api-access-vthzl\") pod \"auto-csr-approver-29564250-cllmx\" (UID: \"b36e7d38-a4d7-443b-a5b4-8e18d037476a\") " pod="openshift-infra/auto-csr-approver-29564250-cllmx" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.291887 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-config-volume\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.298673 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-secret-volume\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.310546 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vthzl\" (UniqueName: \"kubernetes.io/projected/b36e7d38-a4d7-443b-a5b4-8e18d037476a-kube-api-access-vthzl\") pod \"auto-csr-approver-29564250-cllmx\" (UID: \"b36e7d38-a4d7-443b-a5b4-8e18d037476a\") " pod="openshift-infra/auto-csr-approver-29564250-cllmx" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.310758 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngkf\" (UniqueName: \"kubernetes.io/projected/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-kube-api-access-kngkf\") pod \"collect-profiles-29564250-dp5sq\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.491983 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.510051 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564250-cllmx" Mar 18 17:30:00 crc kubenswrapper[4939]: I0318 17:30:00.961412 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq"] Mar 18 17:30:01 crc kubenswrapper[4939]: I0318 17:30:01.032887 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" event={"ID":"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2","Type":"ContainerStarted","Data":"49fae3f3e9d363083b5c70756bff0d078190cc6bf58c4c20c5b678ae5077acf5"} Mar 18 17:30:01 crc kubenswrapper[4939]: I0318 17:30:01.066423 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564250-cllmx"] Mar 18 17:30:01 crc kubenswrapper[4939]: W0318 17:30:01.069645 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb36e7d38_a4d7_443b_a5b4_8e18d037476a.slice/crio-09c03bc3fd24e060dfb468933d1ec9bb8b094f82238a535436bece98e0a32d29 WatchSource:0}: Error finding container 09c03bc3fd24e060dfb468933d1ec9bb8b094f82238a535436bece98e0a32d29: Status 404 returned error can't find the container with id 09c03bc3fd24e060dfb468933d1ec9bb8b094f82238a535436bece98e0a32d29 Mar 18 17:30:02 crc kubenswrapper[4939]: I0318 17:30:02.045893 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564250-cllmx" event={"ID":"b36e7d38-a4d7-443b-a5b4-8e18d037476a","Type":"ContainerStarted","Data":"09c03bc3fd24e060dfb468933d1ec9bb8b094f82238a535436bece98e0a32d29"} Mar 18 17:30:02 crc kubenswrapper[4939]: I0318 17:30:02.048745 4939 generic.go:334] "Generic (PLEG): container finished" podID="e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" containerID="044fd6eadf5968fac6fb8d64214674386444b57560c8399c1c6016759c71f945" exitCode=0 Mar 18 17:30:02 crc kubenswrapper[4939]: I0318 17:30:02.048813 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" event={"ID":"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2","Type":"ContainerDied","Data":"044fd6eadf5968fac6fb8d64214674386444b57560c8399c1c6016759c71f945"} Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.146264 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:30:03 crc kubenswrapper[4939]: E0318 17:30:03.147043 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.498461 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.565222 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-secret-volume\") pod \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.565679 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-config-volume\") pod \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.565767 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kngkf\" (UniqueName: \"kubernetes.io/projected/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-kube-api-access-kngkf\") pod \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\" (UID: \"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2\") " Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.567172 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" (UID: "e7263eb2-dd4c-4cf5-ac7b-eae0748723e2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.571689 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" (UID: "e7263eb2-dd4c-4cf5-ac7b-eae0748723e2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.571935 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-kube-api-access-kngkf" (OuterVolumeSpecName: "kube-api-access-kngkf") pod "e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" (UID: "e7263eb2-dd4c-4cf5-ac7b-eae0748723e2"). InnerVolumeSpecName "kube-api-access-kngkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.668439 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kngkf\" (UniqueName: \"kubernetes.io/projected/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-kube-api-access-kngkf\") on node \"crc\" DevicePath \"\"" Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.668496 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:30:03 crc kubenswrapper[4939]: I0318 17:30:03.668529 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:30:04 crc kubenswrapper[4939]: I0318 17:30:04.071883 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" event={"ID":"e7263eb2-dd4c-4cf5-ac7b-eae0748723e2","Type":"ContainerDied","Data":"49fae3f3e9d363083b5c70756bff0d078190cc6bf58c4c20c5b678ae5077acf5"} Mar 18 17:30:04 crc kubenswrapper[4939]: I0318 17:30:04.072148 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49fae3f3e9d363083b5c70756bff0d078190cc6bf58c4c20c5b678ae5077acf5" Mar 18 17:30:04 crc kubenswrapper[4939]: I0318 17:30:04.071961 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq" Mar 18 17:30:04 crc kubenswrapper[4939]: I0318 17:30:04.570351 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6"] Mar 18 17:30:04 crc kubenswrapper[4939]: I0318 17:30:04.579165 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-dcsh6"] Mar 18 17:30:05 crc kubenswrapper[4939]: I0318 17:30:05.085975 4939 generic.go:334] "Generic (PLEG): container finished" podID="b36e7d38-a4d7-443b-a5b4-8e18d037476a" containerID="b50ad7e4106325b5ac7c87fabb559eaa3075126fd28e82cbcc276bbe7ad92bb5" exitCode=0 Mar 18 17:30:05 crc kubenswrapper[4939]: I0318 17:30:05.086034 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564250-cllmx" event={"ID":"b36e7d38-a4d7-443b-a5b4-8e18d037476a","Type":"ContainerDied","Data":"b50ad7e4106325b5ac7c87fabb559eaa3075126fd28e82cbcc276bbe7ad92bb5"} Mar 18 17:30:06 crc kubenswrapper[4939]: I0318 17:30:06.149185 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f14f20-580e-4ceb-8973-48517454057a" path="/var/lib/kubelet/pods/e9f14f20-580e-4ceb-8973-48517454057a/volumes" Mar 18 17:30:06 crc kubenswrapper[4939]: I0318 17:30:06.564736 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564250-cllmx" Mar 18 17:30:06 crc kubenswrapper[4939]: I0318 17:30:06.642291 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vthzl\" (UniqueName: \"kubernetes.io/projected/b36e7d38-a4d7-443b-a5b4-8e18d037476a-kube-api-access-vthzl\") pod \"b36e7d38-a4d7-443b-a5b4-8e18d037476a\" (UID: \"b36e7d38-a4d7-443b-a5b4-8e18d037476a\") " Mar 18 17:30:06 crc kubenswrapper[4939]: I0318 17:30:06.647258 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36e7d38-a4d7-443b-a5b4-8e18d037476a-kube-api-access-vthzl" (OuterVolumeSpecName: "kube-api-access-vthzl") pod "b36e7d38-a4d7-443b-a5b4-8e18d037476a" (UID: "b36e7d38-a4d7-443b-a5b4-8e18d037476a"). InnerVolumeSpecName "kube-api-access-vthzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:30:06 crc kubenswrapper[4939]: I0318 17:30:06.744460 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vthzl\" (UniqueName: \"kubernetes.io/projected/b36e7d38-a4d7-443b-a5b4-8e18d037476a-kube-api-access-vthzl\") on node \"crc\" DevicePath \"\"" Mar 18 17:30:07 crc kubenswrapper[4939]: I0318 17:30:07.106876 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564250-cllmx" event={"ID":"b36e7d38-a4d7-443b-a5b4-8e18d037476a","Type":"ContainerDied","Data":"09c03bc3fd24e060dfb468933d1ec9bb8b094f82238a535436bece98e0a32d29"} Mar 18 17:30:07 crc kubenswrapper[4939]: I0318 17:30:07.106930 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c03bc3fd24e060dfb468933d1ec9bb8b094f82238a535436bece98e0a32d29" Mar 18 17:30:07 crc kubenswrapper[4939]: I0318 17:30:07.106952 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564250-cllmx" Mar 18 17:30:07 crc kubenswrapper[4939]: I0318 17:30:07.628183 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564244-kgbhm"] Mar 18 17:30:07 crc kubenswrapper[4939]: I0318 17:30:07.640137 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564244-kgbhm"] Mar 18 17:30:08 crc kubenswrapper[4939]: I0318 17:30:08.150043 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c090d7-9f74-40fa-8ec5-e0364fce49ae" path="/var/lib/kubelet/pods/89c090d7-9f74-40fa-8ec5-e0364fce49ae/volumes" Mar 18 17:30:09 crc kubenswrapper[4939]: I0318 17:30:09.133628 4939 scope.go:117] "RemoveContainer" containerID="fa0c1678f09a5f9b844e59b9283456e7a6118df5773990a4ec5f89f1859508d0" Mar 18 17:30:09 crc kubenswrapper[4939]: I0318 17:30:09.186636 4939 scope.go:117] "RemoveContainer" containerID="4dc4b2cb200986b8bad6585d1f4f7611a450ac3e090271a684a92aa884828d14" Mar 18 17:30:09 crc kubenswrapper[4939]: I0318 17:30:09.209622 4939 scope.go:117] "RemoveContainer" containerID="d2b7ad8427f9414749dbbb15e7e745438a56bc58b4827539a0afdaab07db36ba" Mar 18 17:30:09 crc kubenswrapper[4939]: I0318 17:30:09.278468 4939 scope.go:117] "RemoveContainer" containerID="a6982fe013da4425b9ee6a75b0bfa75b90b4f0be0766fdccd67e91f0b6ea13f7" Mar 18 17:30:09 crc kubenswrapper[4939]: I0318 17:30:09.351360 4939 scope.go:117] "RemoveContainer" containerID="39150643f63c7adf6f50ecac7c87ed326d9b359fc2760be370790a02e634573f" Mar 18 17:30:17 crc kubenswrapper[4939]: I0318 17:30:17.133401 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:30:17 crc kubenswrapper[4939]: E0318 17:30:17.134342 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:30:31 crc kubenswrapper[4939]: I0318 17:30:31.133569 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:30:31 crc kubenswrapper[4939]: E0318 17:30:31.134243 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:30:45 crc kubenswrapper[4939]: I0318 17:30:45.133355 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:30:45 crc kubenswrapper[4939]: E0318 17:30:45.133983 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" 
podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:30:58 crc kubenswrapper[4939]: I0318 17:30:58.134019 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:30:58 crc kubenswrapper[4939]: E0318 17:30:58.134822 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.196710 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k289d"] Mar 18 17:31:05 crc kubenswrapper[4939]: E0318 17:31:05.197791 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" containerName="collect-profiles" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.197805 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" containerName="collect-profiles" Mar 18 17:31:05 crc kubenswrapper[4939]: E0318 17:31:05.197818 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36e7d38-a4d7-443b-a5b4-8e18d037476a" containerName="oc" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.197825 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36e7d38-a4d7-443b-a5b4-8e18d037476a" containerName="oc" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.198060 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36e7d38-a4d7-443b-a5b4-8e18d037476a" containerName="oc" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.198079 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" containerName="collect-profiles" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.200181 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.216457 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k289d"] Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.293226 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-utilities\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.293387 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-catalog-content\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.293409 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rft9d\" (UniqueName: \"kubernetes.io/projected/14586ab2-893a-4230-8ba8-2d91be78675d-kube-api-access-rft9d\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.395219 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-catalog-content\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.395271 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rft9d\" (UniqueName: \"kubernetes.io/projected/14586ab2-893a-4230-8ba8-2d91be78675d-kube-api-access-rft9d\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.395402 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-utilities\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.395898 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-utilities\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.396018 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-catalog-content\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.417787 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rft9d\" (UniqueName: \"kubernetes.io/projected/14586ab2-893a-4230-8ba8-2d91be78675d-kube-api-access-rft9d\") pod \"redhat-operators-k289d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:05 crc kubenswrapper[4939]: I0318 17:31:05.569784 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:06 crc kubenswrapper[4939]: I0318 17:31:06.059299 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k289d"] Mar 18 17:31:06 crc kubenswrapper[4939]: I0318 17:31:06.818295 4939 generic.go:334] "Generic (PLEG): container finished" podID="14586ab2-893a-4230-8ba8-2d91be78675d" containerID="5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff" exitCode=0 Mar 18 17:31:06 crc kubenswrapper[4939]: I0318 17:31:06.818404 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k289d" event={"ID":"14586ab2-893a-4230-8ba8-2d91be78675d","Type":"ContainerDied","Data":"5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff"} Mar 18 17:31:06 crc kubenswrapper[4939]: I0318 17:31:06.818658 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k289d" event={"ID":"14586ab2-893a-4230-8ba8-2d91be78675d","Type":"ContainerStarted","Data":"bca4f9bede233a44ee739b21af1c29365c38eb5dfa10c32bd8684199c635e1a9"} Mar 18 17:31:08 crc kubenswrapper[4939]: I0318 17:31:08.836488 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k289d" event={"ID":"14586ab2-893a-4230-8ba8-2d91be78675d","Type":"ContainerStarted","Data":"ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561"} Mar 18 17:31:10 crc kubenswrapper[4939]: I0318 17:31:10.133271 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:31:10 crc kubenswrapper[4939]: E0318 17:31:10.133796 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:31:13 crc kubenswrapper[4939]: I0318 17:31:13.889712 4939 generic.go:334] "Generic (PLEG): container finished" podID="14586ab2-893a-4230-8ba8-2d91be78675d" containerID="ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561" exitCode=0 Mar 18 17:31:13 crc kubenswrapper[4939]: I0318 17:31:13.889800 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k289d" event={"ID":"14586ab2-893a-4230-8ba8-2d91be78675d","Type":"ContainerDied","Data":"ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561"} Mar 18 17:31:20 crc kubenswrapper[4939]: I0318 17:31:20.952696 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k289d" event={"ID":"14586ab2-893a-4230-8ba8-2d91be78675d","Type":"ContainerStarted","Data":"d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f"} Mar 18 17:31:20 crc kubenswrapper[4939]: I0318 17:31:20.980554 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-k289d" podStartSLOduration=2.868601819 podStartE2EDuration="15.980515463s" podCreationTimestamp="2026-03-18 17:31:05 +0000 UTC" firstStartedPulling="2026-03-18 17:31:06.82028544 +0000 UTC m=+6831.419473061" lastFinishedPulling="2026-03-18 17:31:19.932199084 +0000 UTC m=+6844.531386705" observedRunningTime="2026-03-18 17:31:20.969437128 +0000 UTC m=+6845.568624759" watchObservedRunningTime="2026-03-18 17:31:20.980515463 +0000 UTC m=+6845.579703084" Mar 18 17:31:25 crc kubenswrapper[4939]: I0318 17:31:25.134290 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:31:25 crc kubenswrapper[4939]: E0318 17:31:25.135231 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:31:25 crc kubenswrapper[4939]: I0318 17:31:25.570633 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:25 crc kubenswrapper[4939]: I0318 17:31:25.571008 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:26 crc kubenswrapper[4939]: I0318 17:31:26.621845 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k289d" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="registry-server" probeResult="failure" output=< Mar 18 17:31:26 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:31:26 crc kubenswrapper[4939]: > Mar 18 17:31:36 crc kubenswrapper[4939]: I0318 17:31:36.142842 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:31:36 crc kubenswrapper[4939]: E0318 17:31:36.144657 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:31:36 crc kubenswrapper[4939]: I0318 17:31:36.630213 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k289d" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="registry-server" probeResult="failure" output=< Mar 18 17:31:36 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:31:36 crc kubenswrapper[4939]: > Mar 18 17:31:46 crc kubenswrapper[4939]: I0318 17:31:46.617748 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k289d" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="registry-server" probeResult="failure" output=< Mar 18 17:31:46 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:31:46 crc kubenswrapper[4939]: > Mar 18 17:31:49 crc kubenswrapper[4939]: I0318 17:31:49.134559 4939 scope.go:117] 
"RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:31:49 crc kubenswrapper[4939]: E0318 17:31:49.135078 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:31:55 crc kubenswrapper[4939]: I0318 17:31:55.628136 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:55 crc kubenswrapper[4939]: I0318 17:31:55.682925 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:55 crc kubenswrapper[4939]: I0318 17:31:55.862620 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k289d"] Mar 18 17:31:57 crc kubenswrapper[4939]: I0318 17:31:57.311911 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k289d" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="registry-server" containerID="cri-o://d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f" gracePeriod=2 Mar 18 17:31:57 crc kubenswrapper[4939]: I0318 17:31:57.863854 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:57 crc kubenswrapper[4939]: I0318 17:31:57.942342 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rft9d\" (UniqueName: \"kubernetes.io/projected/14586ab2-893a-4230-8ba8-2d91be78675d-kube-api-access-rft9d\") pod \"14586ab2-893a-4230-8ba8-2d91be78675d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " Mar 18 17:31:57 crc kubenswrapper[4939]: I0318 17:31:57.942645 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-catalog-content\") pod \"14586ab2-893a-4230-8ba8-2d91be78675d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " Mar 18 17:31:57 crc kubenswrapper[4939]: I0318 17:31:57.942675 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-utilities\") pod \"14586ab2-893a-4230-8ba8-2d91be78675d\" (UID: \"14586ab2-893a-4230-8ba8-2d91be78675d\") " Mar 18 17:31:57 crc kubenswrapper[4939]: I0318 17:31:57.943617 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-utilities" (OuterVolumeSpecName: "utilities") pod "14586ab2-893a-4230-8ba8-2d91be78675d" (UID: "14586ab2-893a-4230-8ba8-2d91be78675d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:31:57 crc kubenswrapper[4939]: I0318 17:31:57.948352 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14586ab2-893a-4230-8ba8-2d91be78675d-kube-api-access-rft9d" (OuterVolumeSpecName: "kube-api-access-rft9d") pod "14586ab2-893a-4230-8ba8-2d91be78675d" (UID: "14586ab2-893a-4230-8ba8-2d91be78675d"). InnerVolumeSpecName "kube-api-access-rft9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.045048 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.045329 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rft9d\" (UniqueName: \"kubernetes.io/projected/14586ab2-893a-4230-8ba8-2d91be78675d-kube-api-access-rft9d\") on node \"crc\" DevicePath \"\"" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.097933 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14586ab2-893a-4230-8ba8-2d91be78675d" (UID: "14586ab2-893a-4230-8ba8-2d91be78675d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.147634 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14586ab2-893a-4230-8ba8-2d91be78675d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.323880 4939 generic.go:334] "Generic (PLEG): container finished" podID="14586ab2-893a-4230-8ba8-2d91be78675d" containerID="d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f" exitCode=0 Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.323922 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k289d" event={"ID":"14586ab2-893a-4230-8ba8-2d91be78675d","Type":"ContainerDied","Data":"d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f"} Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.323949 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k289d" event={"ID":"14586ab2-893a-4230-8ba8-2d91be78675d","Type":"ContainerDied","Data":"bca4f9bede233a44ee739b21af1c29365c38eb5dfa10c32bd8684199c635e1a9"} Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.323965 4939 scope.go:117] "RemoveContainer" containerID="d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.323970 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k289d" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.352487 4939 scope.go:117] "RemoveContainer" containerID="ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.354366 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k289d"] Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.365918 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k289d"] Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.373212 4939 scope.go:117] "RemoveContainer" containerID="5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.425470 4939 scope.go:117] "RemoveContainer" containerID="d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f" Mar 18 17:31:58 crc kubenswrapper[4939]: E0318 17:31:58.425949 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f\": container with ID starting with d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f not found: ID does not exist" containerID="d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.425977 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f"} err="failed to get container status \"d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f\": rpc error: code = NotFound desc = could not find container \"d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f\": container with ID starting with d3c685c8e851441577a21f89afd90a12b27bace833f2133ff11f99861c2f6f9f not found: ID does not exist" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.425995 4939 scope.go:117] "RemoveContainer" containerID="ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561" Mar 18 17:31:58 crc kubenswrapper[4939]: E0318 17:31:58.426522 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561\": container with ID starting with ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561 not found: ID does not exist" containerID="ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.426553 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561"} err="failed to get container status \"ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561\": rpc error: code = NotFound desc = could not find container \"ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561\": container with ID starting with ba8060db81fff196c3278c0364f4ff09a23fdf82d6f68f7fb7772cda42e7e561 not found: ID does not exist" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.426567 4939 scope.go:117] "RemoveContainer" containerID="5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff" Mar 18 17:31:58 crc kubenswrapper[4939]: E0318 17:31:58.426859 4939 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff\": container with ID starting with 5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff not found: ID does not exist" containerID="5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff" Mar 18 17:31:58 crc kubenswrapper[4939]: I0318 17:31:58.426932 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff"} err="failed to get container status \"5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff\": rpc error: code = NotFound desc = could not find container \"5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff\": container with ID starting with 5b0d9e21ed256bd3c38decd963a5720b434a92109b76e4f7496bab94c0c6d6ff not found: ID does not exist" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.150992 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" path="/var/lib/kubelet/pods/14586ab2-893a-4230-8ba8-2d91be78675d/volumes" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.171289 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564252-kwr9s"] Mar 18 17:32:00 crc kubenswrapper[4939]: E0318 17:32:00.171845 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="extract-content" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.171866 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="extract-content" Mar 18 17:32:00 crc kubenswrapper[4939]: E0318 17:32:00.171914 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="extract-utilities" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.171928 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="extract-utilities" Mar 18 17:32:00 crc kubenswrapper[4939]: E0318 17:32:00.171944 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="registry-server" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.171951 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="registry-server" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.172199 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="14586ab2-893a-4230-8ba8-2d91be78675d" containerName="registry-server" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.173113 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564252-kwr9s" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.175529 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.175755 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.175878 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.291728 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xptm\" (UniqueName: \"kubernetes.io/projected/b9ee94aa-72e5-42cb-8829-d2655220e323-kube-api-access-6xptm\") pod \"auto-csr-approver-29564252-kwr9s\" (UID: \"b9ee94aa-72e5-42cb-8829-d2655220e323\") " pod="openshift-infra/auto-csr-approver-29564252-kwr9s" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.394763 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xptm\" (UniqueName: \"kubernetes.io/projected/b9ee94aa-72e5-42cb-8829-d2655220e323-kube-api-access-6xptm\") pod \"auto-csr-approver-29564252-kwr9s\" (UID: \"b9ee94aa-72e5-42cb-8829-d2655220e323\") " pod="openshift-infra/auto-csr-approver-29564252-kwr9s" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.401297 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564252-kwr9s"] Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.426329 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xptm\" (UniqueName: \"kubernetes.io/projected/b9ee94aa-72e5-42cb-8829-d2655220e323-kube-api-access-6xptm\") pod \"auto-csr-approver-29564252-kwr9s\" (UID: \"b9ee94aa-72e5-42cb-8829-d2655220e323\") " pod="openshift-infra/auto-csr-approver-29564252-kwr9s" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.494912 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564252-kwr9s" Mar 18 17:32:00 crc kubenswrapper[4939]: I0318 17:32:00.976255 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564252-kwr9s"] Mar 18 17:32:01 crc kubenswrapper[4939]: I0318 17:32:01.357536 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564252-kwr9s" event={"ID":"b9ee94aa-72e5-42cb-8829-d2655220e323","Type":"ContainerStarted","Data":"a19cde81a2b342d4c832a1ef51bf6854d5099769472a0c02b8c23f0ef4708000"} Mar 18 17:32:03 crc kubenswrapper[4939]: I0318 17:32:03.133532 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:32:03 crc kubenswrapper[4939]: E0318 17:32:03.134213 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:32:03 crc kubenswrapper[4939]: I0318 17:32:03.380002 4939 generic.go:334] "Generic (PLEG): container finished" podID="b9ee94aa-72e5-42cb-8829-d2655220e323" containerID="09103d1ec358baefe85ad2abd8f2e9ad669d71046163db755938c035c38e8185" exitCode=0 Mar 18 17:32:03 crc kubenswrapper[4939]: I0318 17:32:03.380086 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564252-kwr9s" event={"ID":"b9ee94aa-72e5-42cb-8829-d2655220e323","Type":"ContainerDied","Data":"09103d1ec358baefe85ad2abd8f2e9ad669d71046163db755938c035c38e8185"} Mar 18 17:32:04 crc kubenswrapper[4939]: I0318 17:32:04.779346 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564252-kwr9s" Mar 18 17:32:04 crc kubenswrapper[4939]: I0318 17:32:04.896376 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xptm\" (UniqueName: \"kubernetes.io/projected/b9ee94aa-72e5-42cb-8829-d2655220e323-kube-api-access-6xptm\") pod \"b9ee94aa-72e5-42cb-8829-d2655220e323\" (UID: \"b9ee94aa-72e5-42cb-8829-d2655220e323\") " Mar 18 17:32:04 crc kubenswrapper[4939]: I0318 17:32:04.909569 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ee94aa-72e5-42cb-8829-d2655220e323-kube-api-access-6xptm" (OuterVolumeSpecName: "kube-api-access-6xptm") pod "b9ee94aa-72e5-42cb-8829-d2655220e323" (UID: "b9ee94aa-72e5-42cb-8829-d2655220e323"). InnerVolumeSpecName "kube-api-access-6xptm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:32:04 crc kubenswrapper[4939]: I0318 17:32:04.999904 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xptm\" (UniqueName: \"kubernetes.io/projected/b9ee94aa-72e5-42cb-8829-d2655220e323-kube-api-access-6xptm\") on node \"crc\" DevicePath \"\"" Mar 18 17:32:05 crc kubenswrapper[4939]: I0318 17:32:05.402071 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564252-kwr9s" event={"ID":"b9ee94aa-72e5-42cb-8829-d2655220e323","Type":"ContainerDied","Data":"a19cde81a2b342d4c832a1ef51bf6854d5099769472a0c02b8c23f0ef4708000"} Mar 18 17:32:05 crc kubenswrapper[4939]: I0318 17:32:05.402446 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19cde81a2b342d4c832a1ef51bf6854d5099769472a0c02b8c23f0ef4708000" Mar 18 17:32:05 crc kubenswrapper[4939]: I0318 17:32:05.402107 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564252-kwr9s" Mar 18 17:32:05 crc kubenswrapper[4939]: I0318 17:32:05.849454 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564246-jrjv8"] Mar 18 17:32:05 crc kubenswrapper[4939]: I0318 17:32:05.858772 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564246-jrjv8"] Mar 18 17:32:06 crc kubenswrapper[4939]: I0318 17:32:06.153705 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1aa391-1e5e-4b89-a2ec-85a986c66474" path="/var/lib/kubelet/pods/6b1aa391-1e5e-4b89-a2ec-85a986c66474/volumes" Mar 18 17:32:09 crc kubenswrapper[4939]: I0318 17:32:09.552489 4939 scope.go:117] "RemoveContainer" containerID="4a3da4c6bd09504c037f462a528942cc5c5f73b7f29be700adf4cbbb06794a93" Mar 18 17:32:18 crc kubenswrapper[4939]: I0318 17:32:18.133997 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:32:18 crc kubenswrapper[4939]: E0318 17:32:18.134936 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:32:28 crc kubenswrapper[4939]: I0318 17:32:28.049837 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-36a9-account-create-update-7xbnx"] Mar 18 17:32:28 crc kubenswrapper[4939]: I0318 17:32:28.079644 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-fgrm6"] Mar 18 17:32:28 crc kubenswrapper[4939]: I0318 17:32:28.079715 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-fgrm6"] Mar 18 17:32:28 crc kubenswrapper[4939]: I0318 17:32:28.090034 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-36a9-account-create-update-7xbnx"] Mar 18 17:32:28 crc kubenswrapper[4939]: I0318 17:32:28.144859 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a70d715-2f49-4a60-9d49-5ee1808f5d89" path="/var/lib/kubelet/pods/1a70d715-2f49-4a60-9d49-5ee1808f5d89/volumes" Mar 18 17:32:28 crc kubenswrapper[4939]: I0318 17:32:28.147657 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="3eb07a0b-8d95-4512-bf1a-ab64f702f8e7" path="/var/lib/kubelet/pods/3eb07a0b-8d95-4512-bf1a-ab64f702f8e7/volumes" Mar 18 17:32:31 crc kubenswrapper[4939]: I0318 17:32:31.133428 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:32:31 crc kubenswrapper[4939]: E0318 17:32:31.133952 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:32:39 crc kubenswrapper[4939]: I0318 17:32:39.029643 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-f7rb6"] Mar 18 17:32:39 crc kubenswrapper[4939]: I0318 17:32:39.040350 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-f7rb6"] Mar 18 17:32:40 crc kubenswrapper[4939]: I0318 17:32:40.148031 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a068ef28-b3de-4a14-a8d2-3cf38f55864d" path="/var/lib/kubelet/pods/a068ef28-b3de-4a14-a8d2-3cf38f55864d/volumes" Mar 18 17:32:43 crc kubenswrapper[4939]: I0318 17:32:43.134382 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:32:43 crc kubenswrapper[4939]: E0318 17:32:43.135595 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:32:55 crc kubenswrapper[4939]: I0318 17:32:55.133934 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:32:55 crc kubenswrapper[4939]: I0318 17:32:55.915818 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"4f04c4546d8bda34fa5fcebc419a15e6b503f4f824dddf0115d6cc84bb5f6755"} Mar 18 17:33:01 crc kubenswrapper[4939]: I0318 17:33:01.040212 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-wtjcn"] Mar 18 17:33:01 crc kubenswrapper[4939]: I0318 17:33:01.050130 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-96ca-account-create-update-gxftp"] Mar 18 17:33:01 crc kubenswrapper[4939]: I0318 17:33:01.059967 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-wtjcn"] Mar 18 17:33:01 crc kubenswrapper[4939]: I0318 17:33:01.068063 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-96ca-account-create-update-gxftp"] Mar 18 17:33:02 crc kubenswrapper[4939]: I0318 17:33:02.147381 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d97d157-854d-423d-b4b2-bf11a05cf07b" path="/var/lib/kubelet/pods/2d97d157-854d-423d-b4b2-bf11a05cf07b/volumes" Mar 18 17:33:02 crc kubenswrapper[4939]: I0318 17:33:02.148745 4939 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dad903b-41e6-4f83-8e55-a44e1a891716" path="/var/lib/kubelet/pods/2dad903b-41e6-4f83-8e55-a44e1a891716/volumes" Mar 18 17:33:09 crc kubenswrapper[4939]: I0318 17:33:09.643806 4939 scope.go:117] "RemoveContainer" containerID="1b9758729629f7b84aabbee7890c61d2186e75c3641b70640c0d717165352ce9" Mar 18 17:33:09 crc kubenswrapper[4939]: I0318 17:33:09.664962 4939 scope.go:117] "RemoveContainer" containerID="9404162434f281a43876a4a06d6727b558578c242dca13e25a0eae60ce360e94" Mar 18 17:33:09 crc kubenswrapper[4939]: I0318 17:33:09.719212 4939 scope.go:117] "RemoveContainer" containerID="9c2a599b733f3b5cb6a83cf83d76dc3970eee464e07d0bd322d2008b2036d9d0" Mar 18 17:33:09 crc kubenswrapper[4939]: I0318 17:33:09.763193 4939 scope.go:117] "RemoveContainer" containerID="e0f84934e34c1a2e41bc5c44c5776b1c44c7c245eb806e004ae50fa573502fe7" Mar 18 17:33:09 crc kubenswrapper[4939]: I0318 17:33:09.808038 4939 scope.go:117] "RemoveContainer" containerID="3c9ee260eada23c451002c6c6b412b1d665e989d923a42b56867f893fb0ff438" Mar 18 17:33:13 crc kubenswrapper[4939]: I0318 17:33:13.030216 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-cfzqj"] Mar 18 17:33:13 crc kubenswrapper[4939]: I0318 17:33:13.039639 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-cfzqj"] Mar 18 17:33:14 crc kubenswrapper[4939]: I0318 17:33:14.143400 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d98593-3990-4191-8867-efe09b71260f" path="/var/lib/kubelet/pods/32d98593-3990-4191-8867-efe09b71260f/volumes" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.150373 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564254-khfb2"] Mar 18 17:34:00 crc kubenswrapper[4939]: E0318 17:34:00.151311 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ee94aa-72e5-42cb-8829-d2655220e323" containerName="oc" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.151327 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ee94aa-72e5-42cb-8829-d2655220e323" containerName="oc" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.151558 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ee94aa-72e5-42cb-8829-d2655220e323" containerName="oc" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.152393 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564254-khfb2" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.157447 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.157716 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.157832 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.165350 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564254-khfb2"] Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.323633 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7lzs\" (UniqueName: \"kubernetes.io/projected/f3c2ddec-2c0d-40cf-affe-104c66360d2f-kube-api-access-k7lzs\") pod \"auto-csr-approver-29564254-khfb2\" (UID: \"f3c2ddec-2c0d-40cf-affe-104c66360d2f\") " pod="openshift-infra/auto-csr-approver-29564254-khfb2" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.425776 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7lzs\" (UniqueName: \"kubernetes.io/projected/f3c2ddec-2c0d-40cf-affe-104c66360d2f-kube-api-access-k7lzs\") pod \"auto-csr-approver-29564254-khfb2\" (UID: \"f3c2ddec-2c0d-40cf-affe-104c66360d2f\") " pod="openshift-infra/auto-csr-approver-29564254-khfb2" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.444122 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7lzs\" (UniqueName: \"kubernetes.io/projected/f3c2ddec-2c0d-40cf-affe-104c66360d2f-kube-api-access-k7lzs\") pod \"auto-csr-approver-29564254-khfb2\" (UID: \"f3c2ddec-2c0d-40cf-affe-104c66360d2f\") " pod="openshift-infra/auto-csr-approver-29564254-khfb2" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.478607 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564254-khfb2" Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.932132 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564254-khfb2"] Mar 18 17:34:00 crc kubenswrapper[4939]: I0318 17:34:00.943596 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:34:01 crc kubenswrapper[4939]: I0318 17:34:01.566422 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564254-khfb2" event={"ID":"f3c2ddec-2c0d-40cf-affe-104c66360d2f","Type":"ContainerStarted","Data":"5e39fb41e08e32b39fa59b47c3faa2211fa9ebe4486b5851e5e9c3d3da8a7dd8"} Mar 18 17:34:02 crc kubenswrapper[4939]: I0318 17:34:02.578048 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564254-khfb2" event={"ID":"f3c2ddec-2c0d-40cf-affe-104c66360d2f","Type":"ContainerStarted","Data":"0ef49c9224a2ade33709114c3e6dd825211fa2c970a252bd76088c51e5f12833"} Mar 18 17:34:02 crc kubenswrapper[4939]: I0318 17:34:02.606268 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564254-khfb2" podStartSLOduration=1.360933581 podStartE2EDuration="2.606245642s" podCreationTimestamp="2026-03-18 17:34:00 +0000 UTC" firstStartedPulling="2026-03-18 17:34:00.943317863 +0000 UTC m=+7005.542505484" lastFinishedPulling="2026-03-18 17:34:02.188629924 +0000 UTC m=+7006.787817545" observedRunningTime="2026-03-18 17:34:02.593086904 +0000 UTC m=+7007.192274535" watchObservedRunningTime="2026-03-18 17:34:02.606245642 +0000 UTC m=+7007.205433273" Mar 18 17:34:03 crc kubenswrapper[4939]: I0318 17:34:03.588054 4939 generic.go:334] "Generic (PLEG): container finished" podID="f3c2ddec-2c0d-40cf-affe-104c66360d2f" containerID="0ef49c9224a2ade33709114c3e6dd825211fa2c970a252bd76088c51e5f12833" exitCode=0 Mar 18 17:34:03 crc kubenswrapper[4939]: I0318 17:34:03.588103 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564254-khfb2" event={"ID":"f3c2ddec-2c0d-40cf-affe-104c66360d2f","Type":"ContainerDied","Data":"0ef49c9224a2ade33709114c3e6dd825211fa2c970a252bd76088c51e5f12833"} Mar 18 17:34:04 crc kubenswrapper[4939]: I0318 17:34:04.992744 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564254-khfb2" Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.027192 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7lzs\" (UniqueName: \"kubernetes.io/projected/f3c2ddec-2c0d-40cf-affe-104c66360d2f-kube-api-access-k7lzs\") pod \"f3c2ddec-2c0d-40cf-affe-104c66360d2f\" (UID: \"f3c2ddec-2c0d-40cf-affe-104c66360d2f\") " Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.048878 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c2ddec-2c0d-40cf-affe-104c66360d2f-kube-api-access-k7lzs" (OuterVolumeSpecName: "kube-api-access-k7lzs") pod "f3c2ddec-2c0d-40cf-affe-104c66360d2f" (UID: "f3c2ddec-2c0d-40cf-affe-104c66360d2f"). InnerVolumeSpecName "kube-api-access-k7lzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.129628 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7lzs\" (UniqueName: \"kubernetes.io/projected/f3c2ddec-2c0d-40cf-affe-104c66360d2f-kube-api-access-k7lzs\") on node \"crc\" DevicePath \"\"" Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.608293 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564254-khfb2" event={"ID":"f3c2ddec-2c0d-40cf-affe-104c66360d2f","Type":"ContainerDied","Data":"5e39fb41e08e32b39fa59b47c3faa2211fa9ebe4486b5851e5e9c3d3da8a7dd8"} Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.609699 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e39fb41e08e32b39fa59b47c3faa2211fa9ebe4486b5851e5e9c3d3da8a7dd8" Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.609021 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564254-khfb2" Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.666312 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564248-q4vf4"] Mar 18 17:34:05 crc kubenswrapper[4939]: I0318 17:34:05.674466 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564248-q4vf4"] Mar 18 17:34:06 crc kubenswrapper[4939]: I0318 17:34:06.152190 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17abc635-1d15-48cc-83c6-4d727c0a4f99" path="/var/lib/kubelet/pods/17abc635-1d15-48cc-83c6-4d727c0a4f99/volumes" Mar 18 17:34:09 crc kubenswrapper[4939]: I0318 17:34:09.980618 4939 scope.go:117] "RemoveContainer" containerID="db46dec8aec82f73b2a735c56653047770340dfd44d92584a36dc8500d43f111" Mar 18 17:34:10 crc kubenswrapper[4939]: I0318 17:34:10.055206 4939 scope.go:117] "RemoveContainer" containerID="ba315b549a98ce5b0a248e4759218c94d1b62b221e80b94be11c15c1a0c65bd8" Mar 18 17:35:23 crc kubenswrapper[4939]: I0318 17:35:23.687084 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:35:23 crc kubenswrapper[4939]: I0318 17:35:23.687624 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:35:53 crc kubenswrapper[4939]: I0318 17:35:53.687037 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:35:53 crc kubenswrapper[4939]: I0318 17:35:53.687682 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:36:00 crc 
Mar 18 17:36:00 crc kubenswrapper[4939]: E0318 17:36:00.170318 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c2ddec-2c0d-40cf-affe-104c66360d2f" containerName="oc"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.170399 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c2ddec-2c0d-40cf-affe-104c66360d2f" containerName="oc"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.170642 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c2ddec-2c0d-40cf-affe-104c66360d2f" containerName="oc"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.171414 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564256-zc59b"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.173543 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.174977 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.177593 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.189013 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564256-zc59b"]
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.264791 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgtxv\" (UniqueName: \"kubernetes.io/projected/0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4-kube-api-access-dgtxv\") pod \"auto-csr-approver-29564256-zc59b\" (UID: \"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4\") " pod="openshift-infra/auto-csr-approver-29564256-zc59b"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.368753 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgtxv\" (UniqueName: \"kubernetes.io/projected/0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4-kube-api-access-dgtxv\") pod \"auto-csr-approver-29564256-zc59b\" (UID: \"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4\") " pod="openshift-infra/auto-csr-approver-29564256-zc59b"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.390609 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgtxv\" (UniqueName: \"kubernetes.io/projected/0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4-kube-api-access-dgtxv\") pod \"auto-csr-approver-29564256-zc59b\" (UID: \"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4\") " pod="openshift-infra/auto-csr-approver-29564256-zc59b"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.491856 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564256-zc59b"
Mar 18 17:36:00 crc kubenswrapper[4939]: I0318 17:36:00.977860 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564256-zc59b"]
Mar 18 17:36:01 crc kubenswrapper[4939]: I0318 17:36:01.722376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564256-zc59b" event={"ID":"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4","Type":"ContainerStarted","Data":"d4a2e580c690322834506150073bb5718199ed1d957fb25e40c376eaa4edddb8"}
Mar 18 17:36:02 crc kubenswrapper[4939]: I0318 17:36:02.733062 4939 generic.go:334] "Generic (PLEG): container finished" podID="0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4" containerID="ef791a4a3d48426c8760e7a173541b3e8890d02e7566be1cbf69fd826a56e885" exitCode=0
Mar 18 17:36:02 crc kubenswrapper[4939]: I0318 17:36:02.733111 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564256-zc59b" event={"ID":"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4","Type":"ContainerDied","Data":"ef791a4a3d48426c8760e7a173541b3e8890d02e7566be1cbf69fd826a56e885"}
Mar 18 17:36:04 crc kubenswrapper[4939]: I0318 17:36:04.137615 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564256-zc59b"
Mar 18 17:36:04 crc kubenswrapper[4939]: I0318 17:36:04.154335 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgtxv\" (UniqueName: \"kubernetes.io/projected/0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4-kube-api-access-dgtxv\") pod \"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4\" (UID: \"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4\") "
Mar 18 17:36:04 crc kubenswrapper[4939]: I0318 17:36:04.159743 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4-kube-api-access-dgtxv" (OuterVolumeSpecName: "kube-api-access-dgtxv") pod "0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4" (UID: "0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4"). InnerVolumeSpecName "kube-api-access-dgtxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:36:04 crc kubenswrapper[4939]: I0318 17:36:04.257561 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgtxv\" (UniqueName: \"kubernetes.io/projected/0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4-kube-api-access-dgtxv\") on node \"crc\" DevicePath \"\""
Mar 18 17:36:04 crc kubenswrapper[4939]: I0318 17:36:04.755180 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564256-zc59b" event={"ID":"0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4","Type":"ContainerDied","Data":"d4a2e580c690322834506150073bb5718199ed1d957fb25e40c376eaa4edddb8"}
Mar 18 17:36:04 crc kubenswrapper[4939]: I0318 17:36:04.755454 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4a2e580c690322834506150073bb5718199ed1d957fb25e40c376eaa4edddb8"
Mar 18 17:36:04 crc kubenswrapper[4939]: I0318 17:36:04.755222 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564256-zc59b"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564256-zc59b" Mar 18 17:36:05 crc kubenswrapper[4939]: I0318 17:36:05.204925 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564250-cllmx"] Mar 18 17:36:05 crc kubenswrapper[4939]: I0318 17:36:05.213818 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564250-cllmx"] Mar 18 17:36:06 crc kubenswrapper[4939]: I0318 17:36:06.149238 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36e7d38-a4d7-443b-a5b4-8e18d037476a" path="/var/lib/kubelet/pods/b36e7d38-a4d7-443b-a5b4-8e18d037476a/volumes" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.187760 4939 scope.go:117] "RemoveContainer" containerID="b50ad7e4106325b5ac7c87fabb559eaa3075126fd28e82cbcc276bbe7ad92bb5" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.336121 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tqsd4"] Mar 18 17:36:10 crc kubenswrapper[4939]: E0318 17:36:10.336848 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4" containerName="oc" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.336870 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4" containerName="oc" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.337097 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4" containerName="oc" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.338578 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.358947 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqsd4"] Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.407736 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-utilities\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.408005 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-catalog-content\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.408280 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk22j\" (UniqueName: \"kubernetes.io/projected/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-kube-api-access-bk22j\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.510392 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk22j\" (UniqueName: \"kubernetes.io/projected/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-kube-api-access-bk22j\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " 
pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.510559 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-utilities\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.510683 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-catalog-content\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.511146 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-catalog-content\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.511402 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-utilities\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.530969 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk22j\" (UniqueName: \"kubernetes.io/projected/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-kube-api-access-bk22j\") pod \"redhat-marketplace-tqsd4\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:10 crc kubenswrapper[4939]: I0318 17:36:10.675676 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:11 crc kubenswrapper[4939]: I0318 17:36:11.188289 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqsd4"] Mar 18 17:36:11 crc kubenswrapper[4939]: I0318 17:36:11.821778 4939 generic.go:334] "Generic (PLEG): container finished" podID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerID="2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34" exitCode=0 Mar 18 17:36:11 crc kubenswrapper[4939]: I0318 17:36:11.822053 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqsd4" event={"ID":"ae446d3e-31a9-4ddf-be1a-584bbcaae99c","Type":"ContainerDied","Data":"2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34"} Mar 18 17:36:11 crc kubenswrapper[4939]: I0318 17:36:11.822081 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqsd4" event={"ID":"ae446d3e-31a9-4ddf-be1a-584bbcaae99c","Type":"ContainerStarted","Data":"0b17b3bb3565767516d8d8346567c047010fd0800143e56f911e714186f6c9ed"} Mar 18 17:36:13 crc kubenswrapper[4939]: I0318 17:36:13.866864 4939 generic.go:334] "Generic (PLEG): container finished" podID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerID="bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14" exitCode=0 Mar 18 17:36:13 crc kubenswrapper[4939]: I0318 17:36:13.867797 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqsd4" event={"ID":"ae446d3e-31a9-4ddf-be1a-584bbcaae99c","Type":"ContainerDied","Data":"bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14"} Mar 18 17:36:14 crc kubenswrapper[4939]: I0318 17:36:14.887553 4939 generic.go:334] "Generic (PLEG): container finished" podID="e8465fa8-f589-4135-b6bf-f278436d5326" containerID="7f2ddba3e031ebc985b7a6b065a6688162bf3c2f0dded49bf2fd55469dec0e1f" exitCode=0 Mar 18 17:36:14 crc kubenswrapper[4939]: I0318 17:36:14.887644 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" event={"ID":"e8465fa8-f589-4135-b6bf-f278436d5326","Type":"ContainerDied","Data":"7f2ddba3e031ebc985b7a6b065a6688162bf3c2f0dded49bf2fd55469dec0e1f"} Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.437640 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.571822 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ceph\") pod \"e8465fa8-f589-4135-b6bf-f278436d5326\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.571900 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-tripleo-cleanup-combined-ca-bundle\") pod \"e8465fa8-f589-4135-b6bf-f278436d5326\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.571931 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-inventory\") pod \"e8465fa8-f589-4135-b6bf-f278436d5326\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.572009 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwvv4\" (UniqueName: \"kubernetes.io/projected/e8465fa8-f589-4135-b6bf-f278436d5326-kube-api-access-vwvv4\") pod \"e8465fa8-f589-4135-b6bf-f278436d5326\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.572052 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ssh-key-openstack-cell1\") pod \"e8465fa8-f589-4135-b6bf-f278436d5326\" (UID: \"e8465fa8-f589-4135-b6bf-f278436d5326\") " Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.577294 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8465fa8-f589-4135-b6bf-f278436d5326-kube-api-access-vwvv4" (OuterVolumeSpecName: "kube-api-access-vwvv4") pod "e8465fa8-f589-4135-b6bf-f278436d5326" (UID: "e8465fa8-f589-4135-b6bf-f278436d5326"). InnerVolumeSpecName "kube-api-access-vwvv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.577933 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "e8465fa8-f589-4135-b6bf-f278436d5326" (UID: "e8465fa8-f589-4135-b6bf-f278436d5326"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.581774 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ceph" (OuterVolumeSpecName: "ceph") pod "e8465fa8-f589-4135-b6bf-f278436d5326" (UID: "e8465fa8-f589-4135-b6bf-f278436d5326"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.600391 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e8465fa8-f589-4135-b6bf-f278436d5326" (UID: "e8465fa8-f589-4135-b6bf-f278436d5326"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.605411 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-inventory" (OuterVolumeSpecName: "inventory") pod "e8465fa8-f589-4135-b6bf-f278436d5326" (UID: "e8465fa8-f589-4135-b6bf-f278436d5326"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.674479 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.674536 4939 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.674550 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.674565 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwvv4\" (UniqueName: \"kubernetes.io/projected/e8465fa8-f589-4135-b6bf-f278436d5326-kube-api-access-vwvv4\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.674579 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e8465fa8-f589-4135-b6bf-f278436d5326-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.906914 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" event={"ID":"e8465fa8-f589-4135-b6bf-f278436d5326","Type":"ContainerDied","Data":"f885f9f25ed34c95c98179feb793375d3536bb1b55bb99bd8dff33a21f4f36fb"} Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.906959 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f885f9f25ed34c95c98179feb793375d3536bb1b55bb99bd8dff33a21f4f36fb" Mar 18 17:36:16 crc kubenswrapper[4939]: I0318 17:36:16.906960 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5" Mar 18 17:36:18 crc kubenswrapper[4939]: I0318 17:36:18.938673 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqsd4" event={"ID":"ae446d3e-31a9-4ddf-be1a-584bbcaae99c","Type":"ContainerStarted","Data":"cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5"} Mar 18 17:36:18 crc kubenswrapper[4939]: I0318 17:36:18.971389 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tqsd4" podStartSLOduration=2.496798515 podStartE2EDuration="8.97136512s" podCreationTimestamp="2026-03-18 17:36:10 +0000 UTC" firstStartedPulling="2026-03-18 17:36:11.823258263 +0000 UTC m=+7136.422445884" lastFinishedPulling="2026-03-18 17:36:18.297824868 +0000 UTC m=+7142.897012489" observedRunningTime="2026-03-18 17:36:18.959838181 +0000 UTC m=+7143.559025802" watchObservedRunningTime="2026-03-18 17:36:18.97136512 +0000 UTC m=+7143.570552741" Mar 18 17:36:20 crc kubenswrapper[4939]: I0318 17:36:20.676485 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:20 crc kubenswrapper[4939]: I0318 17:36:20.677030 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:20 crc kubenswrapper[4939]: I0318 17:36:20.729117 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.687226 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.687594 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.687674 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.688603 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f04c4546d8bda34fa5fcebc419a15e6b503f4f824dddf0115d6cc84bb5f6755"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.688872 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://4f04c4546d8bda34fa5fcebc419a15e6b503f4f824dddf0115d6cc84bb5f6755" gracePeriod=600 Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.727959 4939 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/bootstrap-openstack-openstack-cell1-tk5r5"] Mar 18 17:36:23 crc kubenswrapper[4939]: E0318 17:36:23.728585 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8465fa8-f589-4135-b6bf-f278436d5326" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.728614 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8465fa8-f589-4135-b6bf-f278436d5326" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.728902 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8465fa8-f589-4135-b6bf-f278436d5326" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.729829 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.733216 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.733814 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.733980 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.734785 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.742547 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-tk5r5"] Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.832726 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mz8x\" (UniqueName: \"kubernetes.io/projected/44675ae9-2f87-4dff-bc11-602ed205461b-kube-api-access-6mz8x\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.832921 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-inventory\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.833094 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.833127 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.833421 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ceph\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.935200 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ceph\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.935343 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mz8x\" (UniqueName: \"kubernetes.io/projected/44675ae9-2f87-4dff-bc11-602ed205461b-kube-api-access-6mz8x\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.935808 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-inventory\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.935873 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.935902 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.943209 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.944150 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.945289 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ceph\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.945294 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-inventory\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.957439 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mz8x\" (UniqueName: \"kubernetes.io/projected/44675ae9-2f87-4dff-bc11-602ed205461b-kube-api-access-6mz8x\") pod \"bootstrap-openstack-openstack-cell1-tk5r5\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.992417 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="4f04c4546d8bda34fa5fcebc419a15e6b503f4f824dddf0115d6cc84bb5f6755" exitCode=0 Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.992463 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"4f04c4546d8bda34fa5fcebc419a15e6b503f4f824dddf0115d6cc84bb5f6755"} Mar 18 17:36:23 crc kubenswrapper[4939]: I0318 17:36:23.992521 4939 scope.go:117] "RemoveContainer" containerID="eca06e2be66dc702fe96f70817fdd74410ef4e4e9978f413b11ca55050a29115" Mar 18 17:36:24 crc kubenswrapper[4939]: I0318 17:36:24.053830 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:36:24 crc kubenswrapper[4939]: I0318 17:36:24.652312 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-tk5r5"] Mar 18 17:36:25 crc kubenswrapper[4939]: I0318 17:36:25.003826 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" event={"ID":"44675ae9-2f87-4dff-bc11-602ed205461b","Type":"ContainerStarted","Data":"ca357efd1f84fd3f5acf46c659a41c80a57968987509f3b10a2ae30af9014ca5"} Mar 18 17:36:25 crc kubenswrapper[4939]: I0318 17:36:25.006769 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73"} Mar 18 17:36:26 crc kubenswrapper[4939]: I0318 17:36:26.017727 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" event={"ID":"44675ae9-2f87-4dff-bc11-602ed205461b","Type":"ContainerStarted","Data":"ca86860614465c30a950bd15de0cb0e0469f921944acaccc09043daf35c6d11d"} Mar 18 17:36:26 crc kubenswrapper[4939]: I0318 17:36:26.041865 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" podStartSLOduration=2.862063016 podStartE2EDuration="3.041842137s" podCreationTimestamp="2026-03-18 17:36:23 +0000 UTC" firstStartedPulling="2026-03-18 17:36:24.654845606 +0000 UTC m=+7149.254033227" lastFinishedPulling="2026-03-18 17:36:24.834624727 +0000 UTC m=+7149.433812348" observedRunningTime="2026-03-18 17:36:26.036874357 +0000 UTC m=+7150.636061998" watchObservedRunningTime="2026-03-18 17:36:26.041842137 +0000 UTC m=+7150.641029768" Mar 18 17:36:30 crc kubenswrapper[4939]: I0318 17:36:30.734585 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:30 crc kubenswrapper[4939]: I0318 17:36:30.793208 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqsd4"] Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.081345 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tqsd4" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="registry-server" containerID="cri-o://cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5" gracePeriod=2 Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.651396 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.703630 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk22j\" (UniqueName: \"kubernetes.io/projected/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-kube-api-access-bk22j\") pod \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.704146 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-catalog-content\") pod \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.704217 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-utilities\") pod \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\" (UID: \"ae446d3e-31a9-4ddf-be1a-584bbcaae99c\") " Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.705388 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-utilities" (OuterVolumeSpecName: "utilities") pod "ae446d3e-31a9-4ddf-be1a-584bbcaae99c" (UID: "ae446d3e-31a9-4ddf-be1a-584bbcaae99c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.709492 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-kube-api-access-bk22j" (OuterVolumeSpecName: "kube-api-access-bk22j") pod "ae446d3e-31a9-4ddf-be1a-584bbcaae99c" (UID: "ae446d3e-31a9-4ddf-be1a-584bbcaae99c"). InnerVolumeSpecName "kube-api-access-bk22j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.735158 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae446d3e-31a9-4ddf-be1a-584bbcaae99c" (UID: "ae446d3e-31a9-4ddf-be1a-584bbcaae99c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.806736 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk22j\" (UniqueName: \"kubernetes.io/projected/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-kube-api-access-bk22j\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.806770 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:31 crc kubenswrapper[4939]: I0318 17:36:31.806780 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae446d3e-31a9-4ddf-be1a-584bbcaae99c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.091833 4939 generic.go:334] "Generic (PLEG): container finished" podID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerID="cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5" exitCode=0 Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.091873 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqsd4" event={"ID":"ae446d3e-31a9-4ddf-be1a-584bbcaae99c","Type":"ContainerDied","Data":"cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5"} Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.091897 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqsd4" event={"ID":"ae446d3e-31a9-4ddf-be1a-584bbcaae99c","Type":"ContainerDied","Data":"0b17b3bb3565767516d8d8346567c047010fd0800143e56f911e714186f6c9ed"} Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.091912 4939 scope.go:117] "RemoveContainer" containerID="cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.092019 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqsd4" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.122913 4939 scope.go:117] "RemoveContainer" containerID="bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.129186 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqsd4"] Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.145868 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqsd4"] Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.149316 4939 scope.go:117] "RemoveContainer" containerID="2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.191634 4939 scope.go:117] "RemoveContainer" containerID="cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5" Mar 18 17:36:32 crc kubenswrapper[4939]: E0318 17:36:32.192283 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5\": container with ID starting with cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5 not found: ID does not exist" containerID="cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.192340 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5"} err="failed to get container status \"cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5\": rpc error: code = NotFound desc = could not find container \"cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5\": container with ID starting with cf1646054cc008624ee802d017839f049905ff810282cfce2e806e24717a8ed5 not found: ID does not exist" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.192376 4939 scope.go:117] "RemoveContainer" containerID="bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14" Mar 18 17:36:32 crc kubenswrapper[4939]: E0318 17:36:32.193177 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14\": container with ID starting with bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14 not found: ID does not exist" containerID="bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.193230 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14"} err="failed to get container status \"bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14\": rpc error: code = NotFound desc = could not find container \"bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14\": container with ID starting with bd903a45633efeed06a4604fb378118219f0250c006bd6d7a9ab05295161aa14 not found: ID does not exist" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.193263 4939 scope.go:117] "RemoveContainer" containerID="2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34" Mar 18 17:36:32 crc kubenswrapper[4939]: E0318 17:36:32.193781 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34\": container with ID starting with 2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34 not found: ID does not exist" containerID="2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34" Mar 18 17:36:32 crc kubenswrapper[4939]: I0318 17:36:32.193808 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34"} err="failed to get container status \"2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34\": rpc error: code = NotFound desc = could not find container \"2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34\": container with ID starting with 2322317981b3bf46f926069e5584840231eeb915cfe0d38d299080831a0e3b34 not found: ID does not exist" Mar 18 17:36:34 crc kubenswrapper[4939]: I0318 17:36:34.151065 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" path="/var/lib/kubelet/pods/ae446d3e-31a9-4ddf-be1a-584bbcaae99c/volumes" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.152875 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564258-fbq82"] Mar 18 17:38:00 crc kubenswrapper[4939]: E0318 17:38:00.153828 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="registry-server" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.153847 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="registry-server" Mar 18 17:38:00 crc kubenswrapper[4939]: E0318 17:38:00.153859 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="extract-content" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.153868 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="extract-content" Mar 18 17:38:00 crc kubenswrapper[4939]: E0318 17:38:00.153888 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="extract-utilities" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.153897 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="extract-utilities" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.154224 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae446d3e-31a9-4ddf-be1a-584bbcaae99c" containerName="registry-server" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.155194 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564258-fbq82" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.156545 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564258-fbq82"] Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.163717 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.165732 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.165794 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.271885 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc78j\" (UniqueName: \"kubernetes.io/projected/b425342a-9f1f-4688-83c0-6a38872386e8-kube-api-access-hc78j\") pod \"auto-csr-approver-29564258-fbq82\" (UID: \"b425342a-9f1f-4688-83c0-6a38872386e8\") " pod="openshift-infra/auto-csr-approver-29564258-fbq82" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.374481 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc78j\" (UniqueName: \"kubernetes.io/projected/b425342a-9f1f-4688-83c0-6a38872386e8-kube-api-access-hc78j\") pod \"auto-csr-approver-29564258-fbq82\" (UID: \"b425342a-9f1f-4688-83c0-6a38872386e8\") " pod="openshift-infra/auto-csr-approver-29564258-fbq82" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.400115 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc78j\" (UniqueName: \"kubernetes.io/projected/b425342a-9f1f-4688-83c0-6a38872386e8-kube-api-access-hc78j\") pod \"auto-csr-approver-29564258-fbq82\" (UID: \"b425342a-9f1f-4688-83c0-6a38872386e8\") " pod="openshift-infra/auto-csr-approver-29564258-fbq82" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.483854 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564258-fbq82" Mar 18 17:38:00 crc kubenswrapper[4939]: I0318 17:38:00.952637 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564258-fbq82"] Mar 18 17:38:01 crc kubenswrapper[4939]: I0318 17:38:01.039881 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564258-fbq82" event={"ID":"b425342a-9f1f-4688-83c0-6a38872386e8","Type":"ContainerStarted","Data":"63b75ac9fa9461ccf54b3f5bd6ec9e49a9607cb6caab7d06923e961ca69b4300"} Mar 18 17:38:07 crc kubenswrapper[4939]: I0318 17:38:07.108336 4939 generic.go:334] "Generic (PLEG): container finished" podID="b425342a-9f1f-4688-83c0-6a38872386e8" containerID="569b5c2d2eadc1656949f726e482cd35a89d8d015e385700a18f8f0d1bd37083" exitCode=0 Mar 18 17:38:07 crc kubenswrapper[4939]: I0318 17:38:07.108395 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564258-fbq82" event={"ID":"b425342a-9f1f-4688-83c0-6a38872386e8","Type":"ContainerDied","Data":"569b5c2d2eadc1656949f726e482cd35a89d8d015e385700a18f8f0d1bd37083"} Mar 18 17:38:08 crc kubenswrapper[4939]: I0318 17:38:08.552288 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564258-fbq82" Mar 18 17:38:08 crc kubenswrapper[4939]: I0318 17:38:08.564743 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc78j\" (UniqueName: \"kubernetes.io/projected/b425342a-9f1f-4688-83c0-6a38872386e8-kube-api-access-hc78j\") pod \"b425342a-9f1f-4688-83c0-6a38872386e8\" (UID: \"b425342a-9f1f-4688-83c0-6a38872386e8\") " Mar 18 17:38:08 crc kubenswrapper[4939]: I0318 17:38:08.572013 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b425342a-9f1f-4688-83c0-6a38872386e8-kube-api-access-hc78j" (OuterVolumeSpecName: "kube-api-access-hc78j") pod "b425342a-9f1f-4688-83c0-6a38872386e8" (UID: "b425342a-9f1f-4688-83c0-6a38872386e8"). InnerVolumeSpecName "kube-api-access-hc78j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:38:08 crc kubenswrapper[4939]: I0318 17:38:08.668426 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc78j\" (UniqueName: \"kubernetes.io/projected/b425342a-9f1f-4688-83c0-6a38872386e8-kube-api-access-hc78j\") on node \"crc\" DevicePath \"\"" Mar 18 17:38:09 crc kubenswrapper[4939]: I0318 17:38:09.147480 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564258-fbq82" event={"ID":"b425342a-9f1f-4688-83c0-6a38872386e8","Type":"ContainerDied","Data":"63b75ac9fa9461ccf54b3f5bd6ec9e49a9607cb6caab7d06923e961ca69b4300"} Mar 18 17:38:09 crc kubenswrapper[4939]: I0318 17:38:09.147807 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63b75ac9fa9461ccf54b3f5bd6ec9e49a9607cb6caab7d06923e961ca69b4300" Mar 18 17:38:09 crc kubenswrapper[4939]: I0318 17:38:09.147565 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564258-fbq82" Mar 18 17:38:09 crc kubenswrapper[4939]: I0318 17:38:09.636560 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564252-kwr9s"] Mar 18 17:38:09 crc kubenswrapper[4939]: I0318 17:38:09.644871 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564252-kwr9s"] Mar 18 17:38:10 crc kubenswrapper[4939]: I0318 17:38:10.157025 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ee94aa-72e5-42cb-8829-d2655220e323" path="/var/lib/kubelet/pods/b9ee94aa-72e5-42cb-8829-d2655220e323/volumes" Mar 18 17:38:10 crc kubenswrapper[4939]: I0318 17:38:10.323436 4939 scope.go:117] "RemoveContainer" containerID="09103d1ec358baefe85ad2abd8f2e9ad669d71046163db755938c035c38e8185" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.426283 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ksqhh"] Mar 18 17:38:38 crc kubenswrapper[4939]: E0318 17:38:38.427409 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b425342a-9f1f-4688-83c0-6a38872386e8" containerName="oc" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.427426 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b425342a-9f1f-4688-83c0-6a38872386e8" containerName="oc" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.427705 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b425342a-9f1f-4688-83c0-6a38872386e8" containerName="oc" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.431816 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.438451 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksqhh"] Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.504672 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mctnc\" (UniqueName: \"kubernetes.io/projected/25ed52f6-392f-45ab-97cb-601d43174273-kube-api-access-mctnc\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.504804 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-utilities\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.504860 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-catalog-content\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.607108 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mctnc\" (UniqueName: \"kubernetes.io/projected/25ed52f6-392f-45ab-97cb-601d43174273-kube-api-access-mctnc\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.607765 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-utilities\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.608210 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-utilities\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.608280 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-catalog-content\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.608641 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-catalog-content\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.627132 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mctnc\" (UniqueName: \"kubernetes.io/projected/25ed52f6-392f-45ab-97cb-601d43174273-kube-api-access-mctnc\") pod \"certified-operators-ksqhh\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:38 crc kubenswrapper[4939]: I0318 17:38:38.754257 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:39 crc kubenswrapper[4939]: I0318 17:38:39.292305 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksqhh"] Mar 18 17:38:39 crc kubenswrapper[4939]: I0318 17:38:39.546602 4939 generic.go:334] "Generic (PLEG): container finished" podID="25ed52f6-392f-45ab-97cb-601d43174273" containerID="4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12" exitCode=0 Mar 18 17:38:39 crc kubenswrapper[4939]: I0318 17:38:39.547708 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksqhh" event={"ID":"25ed52f6-392f-45ab-97cb-601d43174273","Type":"ContainerDied","Data":"4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12"} Mar 18 17:38:39 crc kubenswrapper[4939]: I0318 17:38:39.547806 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksqhh" event={"ID":"25ed52f6-392f-45ab-97cb-601d43174273","Type":"ContainerStarted","Data":"6c867c764ef0ea0cd7e826c0e5554d021bcc946d939067ab30ef7deb64328720"} Mar 18 17:38:41 crc kubenswrapper[4939]: I0318 17:38:41.575909 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksqhh" event={"ID":"25ed52f6-392f-45ab-97cb-601d43174273","Type":"ContainerStarted","Data":"1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd"} Mar 18 17:38:42 crc kubenswrapper[4939]: I0318 17:38:42.588697 4939 generic.go:334] "Generic (PLEG): container finished" podID="25ed52f6-392f-45ab-97cb-601d43174273" containerID="1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd" exitCode=0 Mar 18 17:38:42 crc kubenswrapper[4939]: I0318 17:38:42.588766 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksqhh" event={"ID":"25ed52f6-392f-45ab-97cb-601d43174273","Type":"ContainerDied","Data":"1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd"} Mar 18 17:38:43 crc kubenswrapper[4939]: I0318 17:38:43.599818 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksqhh" event={"ID":"25ed52f6-392f-45ab-97cb-601d43174273","Type":"ContainerStarted","Data":"cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264"} Mar 18 17:38:43 crc kubenswrapper[4939]: I0318 17:38:43.623441 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ksqhh" podStartSLOduration=1.884216367 podStartE2EDuration="5.623419606s" podCreationTimestamp="2026-03-18 17:38:38 +0000 UTC" firstStartedPulling="2026-03-18 17:38:39.549483375 +0000 UTC m=+7284.148670996" lastFinishedPulling="2026-03-18 17:38:43.288686614 +0000 UTC m=+7287.887874235" observedRunningTime="2026-03-18 17:38:43.615557349 +0000 UTC m=+7288.214744990" watchObservedRunningTime="2026-03-18 17:38:43.623419606 +0000 UTC m=+7288.222607237" Mar 18 17:38:48 crc kubenswrapper[4939]: I0318 17:38:48.755003 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:48 crc kubenswrapper[4939]: I0318 17:38:48.755396 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:48 crc kubenswrapper[4939]: I0318 17:38:48.805425 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:49 crc kubenswrapper[4939]: I0318 17:38:49.705206 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:49 crc kubenswrapper[4939]: I0318 17:38:49.747732 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ksqhh"] Mar 18 17:38:51 crc kubenswrapper[4939]: I0318 17:38:51.672941 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ksqhh" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="registry-server" containerID="cri-o://cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264" gracePeriod=2 Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.237900 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.349454 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-catalog-content\") pod \"25ed52f6-392f-45ab-97cb-601d43174273\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.349581 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-utilities\") pod \"25ed52f6-392f-45ab-97cb-601d43174273\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.349659 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mctnc\" (UniqueName: \"kubernetes.io/projected/25ed52f6-392f-45ab-97cb-601d43174273-kube-api-access-mctnc\") pod \"25ed52f6-392f-45ab-97cb-601d43174273\" (UID: \"25ed52f6-392f-45ab-97cb-601d43174273\") " Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.350559 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-utilities" (OuterVolumeSpecName: "utilities") pod "25ed52f6-392f-45ab-97cb-601d43174273" (UID: "25ed52f6-392f-45ab-97cb-601d43174273"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.356765 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ed52f6-392f-45ab-97cb-601d43174273-kube-api-access-mctnc" (OuterVolumeSpecName: "kube-api-access-mctnc") pod "25ed52f6-392f-45ab-97cb-601d43174273" (UID: "25ed52f6-392f-45ab-97cb-601d43174273"). InnerVolumeSpecName "kube-api-access-mctnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.407745 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25ed52f6-392f-45ab-97cb-601d43174273" (UID: "25ed52f6-392f-45ab-97cb-601d43174273"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.452408 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.452463 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25ed52f6-392f-45ab-97cb-601d43174273-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.452480 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mctnc\" (UniqueName: \"kubernetes.io/projected/25ed52f6-392f-45ab-97cb-601d43174273-kube-api-access-mctnc\") on node \"crc\" DevicePath \"\"" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.688831 4939 generic.go:334] "Generic (PLEG): container finished" podID="25ed52f6-392f-45ab-97cb-601d43174273" containerID="cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264" exitCode=0 Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.688881 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksqhh" event={"ID":"25ed52f6-392f-45ab-97cb-601d43174273","Type":"ContainerDied","Data":"cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264"} Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.688912 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksqhh" event={"ID":"25ed52f6-392f-45ab-97cb-601d43174273","Type":"ContainerDied","Data":"6c867c764ef0ea0cd7e826c0e5554d021bcc946d939067ab30ef7deb64328720"} Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.688935 4939 scope.go:117] "RemoveContainer" containerID="cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.689019 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksqhh" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.722228 4939 scope.go:117] "RemoveContainer" containerID="1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.767458 4939 scope.go:117] "RemoveContainer" containerID="4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.790444 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ksqhh"] Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.799645 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ksqhh"] Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.844536 4939 scope.go:117] "RemoveContainer" containerID="cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264" Mar 18 17:38:52 crc kubenswrapper[4939]: E0318 17:38:52.845013 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264\": container with ID starting with cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264 not found: ID does not exist" containerID="cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.845172 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264"} err="failed to get container status \"cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264\": rpc error: code = NotFound desc = could not find container \"cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264\": container with ID starting with cbfde249b8fe0b185c2c8e83d58be999984fce63decb017ffdbc40311237b264 not found: ID does not exist" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.845303 4939 scope.go:117] "RemoveContainer" containerID="1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd" Mar 18 17:38:52 crc kubenswrapper[4939]: E0318 17:38:52.845772 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd\": container with ID starting with 1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd not found: ID does not exist" containerID="1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.845888 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd"} err="failed to get container status \"1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd\": rpc error: code = NotFound desc = could not find container \"1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd\": container with ID starting with 1b7214ba2c86631646aed559458415ee8189b3cc80f435836559345134ffc5dd not found: ID does not exist" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.845978 4939 scope.go:117] "RemoveContainer" containerID="4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12" Mar 18 17:38:52 crc kubenswrapper[4939]: E0318 17:38:52.846316 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12\": container with ID starting with 4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12 not found: ID does not exist" containerID="4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12" Mar 18 17:38:52 crc kubenswrapper[4939]: I0318 17:38:52.846441 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12"} err="failed to get container status \"4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12\": rpc error: code = NotFound desc = could not find container \"4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12\": container with ID starting with 4b42594e52ce7463d86eea19a959ca352aa6c033460feaa42421bd1864218b12 not found: ID does not exist" Mar 18 17:38:53 crc kubenswrapper[4939]: I0318 17:38:53.686988 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:38:53 crc kubenswrapper[4939]: I0318 17:38:53.687312 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:38:54 crc kubenswrapper[4939]: I0318 17:38:54.146927 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ed52f6-392f-45ab-97cb-601d43174273" path="/var/lib/kubelet/pods/25ed52f6-392f-45ab-97cb-601d43174273/volumes" Mar 18 17:39:23 crc kubenswrapper[4939]: I0318 17:39:23.688020 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:39:23 crc kubenswrapper[4939]: I0318 17:39:23.689787 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:39:34 crc kubenswrapper[4939]: I0318 17:39:34.108970 4939 generic.go:334] "Generic (PLEG): container finished" podID="44675ae9-2f87-4dff-bc11-602ed205461b" containerID="ca86860614465c30a950bd15de0cb0e0469f921944acaccc09043daf35c6d11d" exitCode=0 Mar 18 17:39:34 crc kubenswrapper[4939]: I0318 17:39:34.109058 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" event={"ID":"44675ae9-2f87-4dff-bc11-602ed205461b","Type":"ContainerDied","Data":"ca86860614465c30a950bd15de0cb0e0469f921944acaccc09043daf35c6d11d"} Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.633220 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.716072 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-bootstrap-combined-ca-bundle\") pod \"44675ae9-2f87-4dff-bc11-602ed205461b\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.716107 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ssh-key-openstack-cell1\") pod \"44675ae9-2f87-4dff-bc11-602ed205461b\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.716143 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mz8x\" (UniqueName: \"kubernetes.io/projected/44675ae9-2f87-4dff-bc11-602ed205461b-kube-api-access-6mz8x\") pod \"44675ae9-2f87-4dff-bc11-602ed205461b\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.716241 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-inventory\") pod \"44675ae9-2f87-4dff-bc11-602ed205461b\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.716273 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ceph\") pod \"44675ae9-2f87-4dff-bc11-602ed205461b\" (UID: \"44675ae9-2f87-4dff-bc11-602ed205461b\") " Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.721297 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ceph" (OuterVolumeSpecName: "ceph") pod "44675ae9-2f87-4dff-bc11-602ed205461b" (UID: "44675ae9-2f87-4dff-bc11-602ed205461b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.721695 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "44675ae9-2f87-4dff-bc11-602ed205461b" (UID: "44675ae9-2f87-4dff-bc11-602ed205461b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.721775 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44675ae9-2f87-4dff-bc11-602ed205461b-kube-api-access-6mz8x" (OuterVolumeSpecName: "kube-api-access-6mz8x") pod "44675ae9-2f87-4dff-bc11-602ed205461b" (UID: "44675ae9-2f87-4dff-bc11-602ed205461b"). InnerVolumeSpecName "kube-api-access-6mz8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.745896 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-inventory" (OuterVolumeSpecName: "inventory") pod "44675ae9-2f87-4dff-bc11-602ed205461b" (UID: "44675ae9-2f87-4dff-bc11-602ed205461b"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.748539 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "44675ae9-2f87-4dff-bc11-602ed205461b" (UID: "44675ae9-2f87-4dff-bc11-602ed205461b"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.820111 4939 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.820348 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.820418 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mz8x\" (UniqueName: \"kubernetes.io/projected/44675ae9-2f87-4dff-bc11-602ed205461b-kube-api-access-6mz8x\") on node \"crc\" DevicePath \"\"" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.820474 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:39:35 crc kubenswrapper[4939]: I0318 17:39:35.820563 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44675ae9-2f87-4dff-bc11-602ed205461b-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.139882 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.150577 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-tk5r5" event={"ID":"44675ae9-2f87-4dff-bc11-602ed205461b","Type":"ContainerDied","Data":"ca357efd1f84fd3f5acf46c659a41c80a57968987509f3b10a2ae30af9014ca5"} Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.150637 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca357efd1f84fd3f5acf46c659a41c80a57968987509f3b10a2ae30af9014ca5" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.221725 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fs8vm"] Mar 18 17:39:36 crc kubenswrapper[4939]: E0318 17:39:36.222417 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="registry-server" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.222560 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="registry-server" Mar 18 17:39:36 crc kubenswrapper[4939]: E0318 17:39:36.222642 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="extract-content" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.222711 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="extract-content" Mar 18 17:39:36 crc kubenswrapper[4939]: E0318 17:39:36.222841 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="extract-utilities" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.222911 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="extract-utilities" Mar 18 17:39:36 crc kubenswrapper[4939]: E0318 17:39:36.223000 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44675ae9-2f87-4dff-bc11-602ed205461b" containerName="bootstrap-openstack-openstack-cell1" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.223078 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="44675ae9-2f87-4dff-bc11-602ed205461b" containerName="bootstrap-openstack-openstack-cell1" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.223385 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ed52f6-392f-45ab-97cb-601d43174273" containerName="registry-server" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.223519 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="44675ae9-2f87-4dff-bc11-602ed205461b" containerName="bootstrap-openstack-openstack-cell1" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.224595 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.230345 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.230666 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.230807 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.231131 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.234294 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fs8vm"] Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.329479 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ceph\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.329920 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcft8\" (UniqueName: \"kubernetes.io/projected/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-kube-api-access-dcft8\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.330143 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.330258 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-inventory\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.432475 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.432848 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-inventory\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " 
pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.432935 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ceph\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.433060 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcft8\" (UniqueName: \"kubernetes.io/projected/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-kube-api-access-dcft8\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.436372 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.437081 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ceph\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.437307 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-inventory\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.447670 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcft8\" (UniqueName: \"kubernetes.io/projected/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-kube-api-access-dcft8\") pod \"download-cache-openstack-openstack-cell1-fs8vm\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:36 crc kubenswrapper[4939]: I0318 17:39:36.538986 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:39:37 crc kubenswrapper[4939]: I0318 17:39:37.078812 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fs8vm"] Mar 18 17:39:37 crc kubenswrapper[4939]: I0318 17:39:37.081110 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:39:37 crc kubenswrapper[4939]: I0318 17:39:37.144070 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" event={"ID":"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a","Type":"ContainerStarted","Data":"12d23f23bd9c9f533a2e53b766955543734e1bde86fa7711285d41b221de7754"} Mar 18 17:39:38 crc kubenswrapper[4939]: I0318 17:39:38.153940 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" event={"ID":"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a","Type":"ContainerStarted","Data":"0d270b9705693f5ac63a7b00714f68301847001150e54415609779002f628455"} Mar 18 17:39:38 crc kubenswrapper[4939]: I0318 17:39:38.181198 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" podStartSLOduration=1.9832725180000002 podStartE2EDuration="2.181171443s" podCreationTimestamp="2026-03-18 17:39:36 +0000 UTC" firstStartedPulling="2026-03-18 17:39:37.080847242 +0000 UTC m=+7341.680034863" lastFinishedPulling="2026-03-18 17:39:37.278746147 +0000 UTC m=+7341.877933788" observedRunningTime="2026-03-18 17:39:38.17339643 +0000 UTC m=+7342.772584061" watchObservedRunningTime="2026-03-18 17:39:38.181171443 +0000 UTC m=+7342.780359104" Mar 18 17:39:53 crc kubenswrapper[4939]: I0318 17:39:53.687128 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:39:53 crc kubenswrapper[4939]: I0318 17:39:53.687590 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:39:53 crc kubenswrapper[4939]: I0318 17:39:53.687632 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:39:53 crc kubenswrapper[4939]: I0318 17:39:53.688382 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:39:53 crc kubenswrapper[4939]: I0318 17:39:53.688425 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" gracePeriod=600 Mar 18 17:39:53 crc 
kubenswrapper[4939]: E0318 17:39:53.820060 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:39:54 crc kubenswrapper[4939]: I0318 17:39:54.325826 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" exitCode=0 Mar 18 17:39:54 crc kubenswrapper[4939]: I0318 17:39:54.326110 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73"} Mar 18 17:39:54 crc kubenswrapper[4939]: I0318 17:39:54.326164 4939 scope.go:117] "RemoveContainer" containerID="4f04c4546d8bda34fa5fcebc419a15e6b503f4f824dddf0115d6cc84bb5f6755" Mar 18 17:39:54 crc kubenswrapper[4939]: I0318 17:39:54.326685 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:39:54 crc kubenswrapper[4939]: E0318 17:39:54.334648 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.130000 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564260-6jszz"] Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.133495 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564260-6jszz" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.135929 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.135945 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.136071 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.147042 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564260-6jszz"] Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.248638 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9brx\" (UniqueName: \"kubernetes.io/projected/e43ce6b1-fa53-4c37-b049-2057fcc2542f-kube-api-access-g9brx\") pod \"auto-csr-approver-29564260-6jszz\" (UID: \"e43ce6b1-fa53-4c37-b049-2057fcc2542f\") " pod="openshift-infra/auto-csr-approver-29564260-6jszz" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.350328 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9brx\" (UniqueName: \"kubernetes.io/projected/e43ce6b1-fa53-4c37-b049-2057fcc2542f-kube-api-access-g9brx\") pod \"auto-csr-approver-29564260-6jszz\" (UID: \"e43ce6b1-fa53-4c37-b049-2057fcc2542f\") " pod="openshift-infra/auto-csr-approver-29564260-6jszz" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.370144 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9brx\" (UniqueName: \"kubernetes.io/projected/e43ce6b1-fa53-4c37-b049-2057fcc2542f-kube-api-access-g9brx\") pod \"auto-csr-approver-29564260-6jszz\" (UID: \"e43ce6b1-fa53-4c37-b049-2057fcc2542f\") " pod="openshift-infra/auto-csr-approver-29564260-6jszz" Mar 18 17:40:00 crc kubenswrapper[4939]: I0318 17:40:00.453691 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564260-6jszz" Mar 18 17:40:01 crc kubenswrapper[4939]: I0318 17:40:01.210994 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564260-6jszz"] Mar 18 17:40:01 crc kubenswrapper[4939]: I0318 17:40:01.405533 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564260-6jszz" event={"ID":"e43ce6b1-fa53-4c37-b049-2057fcc2542f","Type":"ContainerStarted","Data":"a3a4c91acd09f62e26645d16314486ce8f9de7984639a7158ee46c97dfed5150"} Mar 18 17:40:03 crc kubenswrapper[4939]: I0318 17:40:03.423642 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564260-6jszz" event={"ID":"e43ce6b1-fa53-4c37-b049-2057fcc2542f","Type":"ContainerStarted","Data":"0a462c6e4ca3a134fa034fe2473fa84036804ec4678b3f7ec873fcfe1407905a"} Mar 18 17:40:03 crc kubenswrapper[4939]: I0318 17:40:03.461780 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564260-6jszz" podStartSLOduration=1.9417155849999999 podStartE2EDuration="3.461758527s" podCreationTimestamp="2026-03-18 17:40:00 +0000 UTC" firstStartedPulling="2026-03-18 17:40:01.216196903 +0000 UTC m=+7365.815384524" lastFinishedPulling="2026-03-18 17:40:02.736239845 +0000 UTC m=+7367.335427466" observedRunningTime="2026-03-18 17:40:03.450357291 +0000 UTC m=+7368.049544912" watchObservedRunningTime="2026-03-18 17:40:03.461758527 +0000 UTC m=+7368.060946228" Mar 18 17:40:04 crc kubenswrapper[4939]: I0318 17:40:04.435585 4939 generic.go:334] "Generic (PLEG): container finished" podID="e43ce6b1-fa53-4c37-b049-2057fcc2542f" containerID="0a462c6e4ca3a134fa034fe2473fa84036804ec4678b3f7ec873fcfe1407905a" exitCode=0 Mar 18 17:40:04 crc kubenswrapper[4939]: I0318 17:40:04.435726 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564260-6jszz" event={"ID":"e43ce6b1-fa53-4c37-b049-2057fcc2542f","Type":"ContainerDied","Data":"0a462c6e4ca3a134fa034fe2473fa84036804ec4678b3f7ec873fcfe1407905a"} Mar 18 17:40:05 crc kubenswrapper[4939]: I0318 17:40:05.814494 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564260-6jszz" Mar 18 17:40:05 crc kubenswrapper[4939]: I0318 17:40:05.971032 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9brx\" (UniqueName: \"kubernetes.io/projected/e43ce6b1-fa53-4c37-b049-2057fcc2542f-kube-api-access-g9brx\") pod \"e43ce6b1-fa53-4c37-b049-2057fcc2542f\" (UID: \"e43ce6b1-fa53-4c37-b049-2057fcc2542f\") " Mar 18 17:40:05 crc kubenswrapper[4939]: I0318 17:40:05.976778 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43ce6b1-fa53-4c37-b049-2057fcc2542f-kube-api-access-g9brx" (OuterVolumeSpecName: "kube-api-access-g9brx") pod "e43ce6b1-fa53-4c37-b049-2057fcc2542f" (UID: "e43ce6b1-fa53-4c37-b049-2057fcc2542f"). InnerVolumeSpecName "kube-api-access-g9brx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:40:06 crc kubenswrapper[4939]: I0318 17:40:06.073313 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9brx\" (UniqueName: \"kubernetes.io/projected/e43ce6b1-fa53-4c37-b049-2057fcc2542f-kube-api-access-g9brx\") on node \"crc\" DevicePath \"\"" Mar 18 17:40:06 crc kubenswrapper[4939]: I0318 17:40:06.459867 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564260-6jszz" event={"ID":"e43ce6b1-fa53-4c37-b049-2057fcc2542f","Type":"ContainerDied","Data":"a3a4c91acd09f62e26645d16314486ce8f9de7984639a7158ee46c97dfed5150"} Mar 18 17:40:06 crc kubenswrapper[4939]: I0318 17:40:06.459913 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a4c91acd09f62e26645d16314486ce8f9de7984639a7158ee46c97dfed5150" Mar 18 17:40:06 crc kubenswrapper[4939]: I0318 17:40:06.459928 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564260-6jszz" Mar 18 17:40:06 crc kubenswrapper[4939]: I0318 17:40:06.519131 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564254-khfb2"] Mar 18 17:40:06 crc kubenswrapper[4939]: I0318 17:40:06.537099 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564254-khfb2"] Mar 18 17:40:08 crc kubenswrapper[4939]: I0318 17:40:08.134093 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:40:08 crc kubenswrapper[4939]: E0318 17:40:08.134681 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:40:08 crc kubenswrapper[4939]: I0318 17:40:08.145215 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c2ddec-2c0d-40cf-affe-104c66360d2f" path="/var/lib/kubelet/pods/f3c2ddec-2c0d-40cf-affe-104c66360d2f/volumes" Mar 18 17:40:10 crc kubenswrapper[4939]: I0318 17:40:10.438112 4939 scope.go:117] "RemoveContainer" containerID="0ef49c9224a2ade33709114c3e6dd825211fa2c970a252bd76088c51e5f12833" Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.802631 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hqjrw"] Mar 18 17:40:21 crc kubenswrapper[4939]: E0318 17:40:21.803676 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43ce6b1-fa53-4c37-b049-2057fcc2542f" containerName="oc" Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.803692 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43ce6b1-fa53-4c37-b049-2057fcc2542f" containerName="oc" Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.803904 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43ce6b1-fa53-4c37-b049-2057fcc2542f" containerName="oc" Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.805687 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.821526 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqjrw"] Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.940755 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qknb9\" (UniqueName: \"kubernetes.io/projected/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-kube-api-access-qknb9\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.941146 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-utilities\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:21 crc kubenswrapper[4939]: I0318 17:40:21.941242 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-catalog-content\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.044551 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-utilities\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.044663 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-catalog-content\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.044800 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qknb9\" (UniqueName: \"kubernetes.io/projected/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-kube-api-access-qknb9\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.045337 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-utilities\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.045551 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-catalog-content\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.066327 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qknb9\" (UniqueName: \"kubernetes.io/projected/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-kube-api-access-qknb9\") pod \"community-operators-hqjrw\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.128268 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:22 crc kubenswrapper[4939]: I0318 17:40:22.684140 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqjrw"] Mar 18 17:40:23 crc kubenswrapper[4939]: I0318 17:40:23.133973 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:40:23 crc kubenswrapper[4939]: E0318 17:40:23.134451 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:40:23 crc kubenswrapper[4939]: I0318 17:40:23.651370 4939 generic.go:334] "Generic (PLEG): container finished" podID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerID="4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512" exitCode=0 Mar 18 17:40:23 crc kubenswrapper[4939]: I0318 17:40:23.651417 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqjrw" event={"ID":"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7","Type":"ContainerDied","Data":"4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512"} Mar 18 17:40:23 crc kubenswrapper[4939]: I0318 17:40:23.651448 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqjrw" event={"ID":"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7","Type":"ContainerStarted","Data":"37854bc61d8eda60898ecebd7dc5163963687368d63534079ea6d41175b3bc4b"} Mar 18 17:40:24 crc kubenswrapper[4939]: I0318 17:40:24.663998 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqjrw" event={"ID":"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7","Type":"ContainerStarted","Data":"1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79"} Mar 18 17:40:26 crc kubenswrapper[4939]: I0318 17:40:26.697024 4939 generic.go:334] "Generic (PLEG): container finished" podID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerID="1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79" exitCode=0 Mar 18 17:40:26 crc kubenswrapper[4939]: I0318 17:40:26.697115 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqjrw" event={"ID":"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7","Type":"ContainerDied","Data":"1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79"} Mar 18 17:40:27 crc kubenswrapper[4939]: I0318 17:40:27.707342 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqjrw" event={"ID":"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7","Type":"ContainerStarted","Data":"0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d"} Mar 18 17:40:32 crc kubenswrapper[4939]: I0318 17:40:32.128916 4939 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:32 crc kubenswrapper[4939]: I0318 17:40:32.129363 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:32 crc kubenswrapper[4939]: I0318 17:40:32.178806 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:32 crc kubenswrapper[4939]: I0318 17:40:32.211214 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hqjrw" podStartSLOduration=7.698093983 podStartE2EDuration="11.211188461s" podCreationTimestamp="2026-03-18 17:40:21 +0000 UTC" firstStartedPulling="2026-03-18 17:40:23.664277599 +0000 UTC m=+7388.263465210" lastFinishedPulling="2026-03-18 17:40:27.177372027 +0000 UTC m=+7391.776559688" observedRunningTime="2026-03-18 17:40:27.732292549 +0000 UTC m=+7392.331480170" watchObservedRunningTime="2026-03-18 17:40:32.211188461 +0000 UTC m=+7396.810376102" Mar 18 17:40:32 crc kubenswrapper[4939]: I0318 17:40:32.797480 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:32 crc kubenswrapper[4939]: I0318 17:40:32.849094 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hqjrw"] Mar 18 17:40:34 crc kubenswrapper[4939]: I0318 17:40:34.774269 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hqjrw" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="registry-server" containerID="cri-o://0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d" gracePeriod=2 Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.385198 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.445340 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-catalog-content\") pod \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.445489 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-utilities\") pod \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.445538 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qknb9\" (UniqueName: \"kubernetes.io/projected/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-kube-api-access-qknb9\") pod \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\" (UID: \"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7\") " Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.446286 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-utilities" (OuterVolumeSpecName: "utilities") pod "86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" (UID: "86df0d8e-0026-4f2c-a9bf-9d0ae34170e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.450668 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-kube-api-access-qknb9" (OuterVolumeSpecName: "kube-api-access-qknb9") pod "86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" (UID: "86df0d8e-0026-4f2c-a9bf-9d0ae34170e7"). InnerVolumeSpecName "kube-api-access-qknb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.500385 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" (UID: "86df0d8e-0026-4f2c-a9bf-9d0ae34170e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.548788 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.548836 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.548850 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qknb9\" (UniqueName: \"kubernetes.io/projected/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7-kube-api-access-qknb9\") on node \"crc\" DevicePath \"\"" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.786657 4939 generic.go:334] "Generic (PLEG): container finished" podID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerID="0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d" exitCode=0 Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.786728 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hqjrw" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.786718 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqjrw" event={"ID":"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7","Type":"ContainerDied","Data":"0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d"} Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.787097 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqjrw" event={"ID":"86df0d8e-0026-4f2c-a9bf-9d0ae34170e7","Type":"ContainerDied","Data":"37854bc61d8eda60898ecebd7dc5163963687368d63534079ea6d41175b3bc4b"} Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.787176 4939 scope.go:117] "RemoveContainer" containerID="0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.827425 4939 scope.go:117] "RemoveContainer" containerID="1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.830481 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hqjrw"] Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.840647 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hqjrw"] Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.871004 4939 scope.go:117] "RemoveContainer" containerID="4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.921575 4939 scope.go:117] "RemoveContainer" containerID="0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d" Mar 18 17:40:35 crc kubenswrapper[4939]: E0318 17:40:35.924020 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d\": container with ID starting with 0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d not found: ID does not exist" containerID="0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.924079 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d"} err="failed to get container status \"0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d\": rpc error: code = NotFound desc = could not find container \"0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d\": container with ID starting with 0b435fdd0ea0c03533e3a1e47d46e86226aac4555e50e41c96557eec8b5bcf1d not found: ID does not exist" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.924118 4939 scope.go:117] "RemoveContainer" containerID="1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79" Mar 18 17:40:35 crc kubenswrapper[4939]: E0318 17:40:35.924612 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79\": container with ID starting with 1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79 not found: ID does not exist" containerID="1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.924651 4939 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79"} err="failed to get container status \"1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79\": rpc error: code = NotFound desc = could not find container \"1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79\": container with ID starting with 1bb9e796c64c243f3c8c296f87a7a33a8471e752a26c1209d35a336ddb78cb79 not found: ID does not exist" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.924747 4939 scope.go:117] "RemoveContainer" containerID="4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512" Mar 18 17:40:35 crc kubenswrapper[4939]: E0318 17:40:35.925335 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512\": container with ID starting with 4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512 not found: ID does not exist" containerID="4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512" Mar 18 17:40:35 crc kubenswrapper[4939]: I0318 17:40:35.925388 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512"} err="failed to get container status \"4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512\": rpc error: code = NotFound desc = could not find container \"4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512\": container with ID starting with 4061aa53b5e11a41448e3e91e49e4e2c51d7d484df86d1aed182f374366fd512 not found: ID does not exist" Mar 18 17:40:36 crc kubenswrapper[4939]: I0318 17:40:36.141684 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:40:36 crc kubenswrapper[4939]: E0318 17:40:36.142199 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:40:36 crc kubenswrapper[4939]: I0318 17:40:36.148540 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" path="/var/lib/kubelet/pods/86df0d8e-0026-4f2c-a9bf-9d0ae34170e7/volumes" Mar 18 17:40:49 crc kubenswrapper[4939]: I0318 17:40:49.133884 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:40:49 crc kubenswrapper[4939]: E0318 17:40:49.134738 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:41:02 crc kubenswrapper[4939]: I0318 17:41:02.133629 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:41:02 crc 
kubenswrapper[4939]: E0318 17:41:02.134345 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:41:15 crc kubenswrapper[4939]: I0318 17:41:15.133460 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:41:15 crc kubenswrapper[4939]: E0318 17:41:15.134614 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:41:27 crc kubenswrapper[4939]: I0318 17:41:27.135103 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:41:27 crc kubenswrapper[4939]: E0318 17:41:27.136194 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:41:35 crc kubenswrapper[4939]: I0318 17:41:35.389729 4939 generic.go:334] "Generic (PLEG): container finished" podID="ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" containerID="0d270b9705693f5ac63a7b00714f68301847001150e54415609779002f628455" exitCode=0 Mar 18 17:41:35 crc kubenswrapper[4939]: I0318 17:41:35.389831 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" event={"ID":"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a","Type":"ContainerDied","Data":"0d270b9705693f5ac63a7b00714f68301847001150e54415609779002f628455"} Mar 18 17:41:36 crc kubenswrapper[4939]: I0318 17:41:36.860919 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:41:36 crc kubenswrapper[4939]: I0318 17:41:36.987712 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-inventory\") pod \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " Mar 18 17:41:36 crc kubenswrapper[4939]: I0318 17:41:36.988245 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ceph\") pod \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " Mar 18 17:41:36 crc kubenswrapper[4939]: I0318 17:41:36.988357 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcft8\" (UniqueName: \"kubernetes.io/projected/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-kube-api-access-dcft8\") pod \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " Mar 18 17:41:36 crc kubenswrapper[4939]: I0318 17:41:36.988564 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ssh-key-openstack-cell1\") pod \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\" (UID: \"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a\") " Mar 18 17:41:36 crc kubenswrapper[4939]: I0318 17:41:36.993678 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ceph" (OuterVolumeSpecName: "ceph") pod "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" (UID: "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:41:36 crc kubenswrapper[4939]: I0318 17:41:36.993761 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-kube-api-access-dcft8" (OuterVolumeSpecName: "kube-api-access-dcft8") pod "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" (UID: "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a"). InnerVolumeSpecName "kube-api-access-dcft8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.016018 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-inventory" (OuterVolumeSpecName: "inventory") pod "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" (UID: "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.016485 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" (UID: "ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.091380 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.091424 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.091436 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcft8\" (UniqueName: \"kubernetes.io/projected/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-kube-api-access-dcft8\") on node \"crc\" DevicePath \"\"" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.091447 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.426121 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" event={"ID":"ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a","Type":"ContainerDied","Data":"12d23f23bd9c9f533a2e53b766955543734e1bde86fa7711285d41b221de7754"} Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.426159 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12d23f23bd9c9f533a2e53b766955543734e1bde86fa7711285d41b221de7754" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.426215 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fs8vm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.515191 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-f9cwm"] Mar 18 17:41:37 crc kubenswrapper[4939]: E0318 17:41:37.515741 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" containerName="download-cache-openstack-openstack-cell1" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.515766 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" containerName="download-cache-openstack-openstack-cell1" Mar 18 17:41:37 crc kubenswrapper[4939]: E0318 17:41:37.515804 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="registry-server" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.515814 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="registry-server" Mar 18 17:41:37 crc kubenswrapper[4939]: E0318 17:41:37.515832 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="extract-content" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.515840 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="extract-content" Mar 18 17:41:37 crc kubenswrapper[4939]: E0318 17:41:37.515865 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="extract-utilities" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.515872 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="extract-utilities" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.516101 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="86df0d8e-0026-4f2c-a9bf-9d0ae34170e7" containerName="registry-server" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.516131 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a" containerName="download-cache-openstack-openstack-cell1" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.517058 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.520241 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.520357 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.520801 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.521299 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.528446 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-f9cwm"] Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.601955 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-inventory\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.602114 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ceph\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.602321 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6szpz\" (UniqueName: \"kubernetes.io/projected/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-kube-api-access-6szpz\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.602378 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.703638 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ceph\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.703746 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6szpz\" (UniqueName: \"kubernetes.io/projected/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-kube-api-access-6szpz\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " 
pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.703771 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.703803 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-inventory\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.708444 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-inventory\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.708771 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ceph\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.712087 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.719428 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6szpz\" (UniqueName: \"kubernetes.io/projected/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-kube-api-access-6szpz\") pod \"configure-network-openstack-openstack-cell1-f9cwm\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") " pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:37 crc kubenswrapper[4939]: I0318 17:41:37.835468 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:41:38 crc kubenswrapper[4939]: I0318 17:41:38.515566 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-f9cwm"] Mar 18 17:41:39 crc kubenswrapper[4939]: I0318 17:41:39.133877 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:41:39 crc kubenswrapper[4939]: E0318 17:41:39.134853 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:41:39 crc kubenswrapper[4939]: I0318 17:41:39.445256 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" event={"ID":"d8999dd3-4065-4ecc-b797-e62aa63e1bcb","Type":"ContainerStarted","Data":"070e37668983703bbd9d63093c2719898dad8ff8c2dc0e7302fe2aa9cc723def"} Mar 18 17:41:39 crc kubenswrapper[4939]: I0318 17:41:39.445310 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" event={"ID":"d8999dd3-4065-4ecc-b797-e62aa63e1bcb","Type":"ContainerStarted","Data":"0d5f8d19f2fc76b8407cef8c1b792364bdd25edade55092066594262f825dc21"} Mar 18 17:41:39 crc kubenswrapper[4939]: I0318 17:41:39.483332 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" podStartSLOduration=2.290653506 podStartE2EDuration="2.483308884s" podCreationTimestamp="2026-03-18 17:41:37 +0000 UTC" firstStartedPulling="2026-03-18 17:41:38.529089323 +0000 UTC m=+7463.128276944" lastFinishedPulling="2026-03-18 17:41:38.721744701 +0000 UTC m=+7463.320932322" observedRunningTime="2026-03-18 17:41:39.475922063 +0000 UTC m=+7464.075109694" watchObservedRunningTime="2026-03-18 17:41:39.483308884 +0000 UTC m=+7464.082496505" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.787742 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-krc58"] Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.790887 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.824818 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-krc58"] Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.849849 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-utilities\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.850137 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m975z\" (UniqueName: \"kubernetes.io/projected/4d966f61-7e4c-486e-923e-0afceff94e7a-kube-api-access-m975z\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.850253 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-catalog-content\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.952011 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-utilities\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.952252 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m975z\" (UniqueName: \"kubernetes.io/projected/4d966f61-7e4c-486e-923e-0afceff94e7a-kube-api-access-m975z\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.952403 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-catalog-content\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.952645 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-utilities\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.952921 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-catalog-content\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:43 crc kubenswrapper[4939]: I0318 17:41:43.983478 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m975z\" (UniqueName: \"kubernetes.io/projected/4d966f61-7e4c-486e-923e-0afceff94e7a-kube-api-access-m975z\") pod \"redhat-operators-krc58\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:44 crc kubenswrapper[4939]: I0318 17:41:44.120803 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:41:44 crc kubenswrapper[4939]: I0318 17:41:44.632488 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-krc58"] Mar 18 17:41:45 crc kubenswrapper[4939]: I0318 17:41:45.503352 4939 generic.go:334] "Generic (PLEG): container finished" podID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerID="d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df" exitCode=0 Mar 18 17:41:45 crc kubenswrapper[4939]: I0318 17:41:45.503460 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krc58" event={"ID":"4d966f61-7e4c-486e-923e-0afceff94e7a","Type":"ContainerDied","Data":"d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df"} Mar 18 17:41:45 crc kubenswrapper[4939]: I0318 17:41:45.503680 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krc58" event={"ID":"4d966f61-7e4c-486e-923e-0afceff94e7a","Type":"ContainerStarted","Data":"1721f863992c2d871bd16fa86374d542c41a2109ce33908f3eacae5be62aef91"} Mar 18 17:41:54 crc kubenswrapper[4939]: I0318 17:41:54.133707 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:41:54 crc kubenswrapper[4939]: E0318 17:41:54.134779 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:41:57 crc kubenswrapper[4939]: I0318 17:41:57.633892 4939 generic.go:334] "Generic (PLEG): container finished" podID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerID="030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379" exitCode=0 Mar 18 17:41:57 crc kubenswrapper[4939]: I0318 17:41:57.634007 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krc58" event={"ID":"4d966f61-7e4c-486e-923e-0afceff94e7a","Type":"ContainerDied","Data":"030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379"} Mar 18 17:41:58 crc kubenswrapper[4939]: I0318 17:41:58.649475 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krc58" event={"ID":"4d966f61-7e4c-486e-923e-0afceff94e7a","Type":"ContainerStarted","Data":"10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9"} Mar 18 17:41:58 crc kubenswrapper[4939]: I0318 17:41:58.682410 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-krc58" podStartSLOduration=3.114131855 podStartE2EDuration="15.682382796s" podCreationTimestamp="2026-03-18 17:41:43 +0000 UTC" firstStartedPulling="2026-03-18 17:41:45.505703151 +0000 UTC m=+7470.104890772" lastFinishedPulling="2026-03-18 17:41:58.073954092 +0000 UTC m=+7482.673141713" 
observedRunningTime="2026-03-18 17:41:58.682008213 +0000 UTC m=+7483.281195854" watchObservedRunningTime="2026-03-18 17:41:58.682382796 +0000 UTC m=+7483.281570427" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.130996 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564262-4z69m"] Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.133142 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564262-4z69m" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.142939 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.143207 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.144008 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.146341 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564262-4z69m"] Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.317694 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmthd\" (UniqueName: \"kubernetes.io/projected/560019e0-f873-4da0-ac07-fdb41015f2eb-kube-api-access-hmthd\") pod \"auto-csr-approver-29564262-4z69m\" (UID: \"560019e0-f873-4da0-ac07-fdb41015f2eb\") " pod="openshift-infra/auto-csr-approver-29564262-4z69m" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.419350 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmthd\" (UniqueName: \"kubernetes.io/projected/560019e0-f873-4da0-ac07-fdb41015f2eb-kube-api-access-hmthd\") pod \"auto-csr-approver-29564262-4z69m\" (UID: \"560019e0-f873-4da0-ac07-fdb41015f2eb\") " pod="openshift-infra/auto-csr-approver-29564262-4z69m" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.441188 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmthd\" (UniqueName: \"kubernetes.io/projected/560019e0-f873-4da0-ac07-fdb41015f2eb-kube-api-access-hmthd\") pod \"auto-csr-approver-29564262-4z69m\" (UID: \"560019e0-f873-4da0-ac07-fdb41015f2eb\") " pod="openshift-infra/auto-csr-approver-29564262-4z69m" Mar 18 17:42:00 crc kubenswrapper[4939]: I0318 17:42:00.513767 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564262-4z69m" Mar 18 17:42:01 crc kubenswrapper[4939]: I0318 17:42:01.270363 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564262-4z69m"] Mar 18 17:42:01 crc kubenswrapper[4939]: W0318 17:42:01.271324 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod560019e0_f873_4da0_ac07_fdb41015f2eb.slice/crio-331e1cb2776de2528707cb515b607c1bdf57e86721fd179dd52044887cfd7ba3 WatchSource:0}: Error finding container 331e1cb2776de2528707cb515b607c1bdf57e86721fd179dd52044887cfd7ba3: Status 404 returned error can't find the container with id 331e1cb2776de2528707cb515b607c1bdf57e86721fd179dd52044887cfd7ba3 Mar 18 17:42:01 crc kubenswrapper[4939]: I0318 17:42:01.683358 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564262-4z69m" event={"ID":"560019e0-f873-4da0-ac07-fdb41015f2eb","Type":"ContainerStarted","Data":"331e1cb2776de2528707cb515b607c1bdf57e86721fd179dd52044887cfd7ba3"} Mar 18 17:42:03 crc kubenswrapper[4939]: I0318 17:42:03.708961 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564262-4z69m" event={"ID":"560019e0-f873-4da0-ac07-fdb41015f2eb","Type":"ContainerStarted","Data":"3ba8bd266ee21e740a289b27fc54d71b684cf2af3010455be2b89c6e203aceed"} Mar 18 17:42:03 crc kubenswrapper[4939]: I0318 17:42:03.733585 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564262-4z69m" podStartSLOduration=2.412317943 podStartE2EDuration="3.733552628s" podCreationTimestamp="2026-03-18 17:42:00 +0000 UTC" firstStartedPulling="2026-03-18 17:42:01.273685454 +0000 UTC m=+7485.872873075" lastFinishedPulling="2026-03-18 17:42:02.594920139 +0000 UTC m=+7487.194107760" observedRunningTime="2026-03-18 17:42:03.728457316 +0000 UTC m=+7488.327644977" watchObservedRunningTime="2026-03-18 17:42:03.733552628 +0000 UTC m=+7488.332740289" Mar 18 17:42:04 crc kubenswrapper[4939]: I0318 17:42:04.121962 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:42:04 crc kubenswrapper[4939]: I0318 17:42:04.122078 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:42:04 crc kubenswrapper[4939]: I0318 17:42:04.720518 4939 generic.go:334] "Generic (PLEG): container finished" podID="560019e0-f873-4da0-ac07-fdb41015f2eb" containerID="3ba8bd266ee21e740a289b27fc54d71b684cf2af3010455be2b89c6e203aceed" exitCode=0 Mar 18 17:42:04 crc kubenswrapper[4939]: I0318 17:42:04.720632 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564262-4z69m" event={"ID":"560019e0-f873-4da0-ac07-fdb41015f2eb","Type":"ContainerDied","Data":"3ba8bd266ee21e740a289b27fc54d71b684cf2af3010455be2b89c6e203aceed"} Mar 18 17:42:05 crc kubenswrapper[4939]: I0318 17:42:05.189745 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-krc58" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="registry-server" probeResult="failure" output=< Mar 18 17:42:05 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:42:05 crc kubenswrapper[4939]: > Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.131273 4939 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564262-4z69m" Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.251967 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmthd\" (UniqueName: \"kubernetes.io/projected/560019e0-f873-4da0-ac07-fdb41015f2eb-kube-api-access-hmthd\") pod \"560019e0-f873-4da0-ac07-fdb41015f2eb\" (UID: \"560019e0-f873-4da0-ac07-fdb41015f2eb\") " Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.270732 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560019e0-f873-4da0-ac07-fdb41015f2eb-kube-api-access-hmthd" (OuterVolumeSpecName: "kube-api-access-hmthd") pod "560019e0-f873-4da0-ac07-fdb41015f2eb" (UID: "560019e0-f873-4da0-ac07-fdb41015f2eb"). InnerVolumeSpecName "kube-api-access-hmthd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.355368 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmthd\" (UniqueName: \"kubernetes.io/projected/560019e0-f873-4da0-ac07-fdb41015f2eb-kube-api-access-hmthd\") on node \"crc\" DevicePath \"\"" Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.757820 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564262-4z69m" event={"ID":"560019e0-f873-4da0-ac07-fdb41015f2eb","Type":"ContainerDied","Data":"331e1cb2776de2528707cb515b607c1bdf57e86721fd179dd52044887cfd7ba3"} Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.758393 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="331e1cb2776de2528707cb515b607c1bdf57e86721fd179dd52044887cfd7ba3" Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.758462 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564262-4z69m" Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.810029 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564256-zc59b"] Mar 18 17:42:06 crc kubenswrapper[4939]: I0318 17:42:06.819846 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564256-zc59b"] Mar 18 17:42:08 crc kubenswrapper[4939]: I0318 17:42:08.148420 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4" path="/var/lib/kubelet/pods/0ebfcd96-ac97-4a7b-9818-f5d5e30cf1e4/volumes" Mar 18 17:42:09 crc kubenswrapper[4939]: I0318 17:42:09.133374 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:42:09 crc kubenswrapper[4939]: E0318 17:42:09.134078 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:42:10 crc kubenswrapper[4939]: I0318 17:42:10.548310 4939 scope.go:117] "RemoveContainer" containerID="ef791a4a3d48426c8760e7a173541b3e8890d02e7566be1cbf69fd826a56e885" Mar 18 17:42:14 crc kubenswrapper[4939]: I0318 17:42:14.180889 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:42:14 crc kubenswrapper[4939]: I0318 17:42:14.236622 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 17:42:14 crc kubenswrapper[4939]: I0318 17:42:14.820899 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-krc58"] Mar 18 17:42:14 crc kubenswrapper[4939]: I0318 17:42:14.990045 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wq6xx"] Mar 18 17:42:14 crc kubenswrapper[4939]: I0318 17:42:14.990332 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wq6xx" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="registry-server" containerID="cri-o://792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e" gracePeriod=2 Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.523432 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.702810 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-catalog-content\") pod \"114d30cb-4856-4d24-919a-1726f288db27\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.703022 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-utilities\") pod \"114d30cb-4856-4d24-919a-1726f288db27\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.703093 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pthj5\" (UniqueName: \"kubernetes.io/projected/114d30cb-4856-4d24-919a-1726f288db27-kube-api-access-pthj5\") pod \"114d30cb-4856-4d24-919a-1726f288db27\" (UID: \"114d30cb-4856-4d24-919a-1726f288db27\") " Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.704163 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-utilities" (OuterVolumeSpecName: "utilities") pod "114d30cb-4856-4d24-919a-1726f288db27" (UID: "114d30cb-4856-4d24-919a-1726f288db27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.709782 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/114d30cb-4856-4d24-919a-1726f288db27-kube-api-access-pthj5" (OuterVolumeSpecName: "kube-api-access-pthj5") pod "114d30cb-4856-4d24-919a-1726f288db27" (UID: "114d30cb-4856-4d24-919a-1726f288db27"). InnerVolumeSpecName "kube-api-access-pthj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.805813 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.805855 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pthj5\" (UniqueName: \"kubernetes.io/projected/114d30cb-4856-4d24-919a-1726f288db27-kube-api-access-pthj5\") on node \"crc\" DevicePath \"\"" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.868686 4939 generic.go:334] "Generic (PLEG): container finished" podID="114d30cb-4856-4d24-919a-1726f288db27" containerID="792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e" exitCode=0 Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.868742 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wq6xx" event={"ID":"114d30cb-4856-4d24-919a-1726f288db27","Type":"ContainerDied","Data":"792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e"} Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.868773 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wq6xx" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.868803 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wq6xx" event={"ID":"114d30cb-4856-4d24-919a-1726f288db27","Type":"ContainerDied","Data":"045942dcb83c57e1dd3f0fb133a46c8ed001ca48998ad0674830e956713c19c1"} Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.868832 4939 scope.go:117] "RemoveContainer" containerID="792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.892617 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "114d30cb-4856-4d24-919a-1726f288db27" (UID: "114d30cb-4856-4d24-919a-1726f288db27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.900401 4939 scope.go:117] "RemoveContainer" containerID="31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.914026 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/114d30cb-4856-4d24-919a-1726f288db27-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.929661 4939 scope.go:117] "RemoveContainer" containerID="55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.974754 4939 scope.go:117] "RemoveContainer" containerID="792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e" Mar 18 17:42:15 crc kubenswrapper[4939]: E0318 17:42:15.975223 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e\": container with ID starting with 792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e not found: ID does not exist" containerID="792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.975256 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e"} err="failed to get container status \"792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e\": rpc error: code = NotFound desc = could not find container \"792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e\": container with ID starting with 792a9b319c7075febfd84346ef450ebb97562e3fbb83a4112c1dc85edd62073e not found: ID does not exist" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.975277 4939 scope.go:117] "RemoveContainer" containerID="31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed" Mar 18 17:42:15 crc kubenswrapper[4939]: E0318 17:42:15.975732 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed\": container with ID starting with 31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed not found: ID does not exist" containerID="31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed" Mar 18 17:42:15 crc 
kubenswrapper[4939]: I0318 17:42:15.975784 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed"} err="failed to get container status \"31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed\": rpc error: code = NotFound desc = could not find container \"31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed\": container with ID starting with 31d34a20cb4c3831e316a0ea830d3ef91559e7fe9ca9ca9c6b7d3162bb2063ed not found: ID does not exist" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.975820 4939 scope.go:117] "RemoveContainer" containerID="55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3" Mar 18 17:42:15 crc kubenswrapper[4939]: E0318 17:42:15.976204 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3\": container with ID starting with 55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3 not found: ID does not exist" containerID="55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3" Mar 18 17:42:15 crc kubenswrapper[4939]: I0318 17:42:15.976252 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3"} err="failed to get container status \"55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3\": rpc error: code = NotFound desc = could not find container \"55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3\": container with ID starting with 55138872b05fd7a6fa13d673703c9d39f8447637541445860680d706700869f3 not found: ID does not exist" Mar 18 17:42:16 crc kubenswrapper[4939]: I0318 17:42:16.201703 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wq6xx"] Mar 18 17:42:16 crc kubenswrapper[4939]: I0318 17:42:16.221344 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wq6xx"] Mar 18 17:42:18 crc kubenswrapper[4939]: I0318 17:42:18.151115 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="114d30cb-4856-4d24-919a-1726f288db27" path="/var/lib/kubelet/pods/114d30cb-4856-4d24-919a-1726f288db27/volumes" Mar 18 17:42:20 crc kubenswrapper[4939]: I0318 17:42:20.134148 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:42:20 crc kubenswrapper[4939]: E0318 17:42:20.134778 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:42:31 crc kubenswrapper[4939]: I0318 17:42:31.134071 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:42:31 crc kubenswrapper[4939]: E0318 17:42:31.135020 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
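
The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triplets above are benign: the kubelet asks CRI-O to delete containers that a concurrent cleanup has already removed, and a gRPC NotFound leaves the node in the desired end state anyway. A minimal Go sketch of that idempotent-delete pattern follows; runtimeClient is a stand-in interface for the CRI runtime service, not the kubelet's actual code.

    package main

    import (
        "context"
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // runtimeClient stands in for the CRI runtime service the kubelet talks to.
    type runtimeClient interface {
        RemoveContainer(ctx context.Context, containerID string) error
    }

    // removeIfPresent deletes a container but treats gRPC NotFound as success:
    // if the container is already gone, the desired end state has been reached.
    func removeIfPresent(ctx context.Context, rt runtimeClient, id string) error {
        if err := rt.RemoveContainer(ctx, id); err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    // notFoundRuntime simulates CRI-O answering for an already-deleted container.
    type notFoundRuntime struct{}

    func (notFoundRuntime) RemoveContainer(ctx context.Context, id string) error {
        return status.Error(codes.NotFound, "could not find container "+id)
    }

    func main() {
        err := removeIfPresent(context.Background(), notFoundRuntime{}, "792a9b319c70")
        fmt.Println("cleanup error:", err) // prints: cleanup error: <nil>
    }
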
Mar 18 17:42:42 crc kubenswrapper[4939]: I0318 17:42:42.137635 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73"
Mar 18 17:42:42 crc kubenswrapper[4939]: E0318 17:42:42.138897 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:42:57 crc kubenswrapper[4939]: I0318 17:42:57.133337 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73"
Mar 18 17:42:57 crc kubenswrapper[4939]: E0318 17:42:57.134093 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 17:43:02 crc kubenswrapper[4939]: I0318 17:43:02.368520 4939 generic.go:334] "Generic (PLEG): container finished" podID="d8999dd3-4065-4ecc-b797-e62aa63e1bcb" containerID="070e37668983703bbd9d63093c2719898dad8ff8c2dc0e7302fe2aa9cc723def" exitCode=0
Mar 18 17:43:02 crc kubenswrapper[4939]: I0318 17:43:02.368951 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" event={"ID":"d8999dd3-4065-4ecc-b797-e62aa63e1bcb","Type":"ContainerDied","Data":"070e37668983703bbd9d63093c2719898dad8ff8c2dc0e7302fe2aa9cc723def"}
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.818519 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm"
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.946400 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-inventory\") pod \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") "
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.947199 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ssh-key-openstack-cell1\") pod \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") "
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.947264 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ceph\") pod \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") "
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.947386 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6szpz\" (UniqueName: \"kubernetes.io/projected/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-kube-api-access-6szpz\") pod \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\" (UID: \"d8999dd3-4065-4ecc-b797-e62aa63e1bcb\") "
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.953804 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ceph" (OuterVolumeSpecName: "ceph") pod "d8999dd3-4065-4ecc-b797-e62aa63e1bcb" (UID: "d8999dd3-4065-4ecc-b797-e62aa63e1bcb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.953841 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-kube-api-access-6szpz" (OuterVolumeSpecName: "kube-api-access-6szpz") pod "d8999dd3-4065-4ecc-b797-e62aa63e1bcb" (UID: "d8999dd3-4065-4ecc-b797-e62aa63e1bcb"). InnerVolumeSpecName "kube-api-access-6szpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.978647 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-inventory" (OuterVolumeSpecName: "inventory") pod "d8999dd3-4065-4ecc-b797-e62aa63e1bcb" (UID: "d8999dd3-4065-4ecc-b797-e62aa63e1bcb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:43:03 crc kubenswrapper[4939]: I0318 17:43:03.984945 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d8999dd3-4065-4ecc-b797-e62aa63e1bcb" (UID: "d8999dd3-4065-4ecc-b797-e62aa63e1bcb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.050631 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.050673 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6szpz\" (UniqueName: \"kubernetes.io/projected/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-kube-api-access-6szpz\") on node \"crc\" DevicePath \"\""
Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.050688 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.050701 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8999dd3-4065-4ecc-b797-e62aa63e1bcb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.391464 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" event={"ID":"d8999dd3-4065-4ecc-b797-e62aa63e1bcb","Type":"ContainerDied","Data":"0d5f8d19f2fc76b8407cef8c1b792364bdd25edade55092066594262f825dc21"}
Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.391539 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d5f8d19f2fc76b8407cef8c1b792364bdd25edade55092066594262f825dc21"
Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.391607 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm"
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f9cwm" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.504253 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-8zvhb"] Mar 18 17:43:04 crc kubenswrapper[4939]: E0318 17:43:04.505046 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8999dd3-4065-4ecc-b797-e62aa63e1bcb" containerName="configure-network-openstack-openstack-cell1" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505066 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8999dd3-4065-4ecc-b797-e62aa63e1bcb" containerName="configure-network-openstack-openstack-cell1" Mar 18 17:43:04 crc kubenswrapper[4939]: E0318 17:43:04.505088 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="registry-server" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505095 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="registry-server" Mar 18 17:43:04 crc kubenswrapper[4939]: E0318 17:43:04.505132 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="extract-utilities" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505141 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="extract-utilities" Mar 18 17:43:04 crc kubenswrapper[4939]: E0318 17:43:04.505163 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="extract-content" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505170 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="extract-content" Mar 18 17:43:04 crc kubenswrapper[4939]: E0318 17:43:04.505187 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560019e0-f873-4da0-ac07-fdb41015f2eb" containerName="oc" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505195 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="560019e0-f873-4da0-ac07-fdb41015f2eb" containerName="oc" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505428 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="114d30cb-4856-4d24-919a-1726f288db27" containerName="registry-server" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505447 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8999dd3-4065-4ecc-b797-e62aa63e1bcb" containerName="configure-network-openstack-openstack-cell1" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.505455 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="560019e0-f873-4da0-ac07-fdb41015f2eb" containerName="oc" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.506330 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.508494 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.508978 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.510051 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.510427 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.588611 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-8zvhb"] Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.679414 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ceph\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.679481 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.679546 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmjs\" (UniqueName: \"kubernetes.io/projected/2d2a4c53-2fed-4077-b7cd-41720e50faa5-kube-api-access-gwmjs\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.679573 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-inventory\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.782182 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmjs\" (UniqueName: \"kubernetes.io/projected/2d2a4c53-2fed-4077-b7cd-41720e50faa5-kube-api-access-gwmjs\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.782244 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-inventory\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " 
pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.782559 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ceph\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.782597 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.788985 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.789943 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ceph\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.796247 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-inventory\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.807881 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwmjs\" (UniqueName: \"kubernetes.io/projected/2d2a4c53-2fed-4077-b7cd-41720e50faa5-kube-api-access-gwmjs\") pod \"validate-network-openstack-openstack-cell1-8zvhb\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:04 crc kubenswrapper[4939]: I0318 17:43:04.880198 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:05 crc kubenswrapper[4939]: I0318 17:43:05.642477 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-8zvhb"] Mar 18 17:43:06 crc kubenswrapper[4939]: I0318 17:43:06.439573 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" event={"ID":"2d2a4c53-2fed-4077-b7cd-41720e50faa5","Type":"ContainerStarted","Data":"4b7b4d2e310720e9041943942db775d756cb83d62b3477deb5c5288c7b1fe39b"} Mar 18 17:43:06 crc kubenswrapper[4939]: I0318 17:43:06.439890 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" event={"ID":"2d2a4c53-2fed-4077-b7cd-41720e50faa5","Type":"ContainerStarted","Data":"bbb8300df62c6cee3e715d36096a122d668ab453c6b7c0f82b0419182e258d28"} Mar 18 17:43:06 crc kubenswrapper[4939]: I0318 17:43:06.484774 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" podStartSLOduration=2.257909729 podStartE2EDuration="2.484753954s" podCreationTimestamp="2026-03-18 17:43:04 +0000 UTC" firstStartedPulling="2026-03-18 17:43:05.650780772 +0000 UTC m=+7550.249968393" lastFinishedPulling="2026-03-18 17:43:05.877624997 +0000 UTC m=+7550.476812618" observedRunningTime="2026-03-18 17:43:06.473486293 +0000 UTC m=+7551.072673914" watchObservedRunningTime="2026-03-18 17:43:06.484753954 +0000 UTC m=+7551.083941575" Mar 18 17:43:11 crc kubenswrapper[4939]: I0318 17:43:11.133753 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:43:11 crc kubenswrapper[4939]: E0318 17:43:11.134523 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:43:11 crc kubenswrapper[4939]: I0318 17:43:11.484925 4939 generic.go:334] "Generic (PLEG): container finished" podID="2d2a4c53-2fed-4077-b7cd-41720e50faa5" containerID="4b7b4d2e310720e9041943942db775d756cb83d62b3477deb5c5288c7b1fe39b" exitCode=0 Mar 18 17:43:11 crc kubenswrapper[4939]: I0318 17:43:11.484968 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" event={"ID":"2d2a4c53-2fed-4077-b7cd-41720e50faa5","Type":"ContainerDied","Data":"4b7b4d2e310720e9041943942db775d756cb83d62b3477deb5c5288c7b1fe39b"} Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.094709 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.163256 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ssh-key-openstack-cell1\") pod \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.163468 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-inventory\") pod \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.163652 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwmjs\" (UniqueName: \"kubernetes.io/projected/2d2a4c53-2fed-4077-b7cd-41720e50faa5-kube-api-access-gwmjs\") pod \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.163790 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ceph\") pod \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\" (UID: \"2d2a4c53-2fed-4077-b7cd-41720e50faa5\") " Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.168766 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ceph" (OuterVolumeSpecName: "ceph") pod "2d2a4c53-2fed-4077-b7cd-41720e50faa5" (UID: "2d2a4c53-2fed-4077-b7cd-41720e50faa5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.178785 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2a4c53-2fed-4077-b7cd-41720e50faa5-kube-api-access-gwmjs" (OuterVolumeSpecName: "kube-api-access-gwmjs") pod "2d2a4c53-2fed-4077-b7cd-41720e50faa5" (UID: "2d2a4c53-2fed-4077-b7cd-41720e50faa5"). InnerVolumeSpecName "kube-api-access-gwmjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.193145 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "2d2a4c53-2fed-4077-b7cd-41720e50faa5" (UID: "2d2a4c53-2fed-4077-b7cd-41720e50faa5"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.195934 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-inventory" (OuterVolumeSpecName: "inventory") pod "2d2a4c53-2fed-4077-b7cd-41720e50faa5" (UID: "2d2a4c53-2fed-4077-b7cd-41720e50faa5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.267531 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.267569 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.267582 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2a4c53-2fed-4077-b7cd-41720e50faa5-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.267594 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwmjs\" (UniqueName: \"kubernetes.io/projected/2d2a4c53-2fed-4077-b7cd-41720e50faa5-kube-api-access-gwmjs\") on node \"crc\" DevicePath \"\"" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.508979 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" event={"ID":"2d2a4c53-2fed-4077-b7cd-41720e50faa5","Type":"ContainerDied","Data":"bbb8300df62c6cee3e715d36096a122d668ab453c6b7c0f82b0419182e258d28"} Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.509271 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbb8300df62c6cee3e715d36096a122d668ab453c6b7c0f82b0419182e258d28" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.509039 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8zvhb" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.575574 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-hwz5d"] Mar 18 17:43:13 crc kubenswrapper[4939]: E0318 17:43:13.576229 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2a4c53-2fed-4077-b7cd-41720e50faa5" containerName="validate-network-openstack-openstack-cell1" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.576263 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2a4c53-2fed-4077-b7cd-41720e50faa5" containerName="validate-network-openstack-openstack-cell1" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.576712 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2a4c53-2fed-4077-b7cd-41720e50faa5" containerName="validate-network-openstack-openstack-cell1" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.577923 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.583713 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.584134 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.584231 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.586332 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.590184 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-hwz5d"] Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.674812 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-inventory\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.674909 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc6rv\" (UniqueName: \"kubernetes.io/projected/a2bb63d4-ae06-4c42-b037-202212427175-kube-api-access-kc6rv\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.675246 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ceph\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.675463 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.777409 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-inventory\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.779217 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc6rv\" (UniqueName: \"kubernetes.io/projected/a2bb63d4-ae06-4c42-b037-202212427175-kube-api-access-kc6rv\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc 
kubenswrapper[4939]: I0318 17:43:13.779590 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ceph\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.779806 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.783143 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ceph\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.783666 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-inventory\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.784211 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.801840 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc6rv\" (UniqueName: \"kubernetes.io/projected/a2bb63d4-ae06-4c42-b037-202212427175-kube-api-access-kc6rv\") pod \"install-os-openstack-openstack-cell1-hwz5d\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:13 crc kubenswrapper[4939]: I0318 17:43:13.895693 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:43:14 crc kubenswrapper[4939]: I0318 17:43:14.449242 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-hwz5d"] Mar 18 17:43:14 crc kubenswrapper[4939]: I0318 17:43:14.517997 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" event={"ID":"a2bb63d4-ae06-4c42-b037-202212427175","Type":"ContainerStarted","Data":"1d803fbaad8ce6c79af8efbedb48513095882c88f679b80b4198728e8488fd85"} Mar 18 17:43:15 crc kubenswrapper[4939]: I0318 17:43:15.531164 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" event={"ID":"a2bb63d4-ae06-4c42-b037-202212427175","Type":"ContainerStarted","Data":"900d092bc71e6bd2a155775f9cdfe1494c9ab54be39f8a1f418876d60160a4cf"} Mar 18 17:43:15 crc kubenswrapper[4939]: I0318 17:43:15.567647 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" podStartSLOduration=2.388286059 podStartE2EDuration="2.567617994s" podCreationTimestamp="2026-03-18 17:43:13 +0000 UTC" firstStartedPulling="2026-03-18 17:43:14.46265263 +0000 UTC m=+7559.061840241" lastFinishedPulling="2026-03-18 17:43:14.641984565 +0000 UTC m=+7559.241172176" observedRunningTime="2026-03-18 17:43:15.555184947 +0000 UTC m=+7560.154372588" watchObservedRunningTime="2026-03-18 17:43:15.567617994 +0000 UTC m=+7560.166805645" Mar 18 17:43:26 crc kubenswrapper[4939]: I0318 17:43:26.140795 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:43:26 crc kubenswrapper[4939]: E0318 17:43:26.141886 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:43:39 crc kubenswrapper[4939]: I0318 17:43:39.132721 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:43:39 crc kubenswrapper[4939]: E0318 17:43:39.133519 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:43:50 crc kubenswrapper[4939]: I0318 17:43:50.134055 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:43:50 crc kubenswrapper[4939]: E0318 17:43:50.135125 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" 
podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.149326 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564264-4jh8p"] Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.151240 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564264-4jh8p" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.153902 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.154665 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.155014 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.166145 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564264-4jh8p"] Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.244278 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2btj\" (UniqueName: \"kubernetes.io/projected/75829c57-7661-4729-9d85-6880e9c590d9-kube-api-access-m2btj\") pod \"auto-csr-approver-29564264-4jh8p\" (UID: \"75829c57-7661-4729-9d85-6880e9c590d9\") " pod="openshift-infra/auto-csr-approver-29564264-4jh8p" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.346201 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2btj\" (UniqueName: \"kubernetes.io/projected/75829c57-7661-4729-9d85-6880e9c590d9-kube-api-access-m2btj\") pod \"auto-csr-approver-29564264-4jh8p\" (UID: \"75829c57-7661-4729-9d85-6880e9c590d9\") " pod="openshift-infra/auto-csr-approver-29564264-4jh8p" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.371974 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2btj\" (UniqueName: \"kubernetes.io/projected/75829c57-7661-4729-9d85-6880e9c590d9-kube-api-access-m2btj\") pod \"auto-csr-approver-29564264-4jh8p\" (UID: \"75829c57-7661-4729-9d85-6880e9c590d9\") " pod="openshift-infra/auto-csr-approver-29564264-4jh8p" Mar 18 17:44:00 crc kubenswrapper[4939]: I0318 17:44:00.614464 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564264-4jh8p" Mar 18 17:44:01 crc kubenswrapper[4939]: I0318 17:44:01.114304 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564264-4jh8p"] Mar 18 17:44:02 crc kubenswrapper[4939]: I0318 17:44:02.000276 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564264-4jh8p" event={"ID":"75829c57-7661-4729-9d85-6880e9c590d9","Type":"ContainerStarted","Data":"a7c24f4897699fd570f07834c4808c14f894e2c23409e228fdbd671c3d7ead2a"} Mar 18 17:44:02 crc kubenswrapper[4939]: I0318 17:44:02.135157 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:44:02 crc kubenswrapper[4939]: E0318 17:44:02.135412 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:44:03 crc kubenswrapper[4939]: I0318 17:44:03.015173 4939 generic.go:334] "Generic (PLEG): container finished" podID="75829c57-7661-4729-9d85-6880e9c590d9" containerID="32254f0846e0726affea67279b11c25d4d06a36e27a536c8147d49d156aed5e3" exitCode=0 Mar 18 17:44:03 crc kubenswrapper[4939]: I0318 17:44:03.015229 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564264-4jh8p" event={"ID":"75829c57-7661-4729-9d85-6880e9c590d9","Type":"ContainerDied","Data":"32254f0846e0726affea67279b11c25d4d06a36e27a536c8147d49d156aed5e3"} Mar 18 17:44:03 crc kubenswrapper[4939]: I0318 17:44:03.018937 4939 generic.go:334] "Generic (PLEG): container finished" podID="a2bb63d4-ae06-4c42-b037-202212427175" containerID="900d092bc71e6bd2a155775f9cdfe1494c9ab54be39f8a1f418876d60160a4cf" exitCode=0 Mar 18 17:44:03 crc kubenswrapper[4939]: I0318 17:44:03.018975 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" event={"ID":"a2bb63d4-ae06-4c42-b037-202212427175","Type":"ContainerDied","Data":"900d092bc71e6bd2a155775f9cdfe1494c9ab54be39f8a1f418876d60160a4cf"} Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.591498 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564264-4jh8p" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.598484 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.750357 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2btj\" (UniqueName: \"kubernetes.io/projected/75829c57-7661-4729-9d85-6880e9c590d9-kube-api-access-m2btj\") pod \"75829c57-7661-4729-9d85-6880e9c590d9\" (UID: \"75829c57-7661-4729-9d85-6880e9c590d9\") " Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.750435 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc6rv\" (UniqueName: \"kubernetes.io/projected/a2bb63d4-ae06-4c42-b037-202212427175-kube-api-access-kc6rv\") pod \"a2bb63d4-ae06-4c42-b037-202212427175\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.750564 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-inventory\") pod \"a2bb63d4-ae06-4c42-b037-202212427175\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.750656 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ssh-key-openstack-cell1\") pod \"a2bb63d4-ae06-4c42-b037-202212427175\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.750843 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ceph\") pod \"a2bb63d4-ae06-4c42-b037-202212427175\" (UID: \"a2bb63d4-ae06-4c42-b037-202212427175\") " Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.756968 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bb63d4-ae06-4c42-b037-202212427175-kube-api-access-kc6rv" (OuterVolumeSpecName: "kube-api-access-kc6rv") pod "a2bb63d4-ae06-4c42-b037-202212427175" (UID: "a2bb63d4-ae06-4c42-b037-202212427175"). InnerVolumeSpecName "kube-api-access-kc6rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.757679 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ceph" (OuterVolumeSpecName: "ceph") pod "a2bb63d4-ae06-4c42-b037-202212427175" (UID: "a2bb63d4-ae06-4c42-b037-202212427175"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.758043 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75829c57-7661-4729-9d85-6880e9c590d9-kube-api-access-m2btj" (OuterVolumeSpecName: "kube-api-access-m2btj") pod "75829c57-7661-4729-9d85-6880e9c590d9" (UID: "75829c57-7661-4729-9d85-6880e9c590d9"). InnerVolumeSpecName "kube-api-access-m2btj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.785047 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-inventory" (OuterVolumeSpecName: "inventory") pod "a2bb63d4-ae06-4c42-b037-202212427175" (UID: "a2bb63d4-ae06-4c42-b037-202212427175"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.794129 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a2bb63d4-ae06-4c42-b037-202212427175" (UID: "a2bb63d4-ae06-4c42-b037-202212427175"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.853403 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.853443 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.853459 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a2bb63d4-ae06-4c42-b037-202212427175-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.853473 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2btj\" (UniqueName: \"kubernetes.io/projected/75829c57-7661-4729-9d85-6880e9c590d9-kube-api-access-m2btj\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:04 crc kubenswrapper[4939]: I0318 17:44:04.853485 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc6rv\" (UniqueName: \"kubernetes.io/projected/a2bb63d4-ae06-4c42-b037-202212427175-kube-api-access-kc6rv\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.043066 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564264-4jh8p" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.043062 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564264-4jh8p" event={"ID":"75829c57-7661-4729-9d85-6880e9c590d9","Type":"ContainerDied","Data":"a7c24f4897699fd570f07834c4808c14f894e2c23409e228fdbd671c3d7ead2a"} Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.043215 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7c24f4897699fd570f07834c4808c14f894e2c23409e228fdbd671c3d7ead2a" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.044931 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" event={"ID":"a2bb63d4-ae06-4c42-b037-202212427175","Type":"ContainerDied","Data":"1d803fbaad8ce6c79af8efbedb48513095882c88f679b80b4198728e8488fd85"} Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.045040 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d803fbaad8ce6c79af8efbedb48513095882c88f679b80b4198728e8488fd85" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.045048 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hwz5d" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.141182 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-8x8pz"] Mar 18 17:44:05 crc kubenswrapper[4939]: E0318 17:44:05.141727 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75829c57-7661-4729-9d85-6880e9c590d9" containerName="oc" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.141751 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="75829c57-7661-4729-9d85-6880e9c590d9" containerName="oc" Mar 18 17:44:05 crc kubenswrapper[4939]: E0318 17:44:05.141809 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2bb63d4-ae06-4c42-b037-202212427175" containerName="install-os-openstack-openstack-cell1" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.141821 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2bb63d4-ae06-4c42-b037-202212427175" containerName="install-os-openstack-openstack-cell1" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.142072 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="75829c57-7661-4729-9d85-6880e9c590d9" containerName="oc" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.142108 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2bb63d4-ae06-4c42-b037-202212427175" containerName="install-os-openstack-openstack-cell1" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.143068 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.144913 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.145014 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.145878 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.146807 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.153551 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-8x8pz"] Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.261468 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-inventory\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.261772 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ceph\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.261818 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5t9mr\" (UniqueName: \"kubernetes.io/projected/efc5f76b-7c2a-439b-b176-cc74498f19c4-kube-api-access-5t9mr\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.261839 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.363626 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ceph\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.364028 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t9mr\" (UniqueName: \"kubernetes.io/projected/efc5f76b-7c2a-439b-b176-cc74498f19c4-kube-api-access-5t9mr\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.364064 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.364205 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-inventory\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.368661 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ceph\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.373216 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.373417 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-inventory\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: 
\"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.383818 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t9mr\" (UniqueName: \"kubernetes.io/projected/efc5f76b-7c2a-439b-b176-cc74498f19c4-kube-api-access-5t9mr\") pod \"configure-os-openstack-openstack-cell1-8x8pz\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:05 crc kubenswrapper[4939]: I0318 17:44:05.462870 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:06 crc kubenswrapper[4939]: I0318 17:44:06.551593 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564258-fbq82"] Mar 18 17:44:06 crc kubenswrapper[4939]: I0318 17:44:06.560530 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564258-fbq82"] Mar 18 17:44:06 crc kubenswrapper[4939]: I0318 17:44:06.875863 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-8x8pz"] Mar 18 17:44:07 crc kubenswrapper[4939]: I0318 17:44:07.551317 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" event={"ID":"efc5f76b-7c2a-439b-b176-cc74498f19c4","Type":"ContainerStarted","Data":"8eab13e006c275619e6820724c2a82710e212a093221c7476d4731aee78bd8f4"} Mar 18 17:44:07 crc kubenswrapper[4939]: I0318 17:44:07.551683 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" event={"ID":"efc5f76b-7c2a-439b-b176-cc74498f19c4","Type":"ContainerStarted","Data":"01236455aa2a0dcf7f8cf8cdde4ecfdc0589476d9f86591fca0d10592c74a930"} Mar 18 17:44:07 crc kubenswrapper[4939]: I0318 17:44:07.595027 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" podStartSLOduration=2.325610392 podStartE2EDuration="2.594983412s" podCreationTimestamp="2026-03-18 17:44:05 +0000 UTC" firstStartedPulling="2026-03-18 17:44:06.865430345 +0000 UTC m=+7611.464617956" lastFinishedPulling="2026-03-18 17:44:07.134803345 +0000 UTC m=+7611.733990976" observedRunningTime="2026-03-18 17:44:07.575208947 +0000 UTC m=+7612.174396578" watchObservedRunningTime="2026-03-18 17:44:07.594983412 +0000 UTC m=+7612.194171043" Mar 18 17:44:08 crc kubenswrapper[4939]: I0318 17:44:08.146998 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b425342a-9f1f-4688-83c0-6a38872386e8" path="/var/lib/kubelet/pods/b425342a-9f1f-4688-83c0-6a38872386e8/volumes" Mar 18 17:44:10 crc kubenswrapper[4939]: I0318 17:44:10.692774 4939 scope.go:117] "RemoveContainer" containerID="569b5c2d2eadc1656949f726e482cd35a89d8d015e385700a18f8f0d1bd37083" Mar 18 17:44:15 crc kubenswrapper[4939]: I0318 17:44:15.134305 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:44:15 crc kubenswrapper[4939]: E0318 17:44:15.137642 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:44:26 crc kubenswrapper[4939]: I0318 17:44:26.133255 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:44:26 crc kubenswrapper[4939]: E0318 17:44:26.134102 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:44:40 crc kubenswrapper[4939]: I0318 17:44:40.133623 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:44:40 crc kubenswrapper[4939]: E0318 17:44:40.134278 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:44:53 crc kubenswrapper[4939]: I0318 17:44:53.133113 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:44:53 crc kubenswrapper[4939]: E0318 17:44:53.133874 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:44:55 crc kubenswrapper[4939]: I0318 17:44:55.197721 4939 generic.go:334] "Generic (PLEG): container finished" podID="efc5f76b-7c2a-439b-b176-cc74498f19c4" containerID="8eab13e006c275619e6820724c2a82710e212a093221c7476d4731aee78bd8f4" exitCode=0 Mar 18 17:44:55 crc kubenswrapper[4939]: I0318 17:44:55.198198 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" event={"ID":"efc5f76b-7c2a-439b-b176-cc74498f19c4","Type":"ContainerDied","Data":"8eab13e006c275619e6820724c2a82710e212a093221c7476d4731aee78bd8f4"} Mar 18 17:44:56 crc kubenswrapper[4939]: I0318 17:44:56.782302 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:56 crc kubenswrapper[4939]: I0318 17:44:56.964620 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1\") pod \"efc5f76b-7c2a-439b-b176-cc74498f19c4\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " Mar 18 17:44:56 crc kubenswrapper[4939]: I0318 17:44:56.964919 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t9mr\" (UniqueName: \"kubernetes.io/projected/efc5f76b-7c2a-439b-b176-cc74498f19c4-kube-api-access-5t9mr\") pod \"efc5f76b-7c2a-439b-b176-cc74498f19c4\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " Mar 18 17:44:56 crc kubenswrapper[4939]: I0318 17:44:56.964943 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-inventory\") pod \"efc5f76b-7c2a-439b-b176-cc74498f19c4\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " Mar 18 17:44:56 crc kubenswrapper[4939]: I0318 17:44:56.965082 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ceph\") pod \"efc5f76b-7c2a-439b-b176-cc74498f19c4\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " Mar 18 17:44:56 crc kubenswrapper[4939]: I0318 17:44:56.970456 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc5f76b-7c2a-439b-b176-cc74498f19c4-kube-api-access-5t9mr" (OuterVolumeSpecName: "kube-api-access-5t9mr") pod "efc5f76b-7c2a-439b-b176-cc74498f19c4" (UID: "efc5f76b-7c2a-439b-b176-cc74498f19c4"). InnerVolumeSpecName "kube-api-access-5t9mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:44:56 crc kubenswrapper[4939]: I0318 17:44:56.978768 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ceph" (OuterVolumeSpecName: "ceph") pod "efc5f76b-7c2a-439b-b176-cc74498f19c4" (UID: "efc5f76b-7c2a-439b-b176-cc74498f19c4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:44:57 crc kubenswrapper[4939]: E0318 17:44:57.004015 4939 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1 podName:efc5f76b-7c2a-439b-b176-cc74498f19c4 nodeName:}" failed. No retries permitted until 2026-03-18 17:44:57.503982632 +0000 UTC m=+7662.103170253 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-cell1" (UniqueName: "kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1") pod "efc5f76b-7c2a-439b-b176-cc74498f19c4" (UID: "efc5f76b-7c2a-439b-b176-cc74498f19c4") : error deleting /var/lib/kubelet/pods/efc5f76b-7c2a-439b-b176-cc74498f19c4/volume-subpaths: remove /var/lib/kubelet/pods/efc5f76b-7c2a-439b-b176-cc74498f19c4/volume-subpaths: no such file or directory Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.006980 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-inventory" (OuterVolumeSpecName: "inventory") pod "efc5f76b-7c2a-439b-b176-cc74498f19c4" (UID: "efc5f76b-7c2a-439b-b176-cc74498f19c4"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.067695 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t9mr\" (UniqueName: \"kubernetes.io/projected/efc5f76b-7c2a-439b-b176-cc74498f19c4-kube-api-access-5t9mr\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.067749 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.067768 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.217934 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" event={"ID":"efc5f76b-7c2a-439b-b176-cc74498f19c4","Type":"ContainerDied","Data":"01236455aa2a0dcf7f8cf8cdde4ecfdc0589476d9f86591fca0d10592c74a930"} Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.217984 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01236455aa2a0dcf7f8cf8cdde4ecfdc0589476d9f86591fca0d10592c74a930" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.218053 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-8x8pz" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.317044 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-64x5r"] Mar 18 17:44:57 crc kubenswrapper[4939]: E0318 17:44:57.317797 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc5f76b-7c2a-439b-b176-cc74498f19c4" containerName="configure-os-openstack-openstack-cell1" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.317869 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc5f76b-7c2a-439b-b176-cc74498f19c4" containerName="configure-os-openstack-openstack-cell1" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.318124 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc5f76b-7c2a-439b-b176-cc74498f19c4" containerName="configure-os-openstack-openstack-cell1" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.319070 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.333995 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-64x5r"] Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.476624 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.476672 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-inventory-0\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.476794 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ceph\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.476940 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4z9\" (UniqueName: \"kubernetes.io/projected/89d16573-9f34-4527-9799-f23dd1e42898-kube-api-access-nn4z9\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.578500 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1\") pod \"efc5f76b-7c2a-439b-b176-cc74498f19c4\" (UID: \"efc5f76b-7c2a-439b-b176-cc74498f19c4\") " Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.579038 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ceph\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.579099 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4z9\" (UniqueName: \"kubernetes.io/projected/89d16573-9f34-4527-9799-f23dd1e42898-kube-api-access-nn4z9\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.579256 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.579286 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-inventory-0\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.582868 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "efc5f76b-7c2a-439b-b176-cc74498f19c4" (UID: "efc5f76b-7c2a-439b-b176-cc74498f19c4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.583747 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-inventory-0\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.583912 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ceph\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.593161 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.609294 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4z9\" (UniqueName: \"kubernetes.io/projected/89d16573-9f34-4527-9799-f23dd1e42898-kube-api-access-nn4z9\") pod \"ssh-known-hosts-openstack-64x5r\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.681095 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efc5f76b-7c2a-439b-b176-cc74498f19c4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:44:57 crc kubenswrapper[4939]: I0318 17:44:57.683457 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:44:58 crc kubenswrapper[4939]: I0318 17:44:58.286983 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-64x5r"] Mar 18 17:44:58 crc kubenswrapper[4939]: I0318 17:44:58.303610 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:44:59 crc kubenswrapper[4939]: I0318 17:44:59.245677 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-64x5r" event={"ID":"89d16573-9f34-4527-9799-f23dd1e42898","Type":"ContainerStarted","Data":"97e41a252805474313a54c0c6b2df07b8189ade4b02ce3d8f911efd16f3df71b"} Mar 18 17:44:59 crc kubenswrapper[4939]: I0318 17:44:59.246030 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-64x5r" event={"ID":"89d16573-9f34-4527-9799-f23dd1e42898","Type":"ContainerStarted","Data":"5c657f1008103bebff3c11618884207adf6694fe439212edeb83f5a8de095c59"} Mar 18 17:44:59 crc kubenswrapper[4939]: I0318 17:44:59.274558 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-64x5r" podStartSLOduration=2.083189257 podStartE2EDuration="2.274533036s" podCreationTimestamp="2026-03-18 17:44:57 +0000 UTC" firstStartedPulling="2026-03-18 17:44:58.30337804 +0000 UTC m=+7662.902565661" lastFinishedPulling="2026-03-18 17:44:58.494721819 +0000 UTC m=+7663.093909440" observedRunningTime="2026-03-18 17:44:59.262807911 +0000 UTC m=+7663.861995572" watchObservedRunningTime="2026-03-18 17:44:59.274533036 +0000 UTC m=+7663.873720677" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.157231 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7"] Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.158950 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.161729 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.162399 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.173336 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7"] Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.243629 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7szq\" (UniqueName: \"kubernetes.io/projected/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-kube-api-access-l7szq\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.243706 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-secret-volume\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.243890 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-config-volume\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.350019 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-config-volume\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.350118 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7szq\" (UniqueName: \"kubernetes.io/projected/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-kube-api-access-l7szq\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.350163 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-secret-volume\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.351700 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-config-volume\") pod 
\"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.375381 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-secret-volume\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.387408 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7szq\" (UniqueName: \"kubernetes.io/projected/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-kube-api-access-l7szq\") pod \"collect-profiles-29564265-k4tj7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:00 crc kubenswrapper[4939]: I0318 17:45:00.509286 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:01 crc kubenswrapper[4939]: I0318 17:45:01.345015 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7"] Mar 18 17:45:02 crc kubenswrapper[4939]: I0318 17:45:02.284148 4939 generic.go:334] "Generic (PLEG): container finished" podID="ba3a9e8d-1669-476a-9372-3575f9bfa4d7" containerID="1274e5e75619c53ee883310133a6f9c62f31eee55715afe1e2486ac7b8766915" exitCode=0 Mar 18 17:45:02 crc kubenswrapper[4939]: I0318 17:45:02.284601 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" event={"ID":"ba3a9e8d-1669-476a-9372-3575f9bfa4d7","Type":"ContainerDied","Data":"1274e5e75619c53ee883310133a6f9c62f31eee55715afe1e2486ac7b8766915"} Mar 18 17:45:02 crc kubenswrapper[4939]: I0318 17:45:02.284812 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" event={"ID":"ba3a9e8d-1669-476a-9372-3575f9bfa4d7","Type":"ContainerStarted","Data":"24efb4af25bb3b1693657a8ebfac79084d512c7ed3e336aa6b66cc431ccbdee4"} Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.714634 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.827402 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-secret-volume\") pod \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.827543 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7szq\" (UniqueName: \"kubernetes.io/projected/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-kube-api-access-l7szq\") pod \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.827576 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-config-volume\") pod \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\" (UID: \"ba3a9e8d-1669-476a-9372-3575f9bfa4d7\") " Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.828429 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba3a9e8d-1669-476a-9372-3575f9bfa4d7" (UID: "ba3a9e8d-1669-476a-9372-3575f9bfa4d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.832947 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-kube-api-access-l7szq" (OuterVolumeSpecName: "kube-api-access-l7szq") pod "ba3a9e8d-1669-476a-9372-3575f9bfa4d7" (UID: "ba3a9e8d-1669-476a-9372-3575f9bfa4d7"). InnerVolumeSpecName "kube-api-access-l7szq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.843413 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ba3a9e8d-1669-476a-9372-3575f9bfa4d7" (UID: "ba3a9e8d-1669-476a-9372-3575f9bfa4d7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.930604 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.930664 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7szq\" (UniqueName: \"kubernetes.io/projected/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-kube-api-access-l7szq\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:03 crc kubenswrapper[4939]: I0318 17:45:03.930679 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba3a9e8d-1669-476a-9372-3575f9bfa4d7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:04 crc kubenswrapper[4939]: I0318 17:45:04.303417 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" event={"ID":"ba3a9e8d-1669-476a-9372-3575f9bfa4d7","Type":"ContainerDied","Data":"24efb4af25bb3b1693657a8ebfac79084d512c7ed3e336aa6b66cc431ccbdee4"} Mar 18 17:45:04 crc kubenswrapper[4939]: I0318 17:45:04.303878 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24efb4af25bb3b1693657a8ebfac79084d512c7ed3e336aa6b66cc431ccbdee4" Mar 18 17:45:04 crc kubenswrapper[4939]: I0318 17:45:04.303491 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7" Mar 18 17:45:04 crc kubenswrapper[4939]: I0318 17:45:04.824720 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558"] Mar 18 17:45:04 crc kubenswrapper[4939]: I0318 17:45:04.838059 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-gp558"] Mar 18 17:45:06 crc kubenswrapper[4939]: I0318 17:45:06.149911 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1592467f-5d46-42b1-afe7-a9173a8f4de5" path="/var/lib/kubelet/pods/1592467f-5d46-42b1-afe7-a9173a8f4de5/volumes" Mar 18 17:45:07 crc kubenswrapper[4939]: I0318 17:45:07.134065 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:45:08 crc kubenswrapper[4939]: I0318 17:45:08.350711 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"19940e160daf70da9700a3171ce544b68755543ac79e38da0645bfa539c93343"} Mar 18 17:45:08 crc kubenswrapper[4939]: I0318 17:45:08.354520 4939 generic.go:334] "Generic (PLEG): container finished" podID="89d16573-9f34-4527-9799-f23dd1e42898" containerID="97e41a252805474313a54c0c6b2df07b8189ade4b02ce3d8f911efd16f3df71b" exitCode=0 Mar 18 17:45:08 crc kubenswrapper[4939]: I0318 17:45:08.354549 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-64x5r" event={"ID":"89d16573-9f34-4527-9799-f23dd1e42898","Type":"ContainerDied","Data":"97e41a252805474313a54c0c6b2df07b8189ade4b02ce3d8f911efd16f3df71b"} Mar 18 17:45:09 crc kubenswrapper[4939]: I0318 17:45:09.856326 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:45:09 crc kubenswrapper[4939]: I0318 17:45:09.975720 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn4z9\" (UniqueName: \"kubernetes.io/projected/89d16573-9f34-4527-9799-f23dd1e42898-kube-api-access-nn4z9\") pod \"89d16573-9f34-4527-9799-f23dd1e42898\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " Mar 18 17:45:09 crc kubenswrapper[4939]: I0318 17:45:09.976170 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ssh-key-openstack-cell1\") pod \"89d16573-9f34-4527-9799-f23dd1e42898\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " Mar 18 17:45:09 crc kubenswrapper[4939]: I0318 17:45:09.976368 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ceph\") pod \"89d16573-9f34-4527-9799-f23dd1e42898\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " Mar 18 17:45:09 crc kubenswrapper[4939]: I0318 17:45:09.976403 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-inventory-0\") pod \"89d16573-9f34-4527-9799-f23dd1e42898\" (UID: \"89d16573-9f34-4527-9799-f23dd1e42898\") " Mar 18 17:45:09 crc kubenswrapper[4939]: I0318 17:45:09.982022 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d16573-9f34-4527-9799-f23dd1e42898-kube-api-access-nn4z9" (OuterVolumeSpecName: "kube-api-access-nn4z9") pod "89d16573-9f34-4527-9799-f23dd1e42898" (UID: "89d16573-9f34-4527-9799-f23dd1e42898"). InnerVolumeSpecName "kube-api-access-nn4z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:45:09 crc kubenswrapper[4939]: I0318 17:45:09.982733 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ceph" (OuterVolumeSpecName: "ceph") pod "89d16573-9f34-4527-9799-f23dd1e42898" (UID: "89d16573-9f34-4527-9799-f23dd1e42898"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.004737 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "89d16573-9f34-4527-9799-f23dd1e42898" (UID: "89d16573-9f34-4527-9799-f23dd1e42898"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.011029 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "89d16573-9f34-4527-9799-f23dd1e42898" (UID: "89d16573-9f34-4527-9799-f23dd1e42898"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.079135 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn4z9\" (UniqueName: \"kubernetes.io/projected/89d16573-9f34-4527-9799-f23dd1e42898-kube-api-access-nn4z9\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.079189 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.079199 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.079209 4939 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/89d16573-9f34-4527-9799-f23dd1e42898-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.378923 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-64x5r" event={"ID":"89d16573-9f34-4527-9799-f23dd1e42898","Type":"ContainerDied","Data":"5c657f1008103bebff3c11618884207adf6694fe439212edeb83f5a8de095c59"} Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.378973 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c657f1008103bebff3c11618884207adf6694fe439212edeb83f5a8de095c59" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.379039 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-64x5r" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.482774 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-8s4r2"] Mar 18 17:45:10 crc kubenswrapper[4939]: E0318 17:45:10.483188 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d16573-9f34-4527-9799-f23dd1e42898" containerName="ssh-known-hosts-openstack" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.483203 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d16573-9f34-4527-9799-f23dd1e42898" containerName="ssh-known-hosts-openstack" Mar 18 17:45:10 crc kubenswrapper[4939]: E0318 17:45:10.483224 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3a9e8d-1669-476a-9372-3575f9bfa4d7" containerName="collect-profiles" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.483230 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3a9e8d-1669-476a-9372-3575f9bfa4d7" containerName="collect-profiles" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.483431 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3a9e8d-1669-476a-9372-3575f9bfa4d7" containerName="collect-profiles" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.483453 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d16573-9f34-4527-9799-f23dd1e42898" containerName="ssh-known-hosts-openstack" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.484237 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.487110 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.487319 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.487436 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.490408 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.493992 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-8s4r2"] Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.591756 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ceph\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.591884 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-inventory\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.591926 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.591990 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6wk\" (UniqueName: \"kubernetes.io/projected/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-kube-api-access-hm6wk\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.694005 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ceph\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.694515 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-inventory\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.694561 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.694616 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm6wk\" (UniqueName: \"kubernetes.io/projected/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-kube-api-access-hm6wk\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.699490 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-inventory\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.708872 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.713365 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ceph\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.715600 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6wk\" (UniqueName: \"kubernetes.io/projected/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-kube-api-access-hm6wk\") pod \"run-os-openstack-openstack-cell1-8s4r2\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.786820 4939 scope.go:117] "RemoveContainer" containerID="7b54f82f7763ad1e4ea85883a44a6a499162fee9458eeddfd28a47840b7b1e2e" Mar 18 17:45:10 crc kubenswrapper[4939]: I0318 17:45:10.806737 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:11 crc kubenswrapper[4939]: I0318 17:45:11.417093 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-8s4r2"] Mar 18 17:45:11 crc kubenswrapper[4939]: W0318 17:45:11.422732 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f5794a5_e3f0_4ef0_96e7_714f95fcd1c7.slice/crio-b3ab5f6cb845c36ad7c1913ef099bd513e3b5b4573a05a503ff5028c2c6f6ba8 WatchSource:0}: Error finding container b3ab5f6cb845c36ad7c1913ef099bd513e3b5b4573a05a503ff5028c2c6f6ba8: Status 404 returned error can't find the container with id b3ab5f6cb845c36ad7c1913ef099bd513e3b5b4573a05a503ff5028c2c6f6ba8 Mar 18 17:45:12 crc kubenswrapper[4939]: I0318 17:45:12.400194 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" event={"ID":"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7","Type":"ContainerStarted","Data":"c8d0c8d95d01b87d3051928ab079a93ffdc30c5fe74f9eeab794dbe9fccfca37"} Mar 18 17:45:12 crc kubenswrapper[4939]: I0318 17:45:12.400721 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" event={"ID":"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7","Type":"ContainerStarted","Data":"b3ab5f6cb845c36ad7c1913ef099bd513e3b5b4573a05a503ff5028c2c6f6ba8"} Mar 18 17:45:12 crc kubenswrapper[4939]: I0318 17:45:12.418641 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" podStartSLOduration=2.226825706 podStartE2EDuration="2.418622679s" podCreationTimestamp="2026-03-18 17:45:10 +0000 UTC" firstStartedPulling="2026-03-18 17:45:11.426719907 +0000 UTC m=+7676.025907528" lastFinishedPulling="2026-03-18 17:45:11.61851688 +0000 UTC m=+7676.217704501" observedRunningTime="2026-03-18 17:45:12.414772989 +0000 UTC m=+7677.013960620" watchObservedRunningTime="2026-03-18 17:45:12.418622679 +0000 UTC m=+7677.017810300" Mar 18 17:45:19 crc kubenswrapper[4939]: I0318 17:45:19.472309 4939 generic.go:334] "Generic (PLEG): container finished" podID="0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" containerID="c8d0c8d95d01b87d3051928ab079a93ffdc30c5fe74f9eeab794dbe9fccfca37" exitCode=0 Mar 18 17:45:19 crc kubenswrapper[4939]: I0318 17:45:19.472376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" event={"ID":"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7","Type":"ContainerDied","Data":"c8d0c8d95d01b87d3051928ab079a93ffdc30c5fe74f9eeab794dbe9fccfca37"} Mar 18 17:45:20 crc kubenswrapper[4939]: I0318 17:45:20.938007 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.044017 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ceph\") pod \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.044115 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm6wk\" (UniqueName: \"kubernetes.io/projected/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-kube-api-access-hm6wk\") pod \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.044163 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-inventory\") pod \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.044282 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ssh-key-openstack-cell1\") pod \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\" (UID: \"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7\") " Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.050943 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-kube-api-access-hm6wk" (OuterVolumeSpecName: "kube-api-access-hm6wk") pod "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" (UID: "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7"). InnerVolumeSpecName "kube-api-access-hm6wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.051045 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ceph" (OuterVolumeSpecName: "ceph") pod "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" (UID: "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.074989 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-inventory" (OuterVolumeSpecName: "inventory") pod "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" (UID: "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.085025 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" (UID: "0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.147125 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.147168 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm6wk\" (UniqueName: \"kubernetes.io/projected/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-kube-api-access-hm6wk\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.147180 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.147190 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.499438 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" event={"ID":"0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7","Type":"ContainerDied","Data":"b3ab5f6cb845c36ad7c1913ef099bd513e3b5b4573a05a503ff5028c2c6f6ba8"} Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.499488 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3ab5f6cb845c36ad7c1913ef099bd513e3b5b4573a05a503ff5028c2c6f6ba8" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.499570 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8s4r2" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.618178 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-pc2wp"] Mar 18 17:45:21 crc kubenswrapper[4939]: E0318 17:45:21.619274 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" containerName="run-os-openstack-openstack-cell1" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.619296 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" containerName="run-os-openstack-openstack-cell1" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.619767 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7" containerName="run-os-openstack-openstack-cell1" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.621214 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.624792 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.625003 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.625028 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.625196 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.639980 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-pc2wp"] Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.776279 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.776394 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-inventory\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.776635 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ceph\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.777030 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkrkq\" (UniqueName: \"kubernetes.io/projected/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-kube-api-access-lkrkq\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.879406 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.879491 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-inventory\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc 
kubenswrapper[4939]: I0318 17:45:21.879601 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ceph\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.879724 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkrkq\" (UniqueName: \"kubernetes.io/projected/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-kube-api-access-lkrkq\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.883813 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.883832 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ceph\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.885250 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-inventory\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.896696 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkrkq\" (UniqueName: \"kubernetes.io/projected/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-kube-api-access-lkrkq\") pod \"reboot-os-openstack-openstack-cell1-pc2wp\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:21 crc kubenswrapper[4939]: I0318 17:45:21.948598 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:22 crc kubenswrapper[4939]: I0318 17:45:22.574361 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-pc2wp"] Mar 18 17:45:22 crc kubenswrapper[4939]: W0318 17:45:22.574496 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad72a8b9_9736_4cc8_9e5a_1a99fa9c10ec.slice/crio-6f8cd0e04c947422829f0242b5e5e363cc6a413cc1f2094961553b366abbcfec WatchSource:0}: Error finding container 6f8cd0e04c947422829f0242b5e5e363cc6a413cc1f2094961553b366abbcfec: Status 404 returned error can't find the container with id 6f8cd0e04c947422829f0242b5e5e363cc6a413cc1f2094961553b366abbcfec Mar 18 17:45:23 crc kubenswrapper[4939]: I0318 17:45:23.521639 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" event={"ID":"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec","Type":"ContainerStarted","Data":"20897f09692c3df74f49e019c3efd0182430f6d6ea5efe6e3552689200a2edbb"} Mar 18 17:45:23 crc kubenswrapper[4939]: I0318 17:45:23.522091 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" event={"ID":"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec","Type":"ContainerStarted","Data":"6f8cd0e04c947422829f0242b5e5e363cc6a413cc1f2094961553b366abbcfec"} Mar 18 17:45:23 crc kubenswrapper[4939]: I0318 17:45:23.551821 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" podStartSLOduration=2.405851499 podStartE2EDuration="2.551796944s" podCreationTimestamp="2026-03-18 17:45:21 +0000 UTC" firstStartedPulling="2026-03-18 17:45:22.576714316 +0000 UTC m=+7687.175901937" lastFinishedPulling="2026-03-18 17:45:22.722659721 +0000 UTC m=+7687.321847382" observedRunningTime="2026-03-18 17:45:23.538651535 +0000 UTC m=+7688.137839176" watchObservedRunningTime="2026-03-18 17:45:23.551796944 +0000 UTC m=+7688.150984565" Mar 18 17:45:40 crc kubenswrapper[4939]: I0318 17:45:40.716449 4939 generic.go:334] "Generic (PLEG): container finished" podID="ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" containerID="20897f09692c3df74f49e019c3efd0182430f6d6ea5efe6e3552689200a2edbb" exitCode=0 Mar 18 17:45:40 crc kubenswrapper[4939]: I0318 17:45:40.717108 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" event={"ID":"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec","Type":"ContainerDied","Data":"20897f09692c3df74f49e019c3efd0182430f6d6ea5efe6e3552689200a2edbb"} Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.757866 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.768708 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" event={"ID":"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec","Type":"ContainerDied","Data":"6f8cd0e04c947422829f0242b5e5e363cc6a413cc1f2094961553b366abbcfec"} Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.768763 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8cd0e04c947422829f0242b5e5e363cc6a413cc1f2094961553b366abbcfec" Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.768770 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-pc2wp" Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.900333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ssh-key-openstack-cell1\") pod \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.901826 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkrkq\" (UniqueName: \"kubernetes.io/projected/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-kube-api-access-lkrkq\") pod \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.901932 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-inventory\") pod \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.902150 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ceph\") pod \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\" (UID: \"ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec\") " Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.907908 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ceph" (OuterVolumeSpecName: "ceph") pod "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" (UID: "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.909127 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-kube-api-access-lkrkq" (OuterVolumeSpecName: "kube-api-access-lkrkq") pod "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" (UID: "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec"). InnerVolumeSpecName "kube-api-access-lkrkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.934878 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" (UID: "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:42 crc kubenswrapper[4939]: I0318 17:45:42.935166 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-inventory" (OuterVolumeSpecName: "inventory") pod "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" (UID: "ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.005724 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkrkq\" (UniqueName: \"kubernetes.io/projected/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-kube-api-access-lkrkq\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.005893 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.005981 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.006033 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.934292 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-49s8g"] Mar 18 17:45:43 crc kubenswrapper[4939]: E0318 17:45:43.935801 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" containerName="reboot-os-openstack-openstack-cell1" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.935831 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" containerName="reboot-os-openstack-openstack-cell1" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.936134 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec" containerName="reboot-os-openstack-openstack-cell1" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.937517 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.940199 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.943889 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.944353 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.945979 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-49s8g"] Mar 18 17:45:43 crc kubenswrapper[4939]: I0318 17:45:43.946886 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.027622 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.027689 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.027795 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.027845 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.027888 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-inventory\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.027931 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ceph\") pod 
\"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.027957 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.028000 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.028052 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.028091 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtdv5\" (UniqueName: \"kubernetes.io/projected/d9729542-303e-47cd-b312-6e3bc1cd7b51-kube-api-access-mtdv5\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.028127 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.028417 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130541 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130607 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130667 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130703 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtdv5\" (UniqueName: \"kubernetes.io/projected/d9729542-303e-47cd-b312-6e3bc1cd7b51-kube-api-access-mtdv5\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130736 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130814 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130854 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130887 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.130966 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.131019 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.131058 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-inventory\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.131098 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ceph\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.137979 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.144196 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.146401 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.146559 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.146585 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.146901 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.146970 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.147128 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.147925 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-inventory\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.148275 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ceph\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.153955 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtdv5\" (UniqueName: \"kubernetes.io/projected/d9729542-303e-47cd-b312-6e3bc1cd7b51-kube-api-access-mtdv5\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.154121 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-49s8g\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.266416 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:45:44 crc kubenswrapper[4939]: I0318 17:45:44.823686 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-49s8g"] Mar 18 17:45:45 crc kubenswrapper[4939]: I0318 17:45:45.801561 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" event={"ID":"d9729542-303e-47cd-b312-6e3bc1cd7b51","Type":"ContainerStarted","Data":"4ac68a5e9d94321bfc2076f6060155d9d02457c12347090e3ef3ea5cef5b7d4d"} Mar 18 17:45:45 crc kubenswrapper[4939]: I0318 17:45:45.801881 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" event={"ID":"d9729542-303e-47cd-b312-6e3bc1cd7b51","Type":"ContainerStarted","Data":"96962ccf39576d5fb499be0b50956a36125f6d6a7ffa7655fb4e10a0d493b4e7"} Mar 18 17:45:45 crc kubenswrapper[4939]: I0318 17:45:45.836572 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" podStartSLOduration=2.68026246 podStartE2EDuration="2.836550307s" podCreationTimestamp="2026-03-18 17:45:43 +0000 UTC" firstStartedPulling="2026-03-18 17:45:44.828177481 +0000 UTC m=+7709.427365102" lastFinishedPulling="2026-03-18 17:45:44.984465318 +0000 UTC m=+7709.583652949" observedRunningTime="2026-03-18 17:45:45.828156355 +0000 UTC m=+7710.427344016" watchObservedRunningTime="2026-03-18 17:45:45.836550307 +0000 UTC m=+7710.435737948" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.149367 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564266-n6pzl"] Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.153409 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564266-n6pzl" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.156801 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.158247 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.161689 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.168312 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564266-n6pzl"] Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.203585 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7php\" (UniqueName: \"kubernetes.io/projected/cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531-kube-api-access-j7php\") pod \"auto-csr-approver-29564266-n6pzl\" (UID: \"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531\") " pod="openshift-infra/auto-csr-approver-29564266-n6pzl" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.306045 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7php\" (UniqueName: \"kubernetes.io/projected/cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531-kube-api-access-j7php\") pod \"auto-csr-approver-29564266-n6pzl\" (UID: \"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531\") " pod="openshift-infra/auto-csr-approver-29564266-n6pzl" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.337636 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7php\" (UniqueName: \"kubernetes.io/projected/cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531-kube-api-access-j7php\") pod \"auto-csr-approver-29564266-n6pzl\" (UID: \"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531\") " pod="openshift-infra/auto-csr-approver-29564266-n6pzl" Mar 18 17:46:00 crc kubenswrapper[4939]: I0318 17:46:00.484143 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564266-n6pzl" Mar 18 17:46:01 crc kubenswrapper[4939]: I0318 17:46:01.028664 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564266-n6pzl"] Mar 18 17:46:01 crc kubenswrapper[4939]: I0318 17:46:01.969354 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564266-n6pzl" event={"ID":"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531","Type":"ContainerStarted","Data":"368e239986b441f2d1a403dbf2b57f7ed072bb077f727e62468b91e291698176"} Mar 18 17:46:02 crc kubenswrapper[4939]: I0318 17:46:02.983591 4939 generic.go:334] "Generic (PLEG): container finished" podID="cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531" containerID="92dcf0dcef797f1e709f39ae9d28139aae39464bbbf4a847f1297d28abbf8300" exitCode=0 Mar 18 17:46:02 crc kubenswrapper[4939]: I0318 17:46:02.983711 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564266-n6pzl" event={"ID":"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531","Type":"ContainerDied","Data":"92dcf0dcef797f1e709f39ae9d28139aae39464bbbf4a847f1297d28abbf8300"} Mar 18 17:46:03 crc kubenswrapper[4939]: I0318 17:46:03.996050 4939 generic.go:334] "Generic (PLEG): container finished" podID="d9729542-303e-47cd-b312-6e3bc1cd7b51" containerID="4ac68a5e9d94321bfc2076f6060155d9d02457c12347090e3ef3ea5cef5b7d4d" exitCode=0 Mar 18 17:46:03 crc kubenswrapper[4939]: I0318 17:46:03.996129 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" event={"ID":"d9729542-303e-47cd-b312-6e3bc1cd7b51","Type":"ContainerDied","Data":"4ac68a5e9d94321bfc2076f6060155d9d02457c12347090e3ef3ea5cef5b7d4d"} Mar 18 17:46:04 crc kubenswrapper[4939]: I0318 17:46:04.367943 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564266-n6pzl" Mar 18 17:46:04 crc kubenswrapper[4939]: I0318 17:46:04.507803 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7php\" (UniqueName: \"kubernetes.io/projected/cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531-kube-api-access-j7php\") pod \"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531\" (UID: \"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531\") " Mar 18 17:46:04 crc kubenswrapper[4939]: I0318 17:46:04.513666 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531-kube-api-access-j7php" (OuterVolumeSpecName: "kube-api-access-j7php") pod "cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531" (UID: "cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531"). InnerVolumeSpecName "kube-api-access-j7php". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:46:04 crc kubenswrapper[4939]: I0318 17:46:04.610567 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7php\" (UniqueName: \"kubernetes.io/projected/cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531-kube-api-access-j7php\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.007969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564266-n6pzl" event={"ID":"cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531","Type":"ContainerDied","Data":"368e239986b441f2d1a403dbf2b57f7ed072bb077f727e62468b91e291698176"} Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.008030 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="368e239986b441f2d1a403dbf2b57f7ed072bb077f727e62468b91e291698176" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.008109 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564266-n6pzl" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.433913 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564260-6jszz"] Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.444239 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564260-6jszz"] Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.477943 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.525975 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-nova-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526130 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ssh-key-openstack-cell1\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526157 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-dhcp-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526212 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ceph\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526233 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-bootstrap-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526258 4939 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-telemetry-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526275 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ovn-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526292 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-metadata-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526354 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-inventory\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526414 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-libvirt-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526447 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-sriov-combined-ca-bundle\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.526481 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtdv5\" (UniqueName: \"kubernetes.io/projected/d9729542-303e-47cd-b312-6e3bc1cd7b51-kube-api-access-mtdv5\") pod \"d9729542-303e-47cd-b312-6e3bc1cd7b51\" (UID: \"d9729542-303e-47cd-b312-6e3bc1cd7b51\") " Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.531535 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.531600 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9729542-303e-47cd-b312-6e3bc1cd7b51-kube-api-access-mtdv5" (OuterVolumeSpecName: "kube-api-access-mtdv5") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "kube-api-access-mtdv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.531687 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.531918 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ceph" (OuterVolumeSpecName: "ceph") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.532226 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.533636 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.534920 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.534979 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.540145 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.553104 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.571515 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.580270 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-inventory" (OuterVolumeSpecName: "inventory") pod "d9729542-303e-47cd-b312-6e3bc1cd7b51" (UID: "d9729542-303e-47cd-b312-6e3bc1cd7b51"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630126 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630431 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtdv5\" (UniqueName: \"kubernetes.io/projected/d9729542-303e-47cd-b312-6e3bc1cd7b51-kube-api-access-mtdv5\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630447 4939 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630459 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630473 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630485 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630535 4939 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630550 4939 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630564 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630575 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630586 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:05 crc kubenswrapper[4939]: I0318 17:46:05.630629 4939 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9729542-303e-47cd-b312-6e3bc1cd7b51-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.018406 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" event={"ID":"d9729542-303e-47cd-b312-6e3bc1cd7b51","Type":"ContainerDied","Data":"96962ccf39576d5fb499be0b50956a36125f6d6a7ffa7655fb4e10a0d493b4e7"} Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.018458 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96962ccf39576d5fb499be0b50956a36125f6d6a7ffa7655fb4e10a0d493b4e7" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.018472 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-49s8g" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.144575 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43ce6b1-fa53-4c37-b049-2057fcc2542f" path="/var/lib/kubelet/pods/e43ce6b1-fa53-4c37-b049-2057fcc2542f/volumes" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.207663 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-2fzgn"] Mar 18 17:46:06 crc kubenswrapper[4939]: E0318 17:46:06.208129 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9729542-303e-47cd-b312-6e3bc1cd7b51" containerName="install-certs-openstack-openstack-cell1" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.208146 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9729542-303e-47cd-b312-6e3bc1cd7b51" containerName="install-certs-openstack-openstack-cell1" Mar 18 17:46:06 crc kubenswrapper[4939]: E0318 17:46:06.208166 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531" containerName="oc" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.208175 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531" containerName="oc" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.208388 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531" containerName="oc" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.208405 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9729542-303e-47cd-b312-6e3bc1cd7b51" containerName="install-certs-openstack-openstack-cell1" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.209153 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.211127 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.211148 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.211356 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.212986 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.222180 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-2fzgn"] Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.345474 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ceph\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.345704 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4xh\" (UniqueName: \"kubernetes.io/projected/371f7f00-12c5-423a-aa1b-a6c713d52961-kube-api-access-7h4xh\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.346082 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.346285 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-inventory\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.448254 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4xh\" (UniqueName: \"kubernetes.io/projected/371f7f00-12c5-423a-aa1b-a6c713d52961-kube-api-access-7h4xh\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.448461 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.448693 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-inventory\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.448782 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ceph\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.454251 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.456163 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-inventory\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.456415 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ceph\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.484653 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4xh\" (UniqueName: \"kubernetes.io/projected/371f7f00-12c5-423a-aa1b-a6c713d52961-kube-api-access-7h4xh\") pod \"ceph-client-openstack-openstack-cell1-2fzgn\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:06 crc kubenswrapper[4939]: I0318 17:46:06.529529 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:07 crc kubenswrapper[4939]: I0318 17:46:07.081360 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-2fzgn"] Mar 18 17:46:07 crc kubenswrapper[4939]: W0318 17:46:07.082407 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod371f7f00_12c5_423a_aa1b_a6c713d52961.slice/crio-1e7b92165b156c210f0aaecfcd916192f199da9b2d6f45a49dba10227c0aba42 WatchSource:0}: Error finding container 1e7b92165b156c210f0aaecfcd916192f199da9b2d6f45a49dba10227c0aba42: Status 404 returned error can't find the container with id 1e7b92165b156c210f0aaecfcd916192f199da9b2d6f45a49dba10227c0aba42 Mar 18 17:46:08 crc kubenswrapper[4939]: I0318 17:46:08.037286 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" event={"ID":"371f7f00-12c5-423a-aa1b-a6c713d52961","Type":"ContainerStarted","Data":"9a62c6bf8393cd2bf2faeddde56f5b3d59d57b451c66465f12059c11ae23394d"} Mar 18 17:46:08 crc kubenswrapper[4939]: I0318 17:46:08.037994 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" event={"ID":"371f7f00-12c5-423a-aa1b-a6c713d52961","Type":"ContainerStarted","Data":"1e7b92165b156c210f0aaecfcd916192f199da9b2d6f45a49dba10227c0aba42"} Mar 18 17:46:08 crc kubenswrapper[4939]: I0318 17:46:08.056050 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" podStartSLOduration=1.886448946 podStartE2EDuration="2.056030888s" podCreationTimestamp="2026-03-18 17:46:06 +0000 UTC" firstStartedPulling="2026-03-18 17:46:07.084956744 +0000 UTC m=+7731.684144365" lastFinishedPulling="2026-03-18 17:46:07.254538686 +0000 UTC m=+7731.853726307" observedRunningTime="2026-03-18 17:46:08.052726145 +0000 UTC m=+7732.651913776" watchObservedRunningTime="2026-03-18 17:46:08.056030888 +0000 UTC m=+7732.655218509" Mar 18 17:46:10 crc kubenswrapper[4939]: I0318 17:46:10.869586 4939 scope.go:117] "RemoveContainer" containerID="0a462c6e4ca3a134fa034fe2473fa84036804ec4678b3f7ec873fcfe1407905a" Mar 18 17:46:13 crc kubenswrapper[4939]: I0318 17:46:13.089455 4939 generic.go:334] "Generic (PLEG): container finished" podID="371f7f00-12c5-423a-aa1b-a6c713d52961" containerID="9a62c6bf8393cd2bf2faeddde56f5b3d59d57b451c66465f12059c11ae23394d" exitCode=0 Mar 18 17:46:13 crc kubenswrapper[4939]: I0318 17:46:13.090976 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" event={"ID":"371f7f00-12c5-423a-aa1b-a6c713d52961","Type":"ContainerDied","Data":"9a62c6bf8393cd2bf2faeddde56f5b3d59d57b451c66465f12059c11ae23394d"} Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.602170 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.659922 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ssh-key-openstack-cell1\") pod \"371f7f00-12c5-423a-aa1b-a6c713d52961\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.660008 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-inventory\") pod \"371f7f00-12c5-423a-aa1b-a6c713d52961\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.660087 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h4xh\" (UniqueName: \"kubernetes.io/projected/371f7f00-12c5-423a-aa1b-a6c713d52961-kube-api-access-7h4xh\") pod \"371f7f00-12c5-423a-aa1b-a6c713d52961\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.660177 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ceph\") pod \"371f7f00-12c5-423a-aa1b-a6c713d52961\" (UID: \"371f7f00-12c5-423a-aa1b-a6c713d52961\") " Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.668693 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ceph" (OuterVolumeSpecName: "ceph") pod "371f7f00-12c5-423a-aa1b-a6c713d52961" (UID: "371f7f00-12c5-423a-aa1b-a6c713d52961"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.678925 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/371f7f00-12c5-423a-aa1b-a6c713d52961-kube-api-access-7h4xh" (OuterVolumeSpecName: "kube-api-access-7h4xh") pod "371f7f00-12c5-423a-aa1b-a6c713d52961" (UID: "371f7f00-12c5-423a-aa1b-a6c713d52961"). InnerVolumeSpecName "kube-api-access-7h4xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.700622 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "371f7f00-12c5-423a-aa1b-a6c713d52961" (UID: "371f7f00-12c5-423a-aa1b-a6c713d52961"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.706960 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-inventory" (OuterVolumeSpecName: "inventory") pod "371f7f00-12c5-423a-aa1b-a6c713d52961" (UID: "371f7f00-12c5-423a-aa1b-a6c713d52961"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.763202 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.763238 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.763331 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/371f7f00-12c5-423a-aa1b-a6c713d52961-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:14 crc kubenswrapper[4939]: I0318 17:46:14.763371 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h4xh\" (UniqueName: \"kubernetes.io/projected/371f7f00-12c5-423a-aa1b-a6c713d52961-kube-api-access-7h4xh\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.118977 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" event={"ID":"371f7f00-12c5-423a-aa1b-a6c713d52961","Type":"ContainerDied","Data":"1e7b92165b156c210f0aaecfcd916192f199da9b2d6f45a49dba10227c0aba42"} Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.119218 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e7b92165b156c210f0aaecfcd916192f199da9b2d6f45a49dba10227c0aba42" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.119038 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2fzgn" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.226681 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-hpml8"] Mar 18 17:46:15 crc kubenswrapper[4939]: E0318 17:46:15.227321 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="371f7f00-12c5-423a-aa1b-a6c713d52961" containerName="ceph-client-openstack-openstack-cell1" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.227350 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="371f7f00-12c5-423a-aa1b-a6c713d52961" containerName="ceph-client-openstack-openstack-cell1" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.227682 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="371f7f00-12c5-423a-aa1b-a6c713d52961" containerName="ceph-client-openstack-openstack-cell1" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.228910 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.232462 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.232960 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.232980 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.233036 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.233856 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.251219 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-hpml8"] Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.281380 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.281895 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.281995 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ceph\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.282168 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.282368 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhwx8\" (UniqueName: \"kubernetes.io/projected/6a1b5c03-995f-4ca7-bec5-622b83855a6c-kube-api-access-fhwx8\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.282514 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-inventory\") pod 
\"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.384354 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-inventory\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.384435 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.384497 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.384533 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ceph\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.384596 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.384667 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhwx8\" (UniqueName: \"kubernetes.io/projected/6a1b5c03-995f-4ca7-bec5-622b83855a6c-kube-api-access-fhwx8\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.385810 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.396099 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.397076 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-inventory\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.397464 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ceph\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.397924 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.405198 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhwx8\" (UniqueName: \"kubernetes.io/projected/6a1b5c03-995f-4ca7-bec5-622b83855a6c-kube-api-access-fhwx8\") pod \"ovn-openstack-openstack-cell1-hpml8\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:15 crc kubenswrapper[4939]: I0318 17:46:15.588158 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:46:16 crc kubenswrapper[4939]: I0318 17:46:16.128599 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-hpml8"] Mar 18 17:46:16 crc kubenswrapper[4939]: I0318 17:46:16.335820 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:46:17 crc kubenswrapper[4939]: I0318 17:46:17.143464 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hpml8" event={"ID":"6a1b5c03-995f-4ca7-bec5-622b83855a6c","Type":"ContainerStarted","Data":"495aa79e3d3e22d87059de0deaadbcf5fd20b1bdf04b73d3cdd865a259c56c97"} Mar 18 17:46:17 crc kubenswrapper[4939]: I0318 17:46:17.144004 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hpml8" event={"ID":"6a1b5c03-995f-4ca7-bec5-622b83855a6c","Type":"ContainerStarted","Data":"9338e9445df5cb159fb5aa377c530a5e216698e2bb5afe821e3a7ca5a0777a1b"} Mar 18 17:46:17 crc kubenswrapper[4939]: I0318 17:46:17.167891 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-hpml8" podStartSLOduration=1.970977607 podStartE2EDuration="2.167870229s" podCreationTimestamp="2026-03-18 17:46:15 +0000 UTC" firstStartedPulling="2026-03-18 17:46:16.135243618 +0000 UTC m=+7740.734431259" lastFinishedPulling="2026-03-18 17:46:16.33213626 +0000 UTC m=+7740.931323881" observedRunningTime="2026-03-18 17:46:17.160243751 +0000 UTC m=+7741.759431372" watchObservedRunningTime="2026-03-18 17:46:17.167870229 +0000 UTC m=+7741.767057850" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.146694 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5gxp2"] Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.150073 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.165851 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gxp2"] Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.260674 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8qrs\" (UniqueName: \"kubernetes.io/projected/97068ab5-7161-435c-9bf6-44455e61f066-kube-api-access-c8qrs\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.261549 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-utilities\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.262120 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-catalog-content\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.364568 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8qrs\" (UniqueName: \"kubernetes.io/projected/97068ab5-7161-435c-9bf6-44455e61f066-kube-api-access-c8qrs\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.364755 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-utilities\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.364815 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-catalog-content\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.365370 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-catalog-content\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.365415 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-utilities\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.382624 4939 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c8qrs\" (UniqueName: \"kubernetes.io/projected/97068ab5-7161-435c-9bf6-44455e61f066-kube-api-access-c8qrs\") pod \"redhat-marketplace-5gxp2\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.493998 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:18 crc kubenswrapper[4939]: I0318 17:46:18.992652 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gxp2"] Mar 18 17:46:19 crc kubenswrapper[4939]: I0318 17:46:19.170352 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gxp2" event={"ID":"97068ab5-7161-435c-9bf6-44455e61f066","Type":"ContainerStarted","Data":"01e020373ce9b57672d86c4355eb3c95115a25366abc811c9b316f731aa23636"} Mar 18 17:46:20 crc kubenswrapper[4939]: I0318 17:46:20.183437 4939 generic.go:334] "Generic (PLEG): container finished" podID="97068ab5-7161-435c-9bf6-44455e61f066" containerID="feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31" exitCode=0 Mar 18 17:46:20 crc kubenswrapper[4939]: I0318 17:46:20.183514 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gxp2" event={"ID":"97068ab5-7161-435c-9bf6-44455e61f066","Type":"ContainerDied","Data":"feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31"} Mar 18 17:46:21 crc kubenswrapper[4939]: I0318 17:46:21.194744 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gxp2" event={"ID":"97068ab5-7161-435c-9bf6-44455e61f066","Type":"ContainerStarted","Data":"cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced"} Mar 18 17:46:22 crc kubenswrapper[4939]: I0318 17:46:22.207339 4939 generic.go:334] "Generic (PLEG): container finished" podID="97068ab5-7161-435c-9bf6-44455e61f066" containerID="cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced" exitCode=0 Mar 18 17:46:22 crc kubenswrapper[4939]: I0318 17:46:22.207388 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gxp2" event={"ID":"97068ab5-7161-435c-9bf6-44455e61f066","Type":"ContainerDied","Data":"cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced"} Mar 18 17:46:23 crc kubenswrapper[4939]: I0318 17:46:23.223161 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gxp2" event={"ID":"97068ab5-7161-435c-9bf6-44455e61f066","Type":"ContainerStarted","Data":"665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf"} Mar 18 17:46:23 crc kubenswrapper[4939]: I0318 17:46:23.249356 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5gxp2" podStartSLOduration=2.642520024 podStartE2EDuration="5.249334402s" podCreationTimestamp="2026-03-18 17:46:18 +0000 UTC" firstStartedPulling="2026-03-18 17:46:20.186380479 +0000 UTC m=+7744.785568100" lastFinishedPulling="2026-03-18 17:46:22.793194857 +0000 UTC m=+7747.392382478" observedRunningTime="2026-03-18 17:46:23.247945309 +0000 UTC m=+7747.847132940" watchObservedRunningTime="2026-03-18 17:46:23.249334402 +0000 UTC m=+7747.848522023" Mar 18 17:46:28 crc kubenswrapper[4939]: I0318 17:46:28.494753 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:28 crc kubenswrapper[4939]: I0318 17:46:28.495403 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:28 crc kubenswrapper[4939]: I0318 17:46:28.568990 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:29 crc kubenswrapper[4939]: I0318 17:46:29.340916 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:29 crc kubenswrapper[4939]: I0318 17:46:29.392750 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gxp2"] Mar 18 17:46:31 crc kubenswrapper[4939]: I0318 17:46:31.309390 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5gxp2" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="registry-server" containerID="cri-o://665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf" gracePeriod=2 Mar 18 17:46:31 crc kubenswrapper[4939]: I0318 17:46:31.856173 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:31 crc kubenswrapper[4939]: I0318 17:46:31.973190 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-utilities\") pod \"97068ab5-7161-435c-9bf6-44455e61f066\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " Mar 18 17:46:31 crc kubenswrapper[4939]: I0318 17:46:31.973379 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-catalog-content\") pod \"97068ab5-7161-435c-9bf6-44455e61f066\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " Mar 18 17:46:31 crc kubenswrapper[4939]: I0318 17:46:31.973645 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8qrs\" (UniqueName: \"kubernetes.io/projected/97068ab5-7161-435c-9bf6-44455e61f066-kube-api-access-c8qrs\") pod \"97068ab5-7161-435c-9bf6-44455e61f066\" (UID: \"97068ab5-7161-435c-9bf6-44455e61f066\") " Mar 18 17:46:31 crc kubenswrapper[4939]: I0318 17:46:31.974441 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-utilities" (OuterVolumeSpecName: "utilities") pod "97068ab5-7161-435c-9bf6-44455e61f066" (UID: "97068ab5-7161-435c-9bf6-44455e61f066"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:46:31 crc kubenswrapper[4939]: I0318 17:46:31.980389 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97068ab5-7161-435c-9bf6-44455e61f066-kube-api-access-c8qrs" (OuterVolumeSpecName: "kube-api-access-c8qrs") pod "97068ab5-7161-435c-9bf6-44455e61f066" (UID: "97068ab5-7161-435c-9bf6-44455e61f066"). InnerVolumeSpecName "kube-api-access-c8qrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.009194 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97068ab5-7161-435c-9bf6-44455e61f066" (UID: "97068ab5-7161-435c-9bf6-44455e61f066"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.076155 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.076208 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8qrs\" (UniqueName: \"kubernetes.io/projected/97068ab5-7161-435c-9bf6-44455e61f066-kube-api-access-c8qrs\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.076228 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97068ab5-7161-435c-9bf6-44455e61f066-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.319973 4939 generic.go:334] "Generic (PLEG): container finished" podID="97068ab5-7161-435c-9bf6-44455e61f066" containerID="665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf" exitCode=0 Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.320060 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5gxp2" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.320048 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gxp2" event={"ID":"97068ab5-7161-435c-9bf6-44455e61f066","Type":"ContainerDied","Data":"665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf"} Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.320325 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5gxp2" event={"ID":"97068ab5-7161-435c-9bf6-44455e61f066","Type":"ContainerDied","Data":"01e020373ce9b57672d86c4355eb3c95115a25366abc811c9b316f731aa23636"} Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.320355 4939 scope.go:117] "RemoveContainer" containerID="665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.354245 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gxp2"] Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.355874 4939 scope.go:117] "RemoveContainer" containerID="cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.366384 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5gxp2"] Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.376800 4939 scope.go:117] "RemoveContainer" containerID="feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.431552 4939 scope.go:117] "RemoveContainer" containerID="665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf" Mar 18 17:46:32 crc kubenswrapper[4939]: E0318 17:46:32.431913 4939 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf\": container with ID starting with 665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf not found: ID does not exist" containerID="665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.431956 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf"} err="failed to get container status \"665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf\": rpc error: code = NotFound desc = could not find container \"665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf\": container with ID starting with 665bad284d9dcb67e2f7879587408e82f81db4843ea2f2f93fef5f7f514c11cf not found: ID does not exist" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.431984 4939 scope.go:117] "RemoveContainer" containerID="cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced" Mar 18 17:46:32 crc kubenswrapper[4939]: E0318 17:46:32.432341 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced\": container with ID starting with cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced not found: ID does not exist" containerID="cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.432386 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced"} err="failed to get container status \"cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced\": rpc error: code = NotFound desc = could not find container \"cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced\": container with ID starting with cf26ecf9dc6852645a65e944b24a58d7591b685fa432da3dfeb5bb3b34dc9ced not found: ID does not exist" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.432419 4939 scope.go:117] "RemoveContainer" containerID="feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31" Mar 18 17:46:32 crc kubenswrapper[4939]: E0318 17:46:32.432802 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31\": container with ID starting with feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31 not found: ID does not exist" containerID="feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31" Mar 18 17:46:32 crc kubenswrapper[4939]: I0318 17:46:32.432867 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31"} err="failed to get container status \"feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31\": rpc error: code = NotFound desc = could not find container \"feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31\": container with ID starting with feac0fd9621f55f168b0e722b299ec9d562370fdff36e4a42e68726744751f31 not found: ID does not exist" Mar 18 17:46:34 crc kubenswrapper[4939]: I0318 17:46:34.149534 4939 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="97068ab5-7161-435c-9bf6-44455e61f066" path="/var/lib/kubelet/pods/97068ab5-7161-435c-9bf6-44455e61f066/volumes" Mar 18 17:47:22 crc kubenswrapper[4939]: I0318 17:47:22.934292 4939 generic.go:334] "Generic (PLEG): container finished" podID="6a1b5c03-995f-4ca7-bec5-622b83855a6c" containerID="495aa79e3d3e22d87059de0deaadbcf5fd20b1bdf04b73d3cdd865a259c56c97" exitCode=0 Mar 18 17:47:22 crc kubenswrapper[4939]: I0318 17:47:22.934388 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hpml8" event={"ID":"6a1b5c03-995f-4ca7-bec5-622b83855a6c","Type":"ContainerDied","Data":"495aa79e3d3e22d87059de0deaadbcf5fd20b1bdf04b73d3cdd865a259c56c97"} Mar 18 17:47:23 crc kubenswrapper[4939]: I0318 17:47:23.687486 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:47:23 crc kubenswrapper[4939]: I0318 17:47:23.688001 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.479983 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.532076 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovncontroller-config-0\") pod \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.532144 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovn-combined-ca-bundle\") pod \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.532211 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhwx8\" (UniqueName: \"kubernetes.io/projected/6a1b5c03-995f-4ca7-bec5-622b83855a6c-kube-api-access-fhwx8\") pod \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.532316 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ceph\") pod \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.532494 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ssh-key-openstack-cell1\") pod \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.533253 4939 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-inventory\") pod \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\" (UID: \"6a1b5c03-995f-4ca7-bec5-622b83855a6c\") " Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.539063 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1b5c03-995f-4ca7-bec5-622b83855a6c-kube-api-access-fhwx8" (OuterVolumeSpecName: "kube-api-access-fhwx8") pod "6a1b5c03-995f-4ca7-bec5-622b83855a6c" (UID: "6a1b5c03-995f-4ca7-bec5-622b83855a6c"). InnerVolumeSpecName "kube-api-access-fhwx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.543625 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6a1b5c03-995f-4ca7-bec5-622b83855a6c" (UID: "6a1b5c03-995f-4ca7-bec5-622b83855a6c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.543677 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ceph" (OuterVolumeSpecName: "ceph") pod "6a1b5c03-995f-4ca7-bec5-622b83855a6c" (UID: "6a1b5c03-995f-4ca7-bec5-622b83855a6c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.565072 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6a1b5c03-995f-4ca7-bec5-622b83855a6c" (UID: "6a1b5c03-995f-4ca7-bec5-622b83855a6c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.567609 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-inventory" (OuterVolumeSpecName: "inventory") pod "6a1b5c03-995f-4ca7-bec5-622b83855a6c" (UID: "6a1b5c03-995f-4ca7-bec5-622b83855a6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.596589 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "6a1b5c03-995f-4ca7-bec5-622b83855a6c" (UID: "6a1b5c03-995f-4ca7-bec5-622b83855a6c"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.636653 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.636934 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.636962 4939 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.636985 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.637003 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhwx8\" (UniqueName: \"kubernetes.io/projected/6a1b5c03-995f-4ca7-bec5-622b83855a6c-kube-api-access-fhwx8\") on node \"crc\" DevicePath \"\"" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.637020 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a1b5c03-995f-4ca7-bec5-622b83855a6c-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.957698 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-hpml8" event={"ID":"6a1b5c03-995f-4ca7-bec5-622b83855a6c","Type":"ContainerDied","Data":"9338e9445df5cb159fb5aa377c530a5e216698e2bb5afe821e3a7ca5a0777a1b"} Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.957761 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-hpml8" Mar 18 17:47:24 crc kubenswrapper[4939]: I0318 17:47:24.957771 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9338e9445df5cb159fb5aa377c530a5e216698e2bb5afe821e3a7ca5a0777a1b" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.060421 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-26rfs"] Mar 18 17:47:25 crc kubenswrapper[4939]: E0318 17:47:25.061158 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="extract-content" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.061190 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="extract-content" Mar 18 17:47:25 crc kubenswrapper[4939]: E0318 17:47:25.061214 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="registry-server" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.061227 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="registry-server" Mar 18 17:47:25 crc kubenswrapper[4939]: E0318 17:47:25.061257 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1b5c03-995f-4ca7-bec5-622b83855a6c" containerName="ovn-openstack-openstack-cell1" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.061269 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1b5c03-995f-4ca7-bec5-622b83855a6c" containerName="ovn-openstack-openstack-cell1" Mar 18 17:47:25 crc kubenswrapper[4939]: E0318 17:47:25.061287 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="extract-utilities" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.061298 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="extract-utilities" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.061587 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="97068ab5-7161-435c-9bf6-44455e61f066" containerName="registry-server" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.061632 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1b5c03-995f-4ca7-bec5-622b83855a6c" containerName="ovn-openstack-openstack-cell1" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.062610 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.072406 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-26rfs"] Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.105076 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.105120 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.105144 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.105494 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.105763 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.107435 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.149287 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.149389 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.149565 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.149615 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjr62\" (UniqueName: \"kubernetes.io/projected/10576dfe-22f5-4937-bd11-149ca982c4f2-kube-api-access-rjr62\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.149754 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.149809 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.149849 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.252411 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.252549 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjr62\" (UniqueName: \"kubernetes.io/projected/10576dfe-22f5-4937-bd11-149ca982c4f2-kube-api-access-rjr62\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.252743 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.252903 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.253110 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.253157 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.253225 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.257639 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.258056 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.258121 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.258551 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.260481 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.260633 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.284134 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjr62\" (UniqueName: \"kubernetes.io/projected/10576dfe-22f5-4937-bd11-149ca982c4f2-kube-api-access-rjr62\") pod 
\"neutron-metadata-openstack-openstack-cell1-26rfs\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.418613 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:47:25 crc kubenswrapper[4939]: I0318 17:47:25.972592 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-26rfs"] Mar 18 17:47:26 crc kubenswrapper[4939]: I0318 17:47:26.991985 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" event={"ID":"10576dfe-22f5-4937-bd11-149ca982c4f2","Type":"ContainerStarted","Data":"8b59aa919c8b493e7e96924efb6186a6bb1f1875a805a957effcd7ba60b51292"} Mar 18 17:47:26 crc kubenswrapper[4939]: I0318 17:47:26.992485 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" event={"ID":"10576dfe-22f5-4937-bd11-149ca982c4f2","Type":"ContainerStarted","Data":"97e51595990793fd89d1215b2876f446ee244d0056c12a791af556941635f59d"} Mar 18 17:47:27 crc kubenswrapper[4939]: I0318 17:47:27.035345 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" podStartSLOduration=1.820052104 podStartE2EDuration="2.03531942s" podCreationTimestamp="2026-03-18 17:47:25 +0000 UTC" firstStartedPulling="2026-03-18 17:47:25.973947061 +0000 UTC m=+7810.573134702" lastFinishedPulling="2026-03-18 17:47:26.189214377 +0000 UTC m=+7810.788402018" observedRunningTime="2026-03-18 17:47:27.025118775 +0000 UTC m=+7811.624306446" watchObservedRunningTime="2026-03-18 17:47:27.03531942 +0000 UTC m=+7811.634507041" Mar 18 17:47:53 crc kubenswrapper[4939]: I0318 17:47:53.687312 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:47:53 crc kubenswrapper[4939]: I0318 17:47:53.687875 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.148665 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564268-n8x84"] Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.150398 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564268-n8x84" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.153817 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.153875 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.155211 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.161294 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564268-n8x84"] Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.261448 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd2r2\" (UniqueName: \"kubernetes.io/projected/99b5bd2a-4f3c-442e-8cce-29120aba1813-kube-api-access-qd2r2\") pod \"auto-csr-approver-29564268-n8x84\" (UID: \"99b5bd2a-4f3c-442e-8cce-29120aba1813\") " pod="openshift-infra/auto-csr-approver-29564268-n8x84" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.364167 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd2r2\" (UniqueName: \"kubernetes.io/projected/99b5bd2a-4f3c-442e-8cce-29120aba1813-kube-api-access-qd2r2\") pod \"auto-csr-approver-29564268-n8x84\" (UID: \"99b5bd2a-4f3c-442e-8cce-29120aba1813\") " pod="openshift-infra/auto-csr-approver-29564268-n8x84" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.389450 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd2r2\" (UniqueName: \"kubernetes.io/projected/99b5bd2a-4f3c-442e-8cce-29120aba1813-kube-api-access-qd2r2\") pod \"auto-csr-approver-29564268-n8x84\" (UID: \"99b5bd2a-4f3c-442e-8cce-29120aba1813\") " pod="openshift-infra/auto-csr-approver-29564268-n8x84" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.477730 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564268-n8x84" Mar 18 17:48:00 crc kubenswrapper[4939]: I0318 17:48:00.928170 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564268-n8x84"] Mar 18 17:48:01 crc kubenswrapper[4939]: I0318 17:48:01.370526 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564268-n8x84" event={"ID":"99b5bd2a-4f3c-442e-8cce-29120aba1813","Type":"ContainerStarted","Data":"d3a105b86750a8cf2e423613909cd8088351b98e7c236b77281f6bd0ffeecbaf"} Mar 18 17:48:02 crc kubenswrapper[4939]: I0318 17:48:02.389273 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564268-n8x84" event={"ID":"99b5bd2a-4f3c-442e-8cce-29120aba1813","Type":"ContainerStarted","Data":"dfba74c06f7d3fafe531bcaf7224e340bd2475eb8c4ac8ff59c491597914aa0f"} Mar 18 17:48:02 crc kubenswrapper[4939]: I0318 17:48:02.411726 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564268-n8x84" podStartSLOduration=1.451405592 podStartE2EDuration="2.411705545s" podCreationTimestamp="2026-03-18 17:48:00 +0000 UTC" firstStartedPulling="2026-03-18 17:48:00.93066033 +0000 UTC m=+7845.529847951" lastFinishedPulling="2026-03-18 17:48:01.890960283 +0000 UTC m=+7846.490147904" observedRunningTime="2026-03-18 17:48:02.403846319 +0000 UTC m=+7847.003033950" watchObservedRunningTime="2026-03-18 17:48:02.411705545 +0000 UTC m=+7847.010893186" Mar 18 17:48:03 crc kubenswrapper[4939]: I0318 17:48:03.407336 4939 generic.go:334] "Generic (PLEG): container finished" podID="99b5bd2a-4f3c-442e-8cce-29120aba1813" containerID="dfba74c06f7d3fafe531bcaf7224e340bd2475eb8c4ac8ff59c491597914aa0f" exitCode=0 Mar 18 17:48:03 crc kubenswrapper[4939]: I0318 17:48:03.407436 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564268-n8x84" event={"ID":"99b5bd2a-4f3c-442e-8cce-29120aba1813","Type":"ContainerDied","Data":"dfba74c06f7d3fafe531bcaf7224e340bd2475eb8c4ac8ff59c491597914aa0f"} Mar 18 17:48:04 crc kubenswrapper[4939]: I0318 17:48:04.784326 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564268-n8x84" Mar 18 17:48:04 crc kubenswrapper[4939]: I0318 17:48:04.868586 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd2r2\" (UniqueName: \"kubernetes.io/projected/99b5bd2a-4f3c-442e-8cce-29120aba1813-kube-api-access-qd2r2\") pod \"99b5bd2a-4f3c-442e-8cce-29120aba1813\" (UID: \"99b5bd2a-4f3c-442e-8cce-29120aba1813\") " Mar 18 17:48:04 crc kubenswrapper[4939]: I0318 17:48:04.874785 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b5bd2a-4f3c-442e-8cce-29120aba1813-kube-api-access-qd2r2" (OuterVolumeSpecName: "kube-api-access-qd2r2") pod "99b5bd2a-4f3c-442e-8cce-29120aba1813" (UID: "99b5bd2a-4f3c-442e-8cce-29120aba1813"). InnerVolumeSpecName "kube-api-access-qd2r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:48:04 crc kubenswrapper[4939]: I0318 17:48:04.971388 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd2r2\" (UniqueName: \"kubernetes.io/projected/99b5bd2a-4f3c-442e-8cce-29120aba1813-kube-api-access-qd2r2\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:05 crc kubenswrapper[4939]: I0318 17:48:05.438137 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564268-n8x84" event={"ID":"99b5bd2a-4f3c-442e-8cce-29120aba1813","Type":"ContainerDied","Data":"d3a105b86750a8cf2e423613909cd8088351b98e7c236b77281f6bd0ffeecbaf"} Mar 18 17:48:05 crc kubenswrapper[4939]: I0318 17:48:05.438183 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a105b86750a8cf2e423613909cd8088351b98e7c236b77281f6bd0ffeecbaf" Mar 18 17:48:05 crc kubenswrapper[4939]: I0318 17:48:05.438200 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564268-n8x84" Mar 18 17:48:05 crc kubenswrapper[4939]: I0318 17:48:05.482621 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564262-4z69m"] Mar 18 17:48:05 crc kubenswrapper[4939]: I0318 17:48:05.494902 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564262-4z69m"] Mar 18 17:48:06 crc kubenswrapper[4939]: I0318 17:48:06.161124 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="560019e0-f873-4da0-ac07-fdb41015f2eb" path="/var/lib/kubelet/pods/560019e0-f873-4da0-ac07-fdb41015f2eb/volumes" Mar 18 17:48:11 crc kubenswrapper[4939]: I0318 17:48:11.019549 4939 scope.go:117] "RemoveContainer" containerID="3ba8bd266ee21e740a289b27fc54d71b684cf2af3010455be2b89c6e203aceed" Mar 18 17:48:16 crc kubenswrapper[4939]: I0318 17:48:16.546717 4939 generic.go:334] "Generic (PLEG): container finished" podID="10576dfe-22f5-4937-bd11-149ca982c4f2" containerID="8b59aa919c8b493e7e96924efb6186a6bb1f1875a805a957effcd7ba60b51292" exitCode=0 Mar 18 17:48:16 crc kubenswrapper[4939]: I0318 17:48:16.546832 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" event={"ID":"10576dfe-22f5-4937-bd11-149ca982c4f2","Type":"ContainerDied","Data":"8b59aa919c8b493e7e96924efb6186a6bb1f1875a805a957effcd7ba60b51292"} Mar 18 17:48:17 crc kubenswrapper[4939]: I0318 17:48:17.970940 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.102685 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ssh-key-openstack-cell1\") pod \"10576dfe-22f5-4937-bd11-149ca982c4f2\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.102765 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-metadata-combined-ca-bundle\") pod \"10576dfe-22f5-4937-bd11-149ca982c4f2\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.102873 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-nova-metadata-neutron-config-0\") pod \"10576dfe-22f5-4937-bd11-149ca982c4f2\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.102960 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ceph\") pod \"10576dfe-22f5-4937-bd11-149ca982c4f2\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.103012 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-inventory\") pod \"10576dfe-22f5-4937-bd11-149ca982c4f2\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.103116 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"10576dfe-22f5-4937-bd11-149ca982c4f2\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.103147 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjr62\" (UniqueName: \"kubernetes.io/projected/10576dfe-22f5-4937-bd11-149ca982c4f2-kube-api-access-rjr62\") pod \"10576dfe-22f5-4937-bd11-149ca982c4f2\" (UID: \"10576dfe-22f5-4937-bd11-149ca982c4f2\") " Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.108671 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10576dfe-22f5-4937-bd11-149ca982c4f2-kube-api-access-rjr62" (OuterVolumeSpecName: "kube-api-access-rjr62") pod "10576dfe-22f5-4937-bd11-149ca982c4f2" (UID: "10576dfe-22f5-4937-bd11-149ca982c4f2"). InnerVolumeSpecName "kube-api-access-rjr62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.110698 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ceph" (OuterVolumeSpecName: "ceph") pod "10576dfe-22f5-4937-bd11-149ca982c4f2" (UID: "10576dfe-22f5-4937-bd11-149ca982c4f2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.110937 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "10576dfe-22f5-4937-bd11-149ca982c4f2" (UID: "10576dfe-22f5-4937-bd11-149ca982c4f2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.149884 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "10576dfe-22f5-4937-bd11-149ca982c4f2" (UID: "10576dfe-22f5-4937-bd11-149ca982c4f2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.153837 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "10576dfe-22f5-4937-bd11-149ca982c4f2" (UID: "10576dfe-22f5-4937-bd11-149ca982c4f2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.155798 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "10576dfe-22f5-4937-bd11-149ca982c4f2" (UID: "10576dfe-22f5-4937-bd11-149ca982c4f2"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.159296 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-inventory" (OuterVolumeSpecName: "inventory") pod "10576dfe-22f5-4937-bd11-149ca982c4f2" (UID: "10576dfe-22f5-4937-bd11-149ca982c4f2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.207896 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.208206 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.208332 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjr62\" (UniqueName: \"kubernetes.io/projected/10576dfe-22f5-4937-bd11-149ca982c4f2-kube-api-access-rjr62\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.208489 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.208828 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.208953 4939 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.209486 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10576dfe-22f5-4937-bd11-149ca982c4f2-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.566862 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" event={"ID":"10576dfe-22f5-4937-bd11-149ca982c4f2","Type":"ContainerDied","Data":"97e51595990793fd89d1215b2876f446ee244d0056c12a791af556941635f59d"} Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.566902 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97e51595990793fd89d1215b2876f446ee244d0056c12a791af556941635f59d" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.567034 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-26rfs" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.722239 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-2rtf4"] Mar 18 17:48:18 crc kubenswrapper[4939]: E0318 17:48:18.723188 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10576dfe-22f5-4937-bd11-149ca982c4f2" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.723348 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="10576dfe-22f5-4937-bd11-149ca982c4f2" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 17:48:18 crc kubenswrapper[4939]: E0318 17:48:18.723448 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b5bd2a-4f3c-442e-8cce-29120aba1813" containerName="oc" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.723561 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b5bd2a-4f3c-442e-8cce-29120aba1813" containerName="oc" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.723941 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="10576dfe-22f5-4937-bd11-149ca982c4f2" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.724080 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b5bd2a-4f3c-442e-8cce-29120aba1813" containerName="oc" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.725120 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.728438 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.728602 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.728742 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.728968 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.731678 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.746209 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-2rtf4"] Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.820867 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zp2\" (UniqueName: \"kubernetes.io/projected/8e250960-80eb-4d0e-b653-d1a794760ac8-kube-api-access-g8zp2\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.820934 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: 
\"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.821038 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-inventory\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.821145 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ceph\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.821177 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.821207 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.922916 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zp2\" (UniqueName: \"kubernetes.io/projected/8e250960-80eb-4d0e-b653-d1a794760ac8-kube-api-access-g8zp2\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.922972 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.923085 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-inventory\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.923205 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ceph\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.923236 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.923267 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.927756 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.929371 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-inventory\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.929391 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.929561 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.941923 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ceph\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:18 crc kubenswrapper[4939]: I0318 17:48:18.950226 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zp2\" (UniqueName: \"kubernetes.io/projected/8e250960-80eb-4d0e-b653-d1a794760ac8-kube-api-access-g8zp2\") pod \"libvirt-openstack-openstack-cell1-2rtf4\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:19 crc kubenswrapper[4939]: I0318 17:48:19.050025 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:48:19 crc kubenswrapper[4939]: I0318 17:48:19.601832 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-2rtf4"] Mar 18 17:48:19 crc kubenswrapper[4939]: W0318 17:48:19.604051 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e250960_80eb_4d0e_b653_d1a794760ac8.slice/crio-c023fe8fb0c52f95a641666ded3ec376593ad47969d282e86b859febefd58b8c WatchSource:0}: Error finding container c023fe8fb0c52f95a641666ded3ec376593ad47969d282e86b859febefd58b8c: Status 404 returned error can't find the container with id c023fe8fb0c52f95a641666ded3ec376593ad47969d282e86b859febefd58b8c Mar 18 17:48:20 crc kubenswrapper[4939]: I0318 17:48:20.589166 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" event={"ID":"8e250960-80eb-4d0e-b653-d1a794760ac8","Type":"ContainerStarted","Data":"057af224c31611429af836b9fa0ac8155525f6c7e8e679c25fb6936427516ca8"} Mar 18 17:48:20 crc kubenswrapper[4939]: I0318 17:48:20.589471 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" event={"ID":"8e250960-80eb-4d0e-b653-d1a794760ac8","Type":"ContainerStarted","Data":"c023fe8fb0c52f95a641666ded3ec376593ad47969d282e86b859febefd58b8c"} Mar 18 17:48:20 crc kubenswrapper[4939]: I0318 17:48:20.610864 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" podStartSLOduration=2.423530262 podStartE2EDuration="2.61084526s" podCreationTimestamp="2026-03-18 17:48:18 +0000 UTC" firstStartedPulling="2026-03-18 17:48:19.606365084 +0000 UTC m=+7864.205552705" lastFinishedPulling="2026-03-18 17:48:19.793680082 +0000 UTC m=+7864.392867703" observedRunningTime="2026-03-18 17:48:20.609715966 +0000 UTC m=+7865.208903587" watchObservedRunningTime="2026-03-18 17:48:20.61084526 +0000 UTC m=+7865.210032881" Mar 18 17:48:23 crc kubenswrapper[4939]: I0318 17:48:23.687049 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:48:23 crc kubenswrapper[4939]: I0318 17:48:23.687360 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:48:23 crc kubenswrapper[4939]: I0318 17:48:23.687413 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:48:23 crc kubenswrapper[4939]: I0318 17:48:23.688359 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19940e160daf70da9700a3171ce544b68755543ac79e38da0645bfa539c93343"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:48:23 crc kubenswrapper[4939]: I0318 17:48:23.688425 4939 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://19940e160daf70da9700a3171ce544b68755543ac79e38da0645bfa539c93343" gracePeriod=600 Mar 18 17:48:24 crc kubenswrapper[4939]: I0318 17:48:24.638622 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="19940e160daf70da9700a3171ce544b68755543ac79e38da0645bfa539c93343" exitCode=0 Mar 18 17:48:24 crc kubenswrapper[4939]: I0318 17:48:24.638686 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"19940e160daf70da9700a3171ce544b68755543ac79e38da0645bfa539c93343"} Mar 18 17:48:24 crc kubenswrapper[4939]: I0318 17:48:24.639234 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b"} Mar 18 17:48:24 crc kubenswrapper[4939]: I0318 17:48:24.639259 4939 scope.go:117] "RemoveContainer" containerID="ad1ed776e6433552d4f0b8ffa287fac34e13209a715ded851147e8010e92ce73" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.150655 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564270-dfhhs"] Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.155622 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564270-dfhhs" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.157401 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.159313 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.159627 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.161365 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564270-dfhhs"] Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.286609 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btvj8\" (UniqueName: \"kubernetes.io/projected/e7b8c58a-056a-4b51-9e1d-3432200bf421-kube-api-access-btvj8\") pod \"auto-csr-approver-29564270-dfhhs\" (UID: \"e7b8c58a-056a-4b51-9e1d-3432200bf421\") " pod="openshift-infra/auto-csr-approver-29564270-dfhhs" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.389664 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btvj8\" (UniqueName: \"kubernetes.io/projected/e7b8c58a-056a-4b51-9e1d-3432200bf421-kube-api-access-btvj8\") pod \"auto-csr-approver-29564270-dfhhs\" (UID: \"e7b8c58a-056a-4b51-9e1d-3432200bf421\") " pod="openshift-infra/auto-csr-approver-29564270-dfhhs" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.418229 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btvj8\" (UniqueName: 
\"kubernetes.io/projected/e7b8c58a-056a-4b51-9e1d-3432200bf421-kube-api-access-btvj8\") pod \"auto-csr-approver-29564270-dfhhs\" (UID: \"e7b8c58a-056a-4b51-9e1d-3432200bf421\") " pod="openshift-infra/auto-csr-approver-29564270-dfhhs" Mar 18 17:50:00 crc kubenswrapper[4939]: I0318 17:50:00.505000 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564270-dfhhs" Mar 18 17:50:01 crc kubenswrapper[4939]: I0318 17:50:01.087640 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564270-dfhhs"] Mar 18 17:50:01 crc kubenswrapper[4939]: I0318 17:50:01.090707 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:50:01 crc kubenswrapper[4939]: I0318 17:50:01.683205 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564270-dfhhs" event={"ID":"e7b8c58a-056a-4b51-9e1d-3432200bf421","Type":"ContainerStarted","Data":"163f86a50b36e93d06234eab6faa1cf57bdb189941b020952727e4a57a85c6c1"} Mar 18 17:50:03 crc kubenswrapper[4939]: I0318 17:50:03.705881 4939 generic.go:334] "Generic (PLEG): container finished" podID="e7b8c58a-056a-4b51-9e1d-3432200bf421" containerID="9d41cd734260aa26659596d8388d9c20e8c28c3b7d1a584e9ede9cd73a3f9fec" exitCode=0 Mar 18 17:50:03 crc kubenswrapper[4939]: I0318 17:50:03.705969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564270-dfhhs" event={"ID":"e7b8c58a-056a-4b51-9e1d-3432200bf421","Type":"ContainerDied","Data":"9d41cd734260aa26659596d8388d9c20e8c28c3b7d1a584e9ede9cd73a3f9fec"} Mar 18 17:50:05 crc kubenswrapper[4939]: I0318 17:50:05.140316 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564270-dfhhs" Mar 18 17:50:05 crc kubenswrapper[4939]: I0318 17:50:05.307951 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btvj8\" (UniqueName: \"kubernetes.io/projected/e7b8c58a-056a-4b51-9e1d-3432200bf421-kube-api-access-btvj8\") pod \"e7b8c58a-056a-4b51-9e1d-3432200bf421\" (UID: \"e7b8c58a-056a-4b51-9e1d-3432200bf421\") " Mar 18 17:50:05 crc kubenswrapper[4939]: I0318 17:50:05.314808 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b8c58a-056a-4b51-9e1d-3432200bf421-kube-api-access-btvj8" (OuterVolumeSpecName: "kube-api-access-btvj8") pod "e7b8c58a-056a-4b51-9e1d-3432200bf421" (UID: "e7b8c58a-056a-4b51-9e1d-3432200bf421"). InnerVolumeSpecName "kube-api-access-btvj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:50:05 crc kubenswrapper[4939]: I0318 17:50:05.411988 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btvj8\" (UniqueName: \"kubernetes.io/projected/e7b8c58a-056a-4b51-9e1d-3432200bf421-kube-api-access-btvj8\") on node \"crc\" DevicePath \"\"" Mar 18 17:50:05 crc kubenswrapper[4939]: I0318 17:50:05.737730 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564270-dfhhs" event={"ID":"e7b8c58a-056a-4b51-9e1d-3432200bf421","Type":"ContainerDied","Data":"163f86a50b36e93d06234eab6faa1cf57bdb189941b020952727e4a57a85c6c1"} Mar 18 17:50:05 crc kubenswrapper[4939]: I0318 17:50:05.737776 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="163f86a50b36e93d06234eab6faa1cf57bdb189941b020952727e4a57a85c6c1" Mar 18 17:50:05 crc kubenswrapper[4939]: I0318 17:50:05.737873 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564270-dfhhs" Mar 18 17:50:06 crc kubenswrapper[4939]: I0318 17:50:06.227403 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564264-4jh8p"] Mar 18 17:50:06 crc kubenswrapper[4939]: I0318 17:50:06.239311 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564264-4jh8p"] Mar 18 17:50:08 crc kubenswrapper[4939]: I0318 17:50:08.145090 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75829c57-7661-4729-9d85-6880e9c590d9" path="/var/lib/kubelet/pods/75829c57-7661-4729-9d85-6880e9c590d9/volumes" Mar 18 17:50:11 crc kubenswrapper[4939]: I0318 17:50:11.114919 4939 scope.go:117] "RemoveContainer" containerID="32254f0846e0726affea67279b11c25d4d06a36e27a536c8147d49d156aed5e3" Mar 18 17:50:53 crc kubenswrapper[4939]: I0318 17:50:53.687142 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:50:53 crc kubenswrapper[4939]: I0318 17:50:53.687754 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:51:23 crc kubenswrapper[4939]: I0318 17:51:23.687799 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:51:23 crc kubenswrapper[4939]: I0318 17:51:23.689502 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.687081 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.687562 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.687610 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.688496 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.688570 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" gracePeriod=600 Mar 18 17:51:53 crc kubenswrapper[4939]: E0318 17:51:53.819114 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.943562 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" exitCode=0 Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.943653 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b"} Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.943918 4939 scope.go:117] "RemoveContainer" containerID="19940e160daf70da9700a3171ce544b68755543ac79e38da0645bfa539c93343" Mar 18 17:51:53 crc kubenswrapper[4939]: I0318 17:51:53.944768 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:51:53 crc kubenswrapper[4939]: E0318 17:51:53.945148 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" 
Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.146930 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564272-c9hhb"] Mar 18 17:52:00 crc kubenswrapper[4939]: E0318 17:52:00.147835 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b8c58a-056a-4b51-9e1d-3432200bf421" containerName="oc" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.147850 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b8c58a-056a-4b51-9e1d-3432200bf421" containerName="oc" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.148129 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b8c58a-056a-4b51-9e1d-3432200bf421" containerName="oc" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.149058 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564272-c9hhb" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.151253 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.151355 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.151392 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.160332 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564272-c9hhb"] Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.258879 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkkjt\" (UniqueName: \"kubernetes.io/projected/8397aab3-4c81-4083-a0a6-2538d61ddd42-kube-api-access-bkkjt\") pod \"auto-csr-approver-29564272-c9hhb\" (UID: \"8397aab3-4c81-4083-a0a6-2538d61ddd42\") " pod="openshift-infra/auto-csr-approver-29564272-c9hhb" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.361380 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkkjt\" (UniqueName: \"kubernetes.io/projected/8397aab3-4c81-4083-a0a6-2538d61ddd42-kube-api-access-bkkjt\") pod \"auto-csr-approver-29564272-c9hhb\" (UID: \"8397aab3-4c81-4083-a0a6-2538d61ddd42\") " pod="openshift-infra/auto-csr-approver-29564272-c9hhb" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.381078 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkkjt\" (UniqueName: \"kubernetes.io/projected/8397aab3-4c81-4083-a0a6-2538d61ddd42-kube-api-access-bkkjt\") pod \"auto-csr-approver-29564272-c9hhb\" (UID: \"8397aab3-4c81-4083-a0a6-2538d61ddd42\") " pod="openshift-infra/auto-csr-approver-29564272-c9hhb" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.471332 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564272-c9hhb" Mar 18 17:52:00 crc kubenswrapper[4939]: I0318 17:52:00.990762 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564272-c9hhb"] Mar 18 17:52:00 crc kubenswrapper[4939]: W0318 17:52:00.993874 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8397aab3_4c81_4083_a0a6_2538d61ddd42.slice/crio-40dca361d015292af5a12e6be314310a731420fe8d48c5d5cfd4973281df61f5 WatchSource:0}: Error finding container 40dca361d015292af5a12e6be314310a731420fe8d48c5d5cfd4973281df61f5: Status 404 returned error can't find the container with id 40dca361d015292af5a12e6be314310a731420fe8d48c5d5cfd4973281df61f5 Mar 18 17:52:01 crc kubenswrapper[4939]: I0318 17:52:01.012927 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564272-c9hhb" event={"ID":"8397aab3-4c81-4083-a0a6-2538d61ddd42","Type":"ContainerStarted","Data":"40dca361d015292af5a12e6be314310a731420fe8d48c5d5cfd4973281df61f5"} Mar 18 17:52:03 crc kubenswrapper[4939]: I0318 17:52:03.035526 4939 generic.go:334] "Generic (PLEG): container finished" podID="8397aab3-4c81-4083-a0a6-2538d61ddd42" containerID="eaec9121ab061ae9b94f7676621a01d4ad4a517ef40d04841949aa53c64b76f5" exitCode=0 Mar 18 17:52:03 crc kubenswrapper[4939]: I0318 17:52:03.036121 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564272-c9hhb" event={"ID":"8397aab3-4c81-4083-a0a6-2538d61ddd42","Type":"ContainerDied","Data":"eaec9121ab061ae9b94f7676621a01d4ad4a517ef40d04841949aa53c64b76f5"} Mar 18 17:52:04 crc kubenswrapper[4939]: I0318 17:52:04.410430 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564272-c9hhb" Mar 18 17:52:04 crc kubenswrapper[4939]: I0318 17:52:04.530231 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkkjt\" (UniqueName: \"kubernetes.io/projected/8397aab3-4c81-4083-a0a6-2538d61ddd42-kube-api-access-bkkjt\") pod \"8397aab3-4c81-4083-a0a6-2538d61ddd42\" (UID: \"8397aab3-4c81-4083-a0a6-2538d61ddd42\") " Mar 18 17:52:04 crc kubenswrapper[4939]: I0318 17:52:04.535862 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8397aab3-4c81-4083-a0a6-2538d61ddd42-kube-api-access-bkkjt" (OuterVolumeSpecName: "kube-api-access-bkkjt") pod "8397aab3-4c81-4083-a0a6-2538d61ddd42" (UID: "8397aab3-4c81-4083-a0a6-2538d61ddd42"). InnerVolumeSpecName "kube-api-access-bkkjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:52:04 crc kubenswrapper[4939]: I0318 17:52:04.632876 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkkjt\" (UniqueName: \"kubernetes.io/projected/8397aab3-4c81-4083-a0a6-2538d61ddd42-kube-api-access-bkkjt\") on node \"crc\" DevicePath \"\"" Mar 18 17:52:05 crc kubenswrapper[4939]: I0318 17:52:05.062752 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564272-c9hhb" event={"ID":"8397aab3-4c81-4083-a0a6-2538d61ddd42","Type":"ContainerDied","Data":"40dca361d015292af5a12e6be314310a731420fe8d48c5d5cfd4973281df61f5"} Mar 18 17:52:05 crc kubenswrapper[4939]: I0318 17:52:05.063162 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40dca361d015292af5a12e6be314310a731420fe8d48c5d5cfd4973281df61f5" Mar 18 17:52:05 crc kubenswrapper[4939]: I0318 17:52:05.062815 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564272-c9hhb" Mar 18 17:52:05 crc kubenswrapper[4939]: I0318 17:52:05.488817 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564266-n6pzl"] Mar 18 17:52:05 crc kubenswrapper[4939]: I0318 17:52:05.502767 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564266-n6pzl"] Mar 18 17:52:06 crc kubenswrapper[4939]: I0318 17:52:06.146911 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531" path="/var/lib/kubelet/pods/cc6770fc-3c0d-4ca7-a9c7-e3ca58af9531/volumes" Mar 18 17:52:08 crc kubenswrapper[4939]: I0318 17:52:08.133497 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:52:08 crc kubenswrapper[4939]: E0318 17:52:08.134916 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:52:11 crc kubenswrapper[4939]: I0318 17:52:11.204943 4939 scope.go:117] "RemoveContainer" containerID="92dcf0dcef797f1e709f39ae9d28139aae39464bbbf4a847f1297d28abbf8300" Mar 18 17:52:19 crc kubenswrapper[4939]: I0318 17:52:19.133342 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:52:19 crc kubenswrapper[4939]: E0318 17:52:19.134095 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:52:32 crc kubenswrapper[4939]: I0318 17:52:32.133410 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:52:32 crc kubenswrapper[4939]: E0318 17:52:32.134243 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:52:46 crc kubenswrapper[4939]: I0318 17:52:46.145842 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:52:46 crc kubenswrapper[4939]: E0318 17:52:46.146981 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:52:51 crc kubenswrapper[4939]: I0318 17:52:51.599068 4939 generic.go:334] "Generic (PLEG): container finished" podID="8e250960-80eb-4d0e-b653-d1a794760ac8" containerID="057af224c31611429af836b9fa0ac8155525f6c7e8e679c25fb6936427516ca8" exitCode=0 Mar 18 17:52:51 crc kubenswrapper[4939]: I0318 17:52:51.599174 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" event={"ID":"8e250960-80eb-4d0e-b653-d1a794760ac8","Type":"ContainerDied","Data":"057af224c31611429af836b9fa0ac8155525f6c7e8e679c25fb6936427516ca8"} Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.114062 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.269232 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-secret-0\") pod \"8e250960-80eb-4d0e-b653-d1a794760ac8\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.269404 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zp2\" (UniqueName: \"kubernetes.io/projected/8e250960-80eb-4d0e-b653-d1a794760ac8-kube-api-access-g8zp2\") pod \"8e250960-80eb-4d0e-b653-d1a794760ac8\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.269481 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-combined-ca-bundle\") pod \"8e250960-80eb-4d0e-b653-d1a794760ac8\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.269561 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ssh-key-openstack-cell1\") pod \"8e250960-80eb-4d0e-b653-d1a794760ac8\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.269589 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-inventory\") pod \"8e250960-80eb-4d0e-b653-d1a794760ac8\" (UID: 
\"8e250960-80eb-4d0e-b653-d1a794760ac8\") " Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.269636 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ceph\") pod \"8e250960-80eb-4d0e-b653-d1a794760ac8\" (UID: \"8e250960-80eb-4d0e-b653-d1a794760ac8\") " Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.274734 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8e250960-80eb-4d0e-b653-d1a794760ac8" (UID: "8e250960-80eb-4d0e-b653-d1a794760ac8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.280417 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e250960-80eb-4d0e-b653-d1a794760ac8-kube-api-access-g8zp2" (OuterVolumeSpecName: "kube-api-access-g8zp2") pod "8e250960-80eb-4d0e-b653-d1a794760ac8" (UID: "8e250960-80eb-4d0e-b653-d1a794760ac8"). InnerVolumeSpecName "kube-api-access-g8zp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.281087 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ceph" (OuterVolumeSpecName: "ceph") pod "8e250960-80eb-4d0e-b653-d1a794760ac8" (UID: "8e250960-80eb-4d0e-b653-d1a794760ac8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.301711 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8e250960-80eb-4d0e-b653-d1a794760ac8" (UID: "8e250960-80eb-4d0e-b653-d1a794760ac8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.304582 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-inventory" (OuterVolumeSpecName: "inventory") pod "8e250960-80eb-4d0e-b653-d1a794760ac8" (UID: "8e250960-80eb-4d0e-b653-d1a794760ac8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.309672 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "8e250960-80eb-4d0e-b653-d1a794760ac8" (UID: "8e250960-80eb-4d0e-b653-d1a794760ac8"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.372896 4939 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.373145 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zp2\" (UniqueName: \"kubernetes.io/projected/8e250960-80eb-4d0e-b653-d1a794760ac8-kube-api-access-g8zp2\") on node \"crc\" DevicePath \"\"" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.373160 4939 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.373172 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.373236 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.373248 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e250960-80eb-4d0e-b653-d1a794760ac8-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.623122 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" event={"ID":"8e250960-80eb-4d0e-b653-d1a794760ac8","Type":"ContainerDied","Data":"c023fe8fb0c52f95a641666ded3ec376593ad47969d282e86b859febefd58b8c"} Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.623168 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-2rtf4" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.623169 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c023fe8fb0c52f95a641666ded3ec376593ad47969d282e86b859febefd58b8c" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.733821 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-gdwjd"] Mar 18 17:52:53 crc kubenswrapper[4939]: E0318 17:52:53.734472 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e250960-80eb-4d0e-b653-d1a794760ac8" containerName="libvirt-openstack-openstack-cell1" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.734500 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e250960-80eb-4d0e-b653-d1a794760ac8" containerName="libvirt-openstack-openstack-cell1" Mar 18 17:52:53 crc kubenswrapper[4939]: E0318 17:52:53.734598 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8397aab3-4c81-4083-a0a6-2538d61ddd42" containerName="oc" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.734613 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8397aab3-4c81-4083-a0a6-2538d61ddd42" containerName="oc" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.735011 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8397aab3-4c81-4083-a0a6-2538d61ddd42" containerName="oc" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.735043 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e250960-80eb-4d0e-b653-d1a794760ac8" containerName="libvirt-openstack-openstack-cell1" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.736280 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.745001 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.745280 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.745368 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.745462 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.745560 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.745642 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.745712 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.748416 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-gdwjd"] Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.885809 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.886126 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-inventory\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.886243 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ceph\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.886378 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.886484 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-1\") pod 
\"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.886607 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.886714 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.886869 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.887050 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.887180 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xkdv\" (UniqueName: \"kubernetes.io/projected/7a3f5007-c88c-446a-9543-71fe870e43e6-kube-api-access-2xkdv\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.887308 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.887403 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.887489 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" 
(UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989489 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989680 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989729 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xkdv\" (UniqueName: \"kubernetes.io/projected/7a3f5007-c88c-446a-9543-71fe870e43e6-kube-api-access-2xkdv\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989768 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989803 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989827 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989915 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989939 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-inventory\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.989965 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ceph\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.990000 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.990030 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.990055 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:53 crc kubenswrapper[4939]: I0318 17:52:53.990097 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.000765 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.001436 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.008818 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ceph\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: 
\"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.008976 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.009583 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.009689 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.021207 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.021892 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.028133 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.028451 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xkdv\" (UniqueName: \"kubernetes.io/projected/7a3f5007-c88c-446a-9543-71fe870e43e6-kube-api-access-2xkdv\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.052803 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 
17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.053650 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-inventory\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.054085 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-gdwjd\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.067093 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:52:54 crc kubenswrapper[4939]: I0318 17:52:54.659669 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-gdwjd"] Mar 18 17:52:54 crc kubenswrapper[4939]: W0318 17:52:54.662673 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a3f5007_c88c_446a_9543_71fe870e43e6.slice/crio-c46d0a5e715a12a56f3e01dd4fdf6dcc43651ebcb181f4028f3328fe53f6d50d WatchSource:0}: Error finding container c46d0a5e715a12a56f3e01dd4fdf6dcc43651ebcb181f4028f3328fe53f6d50d: Status 404 returned error can't find the container with id c46d0a5e715a12a56f3e01dd4fdf6dcc43651ebcb181f4028f3328fe53f6d50d Mar 18 17:52:55 crc kubenswrapper[4939]: I0318 17:52:55.645360 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" event={"ID":"7a3f5007-c88c-446a-9543-71fe870e43e6","Type":"ContainerStarted","Data":"6b4c8d70517d87476dc872ec8297e2406d11e736a5e73610afe3603dc1dcbb97"} Mar 18 17:52:55 crc kubenswrapper[4939]: I0318 17:52:55.646015 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" event={"ID":"7a3f5007-c88c-446a-9543-71fe870e43e6","Type":"ContainerStarted","Data":"c46d0a5e715a12a56f3e01dd4fdf6dcc43651ebcb181f4028f3328fe53f6d50d"} Mar 18 17:52:55 crc kubenswrapper[4939]: I0318 17:52:55.669986 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" podStartSLOduration=2.442251962 podStartE2EDuration="2.669967736s" podCreationTimestamp="2026-03-18 17:52:53 +0000 UTC" firstStartedPulling="2026-03-18 17:52:54.665982267 +0000 UTC m=+8139.265169898" lastFinishedPulling="2026-03-18 17:52:54.893698021 +0000 UTC m=+8139.492885672" observedRunningTime="2026-03-18 17:52:55.662700523 +0000 UTC m=+8140.261888144" watchObservedRunningTime="2026-03-18 17:52:55.669967736 +0000 UTC m=+8140.269155357" Mar 18 17:52:59 crc kubenswrapper[4939]: I0318 17:52:59.133448 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:52:59 crc kubenswrapper[4939]: E0318 17:52:59.134130 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:53:11 crc kubenswrapper[4939]: I0318 17:53:11.133577 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:53:11 crc kubenswrapper[4939]: E0318 17:53:11.134394 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.191834 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tpmcp"] Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.196097 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.234424 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpmcp"] Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.372971 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9lzc\" (UniqueName: \"kubernetes.io/projected/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-kube-api-access-l9lzc\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.373693 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-catalog-content\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.373857 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-utilities\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.476309 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9lzc\" (UniqueName: \"kubernetes.io/projected/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-kube-api-access-l9lzc\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.476364 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-catalog-content\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.476390 4939 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-utilities\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.476943 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-catalog-content\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.476993 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-utilities\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.496467 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9lzc\" (UniqueName: \"kubernetes.io/projected/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-kube-api-access-l9lzc\") pod \"redhat-operators-tpmcp\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:17 crc kubenswrapper[4939]: I0318 17:53:17.549484 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:18 crc kubenswrapper[4939]: I0318 17:53:18.061364 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpmcp"] Mar 18 17:53:18 crc kubenswrapper[4939]: I0318 17:53:18.916381 4939 generic.go:334] "Generic (PLEG): container finished" podID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerID="95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d" exitCode=0 Mar 18 17:53:18 crc kubenswrapper[4939]: I0318 17:53:18.916473 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpmcp" event={"ID":"677f65ce-dd3b-4efc-bf80-e9ae849caa5c","Type":"ContainerDied","Data":"95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d"} Mar 18 17:53:18 crc kubenswrapper[4939]: I0318 17:53:18.917750 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpmcp" event={"ID":"677f65ce-dd3b-4efc-bf80-e9ae849caa5c","Type":"ContainerStarted","Data":"c1315b1b9794b73e5f11972869d596a97a22727fd736720fc05bf2f13d8fc737"} Mar 18 17:53:19 crc kubenswrapper[4939]: I0318 17:53:19.930861 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpmcp" event={"ID":"677f65ce-dd3b-4efc-bf80-e9ae849caa5c","Type":"ContainerStarted","Data":"7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e"} Mar 18 17:53:20 crc kubenswrapper[4939]: I0318 17:53:20.945068 4939 generic.go:334] "Generic (PLEG): container finished" podID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerID="7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e" exitCode=0 Mar 18 17:53:20 crc kubenswrapper[4939]: I0318 17:53:20.945153 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpmcp" 
event={"ID":"677f65ce-dd3b-4efc-bf80-e9ae849caa5c","Type":"ContainerDied","Data":"7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e"} Mar 18 17:53:21 crc kubenswrapper[4939]: I0318 17:53:21.958804 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpmcp" event={"ID":"677f65ce-dd3b-4efc-bf80-e9ae849caa5c","Type":"ContainerStarted","Data":"e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5"} Mar 18 17:53:21 crc kubenswrapper[4939]: I0318 17:53:21.986790 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tpmcp" podStartSLOduration=2.302681286 podStartE2EDuration="4.986766758s" podCreationTimestamp="2026-03-18 17:53:17 +0000 UTC" firstStartedPulling="2026-03-18 17:53:18.91847386 +0000 UTC m=+8163.517661481" lastFinishedPulling="2026-03-18 17:53:21.602559322 +0000 UTC m=+8166.201746953" observedRunningTime="2026-03-18 17:53:21.976978511 +0000 UTC m=+8166.576166182" watchObservedRunningTime="2026-03-18 17:53:21.986766758 +0000 UTC m=+8166.585954389" Mar 18 17:53:24 crc kubenswrapper[4939]: I0318 17:53:24.134968 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:53:24 crc kubenswrapper[4939]: E0318 17:53:24.137441 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:53:27 crc kubenswrapper[4939]: I0318 17:53:27.550593 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:27 crc kubenswrapper[4939]: I0318 17:53:27.551064 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:28 crc kubenswrapper[4939]: I0318 17:53:28.613344 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tpmcp" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="registry-server" probeResult="failure" output=< Mar 18 17:53:28 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:53:28 crc kubenswrapper[4939]: > Mar 18 17:53:36 crc kubenswrapper[4939]: I0318 17:53:36.144934 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:53:36 crc kubenswrapper[4939]: E0318 17:53:36.146273 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:53:37 crc kubenswrapper[4939]: I0318 17:53:37.634857 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:37 crc kubenswrapper[4939]: I0318 17:53:37.723849 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:37 crc kubenswrapper[4939]: I0318 17:53:37.894215 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpmcp"] Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.151691 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tpmcp" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="registry-server" containerID="cri-o://e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5" gracePeriod=2 Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.631596 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.731231 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-utilities\") pod \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.731417 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9lzc\" (UniqueName: \"kubernetes.io/projected/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-kube-api-access-l9lzc\") pod \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.731562 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-catalog-content\") pod \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\" (UID: \"677f65ce-dd3b-4efc-bf80-e9ae849caa5c\") " Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.732952 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-utilities" (OuterVolumeSpecName: "utilities") pod "677f65ce-dd3b-4efc-bf80-e9ae849caa5c" (UID: "677f65ce-dd3b-4efc-bf80-e9ae849caa5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.740419 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-kube-api-access-l9lzc" (OuterVolumeSpecName: "kube-api-access-l9lzc") pod "677f65ce-dd3b-4efc-bf80-e9ae849caa5c" (UID: "677f65ce-dd3b-4efc-bf80-e9ae849caa5c"). InnerVolumeSpecName "kube-api-access-l9lzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.834552 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.834778 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9lzc\" (UniqueName: \"kubernetes.io/projected/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-kube-api-access-l9lzc\") on node \"crc\" DevicePath \"\"" Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.864912 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "677f65ce-dd3b-4efc-bf80-e9ae849caa5c" (UID: "677f65ce-dd3b-4efc-bf80-e9ae849caa5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:53:39 crc kubenswrapper[4939]: I0318 17:53:39.937414 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677f65ce-dd3b-4efc-bf80-e9ae849caa5c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.165252 4939 generic.go:334] "Generic (PLEG): container finished" podID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerID="e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5" exitCode=0 Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.165366 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpmcp" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.165390 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpmcp" event={"ID":"677f65ce-dd3b-4efc-bf80-e9ae849caa5c","Type":"ContainerDied","Data":"e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5"} Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.165674 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpmcp" event={"ID":"677f65ce-dd3b-4efc-bf80-e9ae849caa5c","Type":"ContainerDied","Data":"c1315b1b9794b73e5f11972869d596a97a22727fd736720fc05bf2f13d8fc737"} Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.165693 4939 scope.go:117] "RemoveContainer" containerID="e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.197700 4939 scope.go:117] "RemoveContainer" containerID="7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.204216 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpmcp"] Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.217439 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tpmcp"] Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.234058 4939 scope.go:117] "RemoveContainer" containerID="95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.272913 4939 scope.go:117] "RemoveContainer" containerID="e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5" Mar 18 17:53:40 crc kubenswrapper[4939]: E0318 17:53:40.273409 4939 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5\": container with ID starting with e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5 not found: ID does not exist" containerID="e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.273598 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5"} err="failed to get container status \"e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5\": rpc error: code = NotFound desc = could not find container \"e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5\": container with ID starting with e727396e2a71e2385e540cc6e60515431c9ad71845103a74d7bd241ce46b39e5 not found: ID does not exist" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.273713 4939 scope.go:117] "RemoveContainer" containerID="7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e" Mar 18 17:53:40 crc kubenswrapper[4939]: E0318 17:53:40.274286 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e\": container with ID starting with 7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e not found: ID does not exist" containerID="7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.274534 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e"} err="failed to get container status \"7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e\": rpc error: code = NotFound desc = could not find container \"7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e\": container with ID starting with 7edc1d6f142cabfe66df2a3bbcd29d96dc37de76a462e565972b29241918576e not found: ID does not exist" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.274628 4939 scope.go:117] "RemoveContainer" containerID="95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d" Mar 18 17:53:40 crc kubenswrapper[4939]: E0318 17:53:40.275019 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d\": container with ID starting with 95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d not found: ID does not exist" containerID="95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d" Mar 18 17:53:40 crc kubenswrapper[4939]: I0318 17:53:40.275125 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d"} err="failed to get container status \"95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d\": rpc error: code = NotFound desc = could not find container \"95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d\": container with ID starting with 95d6d1994c6aa86cf57b5374bb52b252f464c4db105caa8cd6d338d72f77644d not found: ID does not exist" Mar 18 17:53:42 crc kubenswrapper[4939]: I0318 17:53:42.158418 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" path="/var/lib/kubelet/pods/677f65ce-dd3b-4efc-bf80-e9ae849caa5c/volumes" Mar 18 17:53:47 crc kubenswrapper[4939]: I0318 17:53:47.133765 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:53:47 crc kubenswrapper[4939]: E0318 17:53:47.136894 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.177310 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564274-mtz6c"] Mar 18 17:54:00 crc kubenswrapper[4939]: E0318 17:54:00.178552 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="registry-server" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.178568 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="registry-server" Mar 18 17:54:00 crc kubenswrapper[4939]: E0318 17:54:00.178591 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="extract-utilities" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.178600 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="extract-utilities" Mar 18 17:54:00 crc kubenswrapper[4939]: E0318 17:54:00.178620 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="extract-content" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.178628 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="extract-content" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.178882 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="677f65ce-dd3b-4efc-bf80-e9ae849caa5c" containerName="registry-server" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.179813 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.182175 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.182498 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.184954 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.189521 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564274-mtz6c"] Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.233022 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmw8s\" (UniqueName: \"kubernetes.io/projected/2ccbf094-692b-4efc-89b1-8091f16c6521-kube-api-access-gmw8s\") pod \"auto-csr-approver-29564274-mtz6c\" (UID: \"2ccbf094-692b-4efc-89b1-8091f16c6521\") " pod="openshift-infra/auto-csr-approver-29564274-mtz6c" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.335662 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmw8s\" (UniqueName: \"kubernetes.io/projected/2ccbf094-692b-4efc-89b1-8091f16c6521-kube-api-access-gmw8s\") pod \"auto-csr-approver-29564274-mtz6c\" (UID: \"2ccbf094-692b-4efc-89b1-8091f16c6521\") " pod="openshift-infra/auto-csr-approver-29564274-mtz6c" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.368752 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmw8s\" (UniqueName: \"kubernetes.io/projected/2ccbf094-692b-4efc-89b1-8091f16c6521-kube-api-access-gmw8s\") pod \"auto-csr-approver-29564274-mtz6c\" (UID: \"2ccbf094-692b-4efc-89b1-8091f16c6521\") " pod="openshift-infra/auto-csr-approver-29564274-mtz6c" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.507210 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" Mar 18 17:54:00 crc kubenswrapper[4939]: I0318 17:54:00.986369 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564274-mtz6c"] Mar 18 17:54:01 crc kubenswrapper[4939]: I0318 17:54:01.133723 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:54:01 crc kubenswrapper[4939]: E0318 17:54:01.134014 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:54:01 crc kubenswrapper[4939]: I0318 17:54:01.450555 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" event={"ID":"2ccbf094-692b-4efc-89b1-8091f16c6521","Type":"ContainerStarted","Data":"cf2674abc536add2a5f28b5c278dcbcbfe1c02d34d0b1c1156a1e754a98815c4"} Mar 18 17:54:03 crc kubenswrapper[4939]: I0318 17:54:03.469826 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" event={"ID":"2ccbf094-692b-4efc-89b1-8091f16c6521","Type":"ContainerStarted","Data":"6f8c7830d82224d3bc838ad8bd9f3755739b29f39937552280d6b48ae71adccb"} Mar 18 17:54:03 crc kubenswrapper[4939]: I0318 17:54:03.491575 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" podStartSLOduration=1.336870971 podStartE2EDuration="3.491548263s" podCreationTimestamp="2026-03-18 17:54:00 +0000 UTC" firstStartedPulling="2026-03-18 17:54:01.000664873 +0000 UTC m=+8205.599852494" lastFinishedPulling="2026-03-18 17:54:03.155342165 +0000 UTC m=+8207.754529786" observedRunningTime="2026-03-18 17:54:03.482729624 +0000 UTC m=+8208.081917245" watchObservedRunningTime="2026-03-18 17:54:03.491548263 +0000 UTC m=+8208.090735904" Mar 18 17:54:04 crc kubenswrapper[4939]: I0318 17:54:04.480159 4939 generic.go:334] "Generic (PLEG): container finished" podID="2ccbf094-692b-4efc-89b1-8091f16c6521" containerID="6f8c7830d82224d3bc838ad8bd9f3755739b29f39937552280d6b48ae71adccb" exitCode=0 Mar 18 17:54:04 crc kubenswrapper[4939]: I0318 17:54:04.480229 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" event={"ID":"2ccbf094-692b-4efc-89b1-8091f16c6521","Type":"ContainerDied","Data":"6f8c7830d82224d3bc838ad8bd9f3755739b29f39937552280d6b48ae71adccb"} Mar 18 17:54:05 crc kubenswrapper[4939]: I0318 17:54:05.912959 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.087877 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmw8s\" (UniqueName: \"kubernetes.io/projected/2ccbf094-692b-4efc-89b1-8091f16c6521-kube-api-access-gmw8s\") pod \"2ccbf094-692b-4efc-89b1-8091f16c6521\" (UID: \"2ccbf094-692b-4efc-89b1-8091f16c6521\") " Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.093963 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ccbf094-692b-4efc-89b1-8091f16c6521-kube-api-access-gmw8s" (OuterVolumeSpecName: "kube-api-access-gmw8s") pod "2ccbf094-692b-4efc-89b1-8091f16c6521" (UID: "2ccbf094-692b-4efc-89b1-8091f16c6521"). InnerVolumeSpecName "kube-api-access-gmw8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.191429 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmw8s\" (UniqueName: \"kubernetes.io/projected/2ccbf094-692b-4efc-89b1-8091f16c6521-kube-api-access-gmw8s\") on node \"crc\" DevicePath \"\"" Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.511339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" event={"ID":"2ccbf094-692b-4efc-89b1-8091f16c6521","Type":"ContainerDied","Data":"cf2674abc536add2a5f28b5c278dcbcbfe1c02d34d0b1c1156a1e754a98815c4"} Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.511393 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf2674abc536add2a5f28b5c278dcbcbfe1c02d34d0b1c1156a1e754a98815c4" Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.511450 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564274-mtz6c" Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.580809 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564268-n8x84"] Mar 18 17:54:06 crc kubenswrapper[4939]: I0318 17:54:06.592846 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564268-n8x84"] Mar 18 17:54:08 crc kubenswrapper[4939]: I0318 17:54:08.149992 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b5bd2a-4f3c-442e-8cce-29120aba1813" path="/var/lib/kubelet/pods/99b5bd2a-4f3c-442e-8cce-29120aba1813/volumes" Mar 18 17:54:11 crc kubenswrapper[4939]: I0318 17:54:11.331003 4939 scope.go:117] "RemoveContainer" containerID="dfba74c06f7d3fafe531bcaf7224e340bd2475eb8c4ac8ff59c491597914aa0f" Mar 18 17:54:15 crc kubenswrapper[4939]: I0318 17:54:15.133316 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:54:15 crc kubenswrapper[4939]: E0318 17:54:15.134214 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:54:28 crc kubenswrapper[4939]: I0318 17:54:28.133818 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:54:28 crc kubenswrapper[4939]: E0318 17:54:28.134447 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:54:43 crc kubenswrapper[4939]: I0318 17:54:43.133478 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:54:43 crc kubenswrapper[4939]: E0318 17:54:43.134678 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:54:56 crc kubenswrapper[4939]: I0318 17:54:56.145555 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:54:56 crc kubenswrapper[4939]: E0318 17:54:56.147109 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 
17:55:07 crc kubenswrapper[4939]: I0318 17:55:07.133851 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:55:07 crc kubenswrapper[4939]: E0318 17:55:07.137077 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:55:22 crc kubenswrapper[4939]: I0318 17:55:22.134398 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:55:22 crc kubenswrapper[4939]: E0318 17:55:22.135415 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:55:36 crc kubenswrapper[4939]: I0318 17:55:36.151609 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:55:36 crc kubenswrapper[4939]: E0318 17:55:36.153157 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:55:50 crc kubenswrapper[4939]: I0318 17:55:50.205878 4939 generic.go:334] "Generic (PLEG): container finished" podID="7a3f5007-c88c-446a-9543-71fe870e43e6" containerID="6b4c8d70517d87476dc872ec8297e2406d11e736a5e73610afe3603dc1dcbb97" exitCode=0 Mar 18 17:55:50 crc kubenswrapper[4939]: I0318 17:55:50.205957 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" event={"ID":"7a3f5007-c88c-446a-9543-71fe870e43e6","Type":"ContainerDied","Data":"6b4c8d70517d87476dc872ec8297e2406d11e736a5e73610afe3603dc1dcbb97"} Mar 18 17:55:51 crc kubenswrapper[4939]: I0318 17:55:51.133827 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:55:51 crc kubenswrapper[4939]: E0318 17:55:51.134761 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:55:51 crc kubenswrapper[4939]: I0318 17:55:51.865529 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045441 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-0\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045527 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-0\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045569 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ceph\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045614 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ssh-key-openstack-cell1\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045758 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-1\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045802 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-3\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045845 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xkdv\" (UniqueName: \"kubernetes.io/projected/7a3f5007-c88c-446a-9543-71fe870e43e6-kube-api-access-2xkdv\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045944 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-1\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.045970 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-0\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.046029 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-1\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.046069 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-2\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.046110 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-inventory\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.046144 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-combined-ca-bundle\") pod \"7a3f5007-c88c-446a-9543-71fe870e43e6\" (UID: \"7a3f5007-c88c-446a-9543-71fe870e43e6\") " Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.052907 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.061460 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ceph" (OuterVolumeSpecName: "ceph") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.070989 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3f5007-c88c-446a-9543-71fe870e43e6-kube-api-access-2xkdv" (OuterVolumeSpecName: "kube-api-access-2xkdv") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "kube-api-access-2xkdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.078959 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.084345 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). 
InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.085597 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.087762 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.090482 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.096864 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.108049 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.110761 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-inventory" (OuterVolumeSpecName: "inventory") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.112255 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.116232 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7a3f5007-c88c-446a-9543-71fe870e43e6" (UID: "7a3f5007-c88c-446a-9543-71fe870e43e6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153064 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153102 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xkdv\" (UniqueName: \"kubernetes.io/projected/7a3f5007-c88c-446a-9543-71fe870e43e6-kube-api-access-2xkdv\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153113 4939 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153133 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153142 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153151 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153161 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153172 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153182 4939 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153194 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153205 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153214 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7a3f5007-c88c-446a-9543-71fe870e43e6-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.153224 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7a3f5007-c88c-446a-9543-71fe870e43e6-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.241620 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" event={"ID":"7a3f5007-c88c-446a-9543-71fe870e43e6","Type":"ContainerDied","Data":"c46d0a5e715a12a56f3e01dd4fdf6dcc43651ebcb181f4028f3328fe53f6d50d"} Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.242293 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c46d0a5e715a12a56f3e01dd4fdf6dcc43651ebcb181f4028f3328fe53f6d50d" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.241782 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-gdwjd" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.342804 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-5jhdk"] Mar 18 17:55:52 crc kubenswrapper[4939]: E0318 17:55:52.343246 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccbf094-692b-4efc-89b1-8091f16c6521" containerName="oc" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.343263 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccbf094-692b-4efc-89b1-8091f16c6521" containerName="oc" Mar 18 17:55:52 crc kubenswrapper[4939]: E0318 17:55:52.343303 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3f5007-c88c-446a-9543-71fe870e43e6" containerName="nova-cell1-openstack-openstack-cell1" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.343310 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3f5007-c88c-446a-9543-71fe870e43e6" containerName="nova-cell1-openstack-openstack-cell1" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.343488 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccbf094-692b-4efc-89b1-8091f16c6521" containerName="oc" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.343548 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3f5007-c88c-446a-9543-71fe870e43e6" containerName="nova-cell1-openstack-openstack-cell1" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.344231 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.346752 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.346780 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.346750 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.346935 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.350642 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.364278 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-5jhdk"] Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462673 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462717 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceph\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462777 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462819 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462879 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462904 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cnpbf\" (UniqueName: \"kubernetes.io/projected/fb138a20-d324-472f-abff-090725f661d8-kube-api-access-cnpbf\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462923 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-inventory\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.462942 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.565970 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.566046 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceph\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.566569 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.566788 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.567017 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.567127 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnpbf\" (UniqueName: 
\"kubernetes.io/projected/fb138a20-d324-472f-abff-090725f661d8-kube-api-access-cnpbf\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.567251 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-inventory\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.567859 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.571928 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.572779 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-inventory\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.572797 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceph\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.572958 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.574717 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.578915 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " 
pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.584454 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.589469 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnpbf\" (UniqueName: \"kubernetes.io/projected/fb138a20-d324-472f-abff-090725f661d8-kube-api-access-cnpbf\") pod \"telemetry-openstack-openstack-cell1-5jhdk\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:52 crc kubenswrapper[4939]: I0318 17:55:52.661825 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:55:53 crc kubenswrapper[4939]: I0318 17:55:53.055347 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-5jhdk"] Mar 18 17:55:53 crc kubenswrapper[4939]: I0318 17:55:53.068676 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:55:53 crc kubenswrapper[4939]: I0318 17:55:53.277044 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" event={"ID":"fb138a20-d324-472f-abff-090725f661d8","Type":"ContainerStarted","Data":"b9972ab95cf16c3b5c290962859eb7a8ff389378e871a78736e2e7fa116b6967"} Mar 18 17:55:54 crc kubenswrapper[4939]: I0318 17:55:54.295362 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" event={"ID":"fb138a20-d324-472f-abff-090725f661d8","Type":"ContainerStarted","Data":"b09183b9029122c5d328e1869812304b061777bd9445e63e02bc8dd018058806"} Mar 18 17:55:54 crc kubenswrapper[4939]: I0318 17:55:54.319663 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" podStartSLOduration=2.146554696 podStartE2EDuration="2.319637871s" podCreationTimestamp="2026-03-18 17:55:52 +0000 UTC" firstStartedPulling="2026-03-18 17:55:53.068170976 +0000 UTC m=+8317.667358637" lastFinishedPulling="2026-03-18 17:55:53.241254151 +0000 UTC m=+8317.840441812" observedRunningTime="2026-03-18 17:55:54.310737107 +0000 UTC m=+8318.909924738" watchObservedRunningTime="2026-03-18 17:55:54.319637871 +0000 UTC m=+8318.918825502" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.150438 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564276-8vrs5"] Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.152193 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564276-8vrs5"] Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.152266 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.154415 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.156368 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.156578 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.251094 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/dc96d74b-b0c3-4437-af3c-c030e9a79b15-kube-api-access-tr27r\") pod \"auto-csr-approver-29564276-8vrs5\" (UID: \"dc96d74b-b0c3-4437-af3c-c030e9a79b15\") " pod="openshift-infra/auto-csr-approver-29564276-8vrs5" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.353226 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/dc96d74b-b0c3-4437-af3c-c030e9a79b15-kube-api-access-tr27r\") pod \"auto-csr-approver-29564276-8vrs5\" (UID: \"dc96d74b-b0c3-4437-af3c-c030e9a79b15\") " pod="openshift-infra/auto-csr-approver-29564276-8vrs5" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.377574 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/dc96d74b-b0c3-4437-af3c-c030e9a79b15-kube-api-access-tr27r\") pod \"auto-csr-approver-29564276-8vrs5\" (UID: \"dc96d74b-b0c3-4437-af3c-c030e9a79b15\") " pod="openshift-infra/auto-csr-approver-29564276-8vrs5" Mar 18 17:56:00 crc kubenswrapper[4939]: I0318 17:56:00.508057 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" Mar 18 17:56:01 crc kubenswrapper[4939]: I0318 17:56:01.037237 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564276-8vrs5"] Mar 18 17:56:01 crc kubenswrapper[4939]: I0318 17:56:01.402578 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" event={"ID":"dc96d74b-b0c3-4437-af3c-c030e9a79b15","Type":"ContainerStarted","Data":"b036d349c218182cf4ab6d68d679bc4196d12ac9a99bd77e66f844e36afb0b59"} Mar 18 17:56:02 crc kubenswrapper[4939]: I0318 17:56:02.136300 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:56:02 crc kubenswrapper[4939]: E0318 17:56:02.136624 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:56:02 crc kubenswrapper[4939]: I0318 17:56:02.413216 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" event={"ID":"dc96d74b-b0c3-4437-af3c-c030e9a79b15","Type":"ContainerStarted","Data":"4efde799316df0183894cb81489b1a2e142c6929c78811f7a0bd6f617e393b73"} Mar 18 17:56:02 crc kubenswrapper[4939]: I0318 17:56:02.456312 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" podStartSLOduration=1.507347599 podStartE2EDuration="2.456292754s" podCreationTimestamp="2026-03-18 17:56:00 +0000 UTC" firstStartedPulling="2026-03-18 17:56:01.029114149 +0000 UTC m=+8325.628301780" lastFinishedPulling="2026-03-18 17:56:01.978059274 +0000 UTC m=+8326.577246935" observedRunningTime="2026-03-18 17:56:02.442845191 +0000 UTC m=+8327.042032812" watchObservedRunningTime="2026-03-18 17:56:02.456292754 +0000 UTC m=+8327.055480375" Mar 18 17:56:03 crc kubenswrapper[4939]: I0318 17:56:03.423093 4939 generic.go:334] "Generic (PLEG): container finished" podID="dc96d74b-b0c3-4437-af3c-c030e9a79b15" containerID="4efde799316df0183894cb81489b1a2e142c6929c78811f7a0bd6f617e393b73" exitCode=0 Mar 18 17:56:03 crc kubenswrapper[4939]: I0318 17:56:03.423337 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" event={"ID":"dc96d74b-b0c3-4437-af3c-c030e9a79b15","Type":"ContainerDied","Data":"4efde799316df0183894cb81489b1a2e142c6929c78811f7a0bd6f617e393b73"} Mar 18 17:56:04 crc kubenswrapper[4939]: I0318 17:56:04.861960 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" Mar 18 17:56:04 crc kubenswrapper[4939]: I0318 17:56:04.957028 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/dc96d74b-b0c3-4437-af3c-c030e9a79b15-kube-api-access-tr27r\") pod \"dc96d74b-b0c3-4437-af3c-c030e9a79b15\" (UID: \"dc96d74b-b0c3-4437-af3c-c030e9a79b15\") " Mar 18 17:56:04 crc kubenswrapper[4939]: I0318 17:56:04.964840 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc96d74b-b0c3-4437-af3c-c030e9a79b15-kube-api-access-tr27r" (OuterVolumeSpecName: "kube-api-access-tr27r") pod "dc96d74b-b0c3-4437-af3c-c030e9a79b15" (UID: "dc96d74b-b0c3-4437-af3c-c030e9a79b15"). InnerVolumeSpecName "kube-api-access-tr27r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:56:05 crc kubenswrapper[4939]: I0318 17:56:05.060766 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/dc96d74b-b0c3-4437-af3c-c030e9a79b15-kube-api-access-tr27r\") on node \"crc\" DevicePath \"\"" Mar 18 17:56:05 crc kubenswrapper[4939]: I0318 17:56:05.442798 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" event={"ID":"dc96d74b-b0c3-4437-af3c-c030e9a79b15","Type":"ContainerDied","Data":"b036d349c218182cf4ab6d68d679bc4196d12ac9a99bd77e66f844e36afb0b59"} Mar 18 17:56:05 crc kubenswrapper[4939]: I0318 17:56:05.442842 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b036d349c218182cf4ab6d68d679bc4196d12ac9a99bd77e66f844e36afb0b59" Mar 18 17:56:05 crc kubenswrapper[4939]: I0318 17:56:05.442897 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564276-8vrs5" Mar 18 17:56:05 crc kubenswrapper[4939]: I0318 17:56:05.959965 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564270-dfhhs"] Mar 18 17:56:05 crc kubenswrapper[4939]: I0318 17:56:05.973299 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564270-dfhhs"] Mar 18 17:56:06 crc kubenswrapper[4939]: I0318 17:56:06.150999 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b8c58a-056a-4b51-9e1d-3432200bf421" path="/var/lib/kubelet/pods/e7b8c58a-056a-4b51-9e1d-3432200bf421/volumes" Mar 18 17:56:11 crc kubenswrapper[4939]: I0318 17:56:11.488703 4939 scope.go:117] "RemoveContainer" containerID="9d41cd734260aa26659596d8388d9c20e8c28c3b7d1a584e9ede9cd73a3f9fec" Mar 18 17:56:16 crc kubenswrapper[4939]: I0318 17:56:16.143137 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:56:16 crc kubenswrapper[4939]: E0318 17:56:16.144870 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:56:29 crc kubenswrapper[4939]: I0318 17:56:29.133380 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:56:29 crc kubenswrapper[4939]: E0318 17:56:29.134421 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:56:44 crc kubenswrapper[4939]: I0318 17:56:44.133685 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:56:44 crc kubenswrapper[4939]: E0318 17:56:44.134611 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 17:56:56 crc kubenswrapper[4939]: I0318 17:56:56.146755 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 17:56:57 crc kubenswrapper[4939]: I0318 17:56:57.079091 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"c559432840b726160db88722665a34bfba0202eb63be885cb2468d050d6fcf6a"} Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.164630 4939 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29564278-ffn99"] Mar 18 17:58:00 crc kubenswrapper[4939]: E0318 17:58:00.165844 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc96d74b-b0c3-4437-af3c-c030e9a79b15" containerName="oc" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.165861 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc96d74b-b0c3-4437-af3c-c030e9a79b15" containerName="oc" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.166192 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc96d74b-b0c3-4437-af3c-c030e9a79b15" containerName="oc" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.167124 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564278-ffn99" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.170228 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.170566 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.174373 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.177161 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564278-ffn99"] Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.239865 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbt7\" (UniqueName: \"kubernetes.io/projected/a6210c6a-418b-404a-8cf3-6e7c916c73db-kube-api-access-5cbt7\") pod \"auto-csr-approver-29564278-ffn99\" (UID: \"a6210c6a-418b-404a-8cf3-6e7c916c73db\") " pod="openshift-infra/auto-csr-approver-29564278-ffn99" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.343335 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cbt7\" (UniqueName: \"kubernetes.io/projected/a6210c6a-418b-404a-8cf3-6e7c916c73db-kube-api-access-5cbt7\") pod \"auto-csr-approver-29564278-ffn99\" (UID: \"a6210c6a-418b-404a-8cf3-6e7c916c73db\") " pod="openshift-infra/auto-csr-approver-29564278-ffn99" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.362848 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbt7\" (UniqueName: \"kubernetes.io/projected/a6210c6a-418b-404a-8cf3-6e7c916c73db-kube-api-access-5cbt7\") pod \"auto-csr-approver-29564278-ffn99\" (UID: \"a6210c6a-418b-404a-8cf3-6e7c916c73db\") " pod="openshift-infra/auto-csr-approver-29564278-ffn99" Mar 18 17:58:00 crc kubenswrapper[4939]: I0318 17:58:00.513610 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564278-ffn99" Mar 18 17:58:01 crc kubenswrapper[4939]: I0318 17:58:01.031073 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564278-ffn99"] Mar 18 17:58:01 crc kubenswrapper[4939]: W0318 17:58:01.037977 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6210c6a_418b_404a_8cf3_6e7c916c73db.slice/crio-864a610fa1bd83e7c5ed7ddbeb8aaa67e5cf76b657f1c633f3e9362d04a59850 WatchSource:0}: Error finding container 864a610fa1bd83e7c5ed7ddbeb8aaa67e5cf76b657f1c633f3e9362d04a59850: Status 404 returned error can't find the container with id 864a610fa1bd83e7c5ed7ddbeb8aaa67e5cf76b657f1c633f3e9362d04a59850 Mar 18 17:58:01 crc kubenswrapper[4939]: I0318 17:58:01.840532 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564278-ffn99" event={"ID":"a6210c6a-418b-404a-8cf3-6e7c916c73db","Type":"ContainerStarted","Data":"864a610fa1bd83e7c5ed7ddbeb8aaa67e5cf76b657f1c633f3e9362d04a59850"} Mar 18 17:58:02 crc kubenswrapper[4939]: I0318 17:58:02.852919 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564278-ffn99" event={"ID":"a6210c6a-418b-404a-8cf3-6e7c916c73db","Type":"ContainerStarted","Data":"b54f93a3df000c260084afd90d0b6989108ccda487ae1cde601c11b204a28d11"} Mar 18 17:58:02 crc kubenswrapper[4939]: I0318 17:58:02.868971 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564278-ffn99" podStartSLOduration=1.48421688 podStartE2EDuration="2.868953708s" podCreationTimestamp="2026-03-18 17:58:00 +0000 UTC" firstStartedPulling="2026-03-18 17:58:01.041903873 +0000 UTC m=+8445.641091494" lastFinishedPulling="2026-03-18 17:58:02.426640681 +0000 UTC m=+8447.025828322" observedRunningTime="2026-03-18 17:58:02.866220931 +0000 UTC m=+8447.465408572" watchObservedRunningTime="2026-03-18 17:58:02.868953708 +0000 UTC m=+8447.468141339" Mar 18 17:58:03 crc kubenswrapper[4939]: I0318 17:58:03.863850 4939 generic.go:334] "Generic (PLEG): container finished" podID="a6210c6a-418b-404a-8cf3-6e7c916c73db" containerID="b54f93a3df000c260084afd90d0b6989108ccda487ae1cde601c11b204a28d11" exitCode=0 Mar 18 17:58:03 crc kubenswrapper[4939]: I0318 17:58:03.863941 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564278-ffn99" event={"ID":"a6210c6a-418b-404a-8cf3-6e7c916c73db","Type":"ContainerDied","Data":"b54f93a3df000c260084afd90d0b6989108ccda487ae1cde601c11b204a28d11"} Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.384527 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564278-ffn99" Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.470335 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cbt7\" (UniqueName: \"kubernetes.io/projected/a6210c6a-418b-404a-8cf3-6e7c916c73db-kube-api-access-5cbt7\") pod \"a6210c6a-418b-404a-8cf3-6e7c916c73db\" (UID: \"a6210c6a-418b-404a-8cf3-6e7c916c73db\") " Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.476115 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6210c6a-418b-404a-8cf3-6e7c916c73db-kube-api-access-5cbt7" (OuterVolumeSpecName: "kube-api-access-5cbt7") pod "a6210c6a-418b-404a-8cf3-6e7c916c73db" (UID: "a6210c6a-418b-404a-8cf3-6e7c916c73db"). InnerVolumeSpecName "kube-api-access-5cbt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.572611 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cbt7\" (UniqueName: \"kubernetes.io/projected/a6210c6a-418b-404a-8cf3-6e7c916c73db-kube-api-access-5cbt7\") on node \"crc\" DevicePath \"\"" Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.895376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564278-ffn99" event={"ID":"a6210c6a-418b-404a-8cf3-6e7c916c73db","Type":"ContainerDied","Data":"864a610fa1bd83e7c5ed7ddbeb8aaa67e5cf76b657f1c633f3e9362d04a59850"} Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.896686 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="864a610fa1bd83e7c5ed7ddbeb8aaa67e5cf76b657f1c633f3e9362d04a59850" Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.896761 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564278-ffn99" Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.959572 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564272-c9hhb"] Mar 18 17:58:05 crc kubenswrapper[4939]: I0318 17:58:05.972547 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564272-c9hhb"] Mar 18 17:58:06 crc kubenswrapper[4939]: E0318 17:58:06.097605 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6210c6a_418b_404a_8cf3_6e7c916c73db.slice\": RecentStats: unable to find data in memory cache]" Mar 18 17:58:06 crc kubenswrapper[4939]: I0318 17:58:06.155722 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8397aab3-4c81-4083-a0a6-2538d61ddd42" path="/var/lib/kubelet/pods/8397aab3-4c81-4083-a0a6-2538d61ddd42/volumes" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.035164 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fwgzw"] Mar 18 17:58:07 crc kubenswrapper[4939]: E0318 17:58:07.035941 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6210c6a-418b-404a-8cf3-6e7c916c73db" containerName="oc" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.035963 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6210c6a-418b-404a-8cf3-6e7c916c73db" containerName="oc" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.036358 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6210c6a-418b-404a-8cf3-6e7c916c73db" containerName="oc" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.039085 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.053896 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwgzw"] Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.108895 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-utilities\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.108948 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-catalog-content\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.108969 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czq7w\" (UniqueName: \"kubernetes.io/projected/6c677548-f430-4779-bae3-d62315af3dd8-kube-api-access-czq7w\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.211317 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-utilities\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.211367 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-catalog-content\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.211393 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czq7w\" (UniqueName: \"kubernetes.io/projected/6c677548-f430-4779-bae3-d62315af3dd8-kube-api-access-czq7w\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.213802 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-utilities\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.213996 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-catalog-content\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.242795 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-czq7w\" (UniqueName: \"kubernetes.io/projected/6c677548-f430-4779-bae3-d62315af3dd8-kube-api-access-czq7w\") pod \"certified-operators-fwgzw\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.371206 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.899042 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwgzw"] Mar 18 17:58:07 crc kubenswrapper[4939]: W0318 17:58:07.902679 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c677548_f430_4779_bae3_d62315af3dd8.slice/crio-0680b28bbb6c6e6771b77d5efd54610221dbf46d0832debbd3131e1937dd7fd5 WatchSource:0}: Error finding container 0680b28bbb6c6e6771b77d5efd54610221dbf46d0832debbd3131e1937dd7fd5: Status 404 returned error can't find the container with id 0680b28bbb6c6e6771b77d5efd54610221dbf46d0832debbd3131e1937dd7fd5 Mar 18 17:58:07 crc kubenswrapper[4939]: I0318 17:58:07.921177 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwgzw" event={"ID":"6c677548-f430-4779-bae3-d62315af3dd8","Type":"ContainerStarted","Data":"0680b28bbb6c6e6771b77d5efd54610221dbf46d0832debbd3131e1937dd7fd5"} Mar 18 17:58:08 crc kubenswrapper[4939]: I0318 17:58:08.937469 4939 generic.go:334] "Generic (PLEG): container finished" podID="6c677548-f430-4779-bae3-d62315af3dd8" containerID="2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174" exitCode=0 Mar 18 17:58:08 crc kubenswrapper[4939]: I0318 17:58:08.937579 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwgzw" event={"ID":"6c677548-f430-4779-bae3-d62315af3dd8","Type":"ContainerDied","Data":"2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174"} Mar 18 17:58:10 crc kubenswrapper[4939]: I0318 17:58:10.964624 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwgzw" event={"ID":"6c677548-f430-4779-bae3-d62315af3dd8","Type":"ContainerStarted","Data":"35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946"} Mar 18 17:58:11 crc kubenswrapper[4939]: I0318 17:58:11.612585 4939 scope.go:117] "RemoveContainer" containerID="eaec9121ab061ae9b94f7676621a01d4ad4a517ef40d04841949aa53c64b76f5" Mar 18 17:58:11 crc kubenswrapper[4939]: I0318 17:58:11.976981 4939 generic.go:334] "Generic (PLEG): container finished" podID="6c677548-f430-4779-bae3-d62315af3dd8" containerID="35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946" exitCode=0 Mar 18 17:58:11 crc kubenswrapper[4939]: I0318 17:58:11.977070 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwgzw" event={"ID":"6c677548-f430-4779-bae3-d62315af3dd8","Type":"ContainerDied","Data":"35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946"} Mar 18 17:58:12 crc kubenswrapper[4939]: I0318 17:58:12.989866 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwgzw" event={"ID":"6c677548-f430-4779-bae3-d62315af3dd8","Type":"ContainerStarted","Data":"7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91"} Mar 18 17:58:13 crc 
kubenswrapper[4939]: I0318 17:58:13.029096 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fwgzw" podStartSLOduration=2.463885808 podStartE2EDuration="6.029065994s" podCreationTimestamp="2026-03-18 17:58:07 +0000 UTC" firstStartedPulling="2026-03-18 17:58:08.940157713 +0000 UTC m=+8453.539345364" lastFinishedPulling="2026-03-18 17:58:12.505337919 +0000 UTC m=+8457.104525550" observedRunningTime="2026-03-18 17:58:13.01242451 +0000 UTC m=+8457.611612141" watchObservedRunningTime="2026-03-18 17:58:13.029065994 +0000 UTC m=+8457.628253655" Mar 18 17:58:17 crc kubenswrapper[4939]: I0318 17:58:17.372462 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:17 crc kubenswrapper[4939]: I0318 17:58:17.373114 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:18 crc kubenswrapper[4939]: I0318 17:58:18.424044 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-fwgzw" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="registry-server" probeResult="failure" output=< Mar 18 17:58:18 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 17:58:18 crc kubenswrapper[4939]: > Mar 18 17:58:27 crc kubenswrapper[4939]: I0318 17:58:27.422967 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:27 crc kubenswrapper[4939]: I0318 17:58:27.475439 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:27 crc kubenswrapper[4939]: I0318 17:58:27.671222 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwgzw"] Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.160555 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fwgzw" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="registry-server" containerID="cri-o://7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91" gracePeriod=2 Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.699080 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.871164 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-utilities\") pod \"6c677548-f430-4779-bae3-d62315af3dd8\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.871349 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czq7w\" (UniqueName: \"kubernetes.io/projected/6c677548-f430-4779-bae3-d62315af3dd8-kube-api-access-czq7w\") pod \"6c677548-f430-4779-bae3-d62315af3dd8\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.871412 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-catalog-content\") pod \"6c677548-f430-4779-bae3-d62315af3dd8\" (UID: \"6c677548-f430-4779-bae3-d62315af3dd8\") " Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.873303 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-utilities" (OuterVolumeSpecName: "utilities") pod "6c677548-f430-4779-bae3-d62315af3dd8" (UID: "6c677548-f430-4779-bae3-d62315af3dd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.880099 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c677548-f430-4779-bae3-d62315af3dd8-kube-api-access-czq7w" (OuterVolumeSpecName: "kube-api-access-czq7w") pod "6c677548-f430-4779-bae3-d62315af3dd8" (UID: "6c677548-f430-4779-bae3-d62315af3dd8"). InnerVolumeSpecName "kube-api-access-czq7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.921153 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c677548-f430-4779-bae3-d62315af3dd8" (UID: "6c677548-f430-4779-bae3-d62315af3dd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.974558 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.974595 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c677548-f430-4779-bae3-d62315af3dd8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:58:29 crc kubenswrapper[4939]: I0318 17:58:29.974608 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czq7w\" (UniqueName: \"kubernetes.io/projected/6c677548-f430-4779-bae3-d62315af3dd8-kube-api-access-czq7w\") on node \"crc\" DevicePath \"\"" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.175624 4939 generic.go:334] "Generic (PLEG): container finished" podID="6c677548-f430-4779-bae3-d62315af3dd8" containerID="7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91" exitCode=0 Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.175669 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwgzw" event={"ID":"6c677548-f430-4779-bae3-d62315af3dd8","Type":"ContainerDied","Data":"7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91"} Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.175701 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwgzw" event={"ID":"6c677548-f430-4779-bae3-d62315af3dd8","Type":"ContainerDied","Data":"0680b28bbb6c6e6771b77d5efd54610221dbf46d0832debbd3131e1937dd7fd5"} Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.175722 4939 scope.go:117] "RemoveContainer" containerID="7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.175903 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fwgzw" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.212821 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwgzw"] Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.218454 4939 scope.go:117] "RemoveContainer" containerID="35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.224386 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fwgzw"] Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.243075 4939 scope.go:117] "RemoveContainer" containerID="2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.312729 4939 scope.go:117] "RemoveContainer" containerID="7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91" Mar 18 17:58:30 crc kubenswrapper[4939]: E0318 17:58:30.313145 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91\": container with ID starting with 7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91 not found: ID does not exist" containerID="7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.313247 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91"} err="failed to get container status \"7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91\": rpc error: code = NotFound desc = could not find container \"7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91\": container with ID starting with 7beae3424abd132e6588cad9c7329abc3e7424cfe3a021902821a647aedfaa91 not found: ID does not exist" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.313321 4939 scope.go:117] "RemoveContainer" containerID="35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946" Mar 18 17:58:30 crc kubenswrapper[4939]: E0318 17:58:30.313935 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946\": container with ID starting with 35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946 not found: ID does not exist" containerID="35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.313983 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946"} err="failed to get container status \"35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946\": rpc error: code = NotFound desc = could not find container \"35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946\": container with ID starting with 35c66941e94758cb74e37ba075e03ff1d4f2ad969511d4ac5aa3735f8b888946 not found: ID does not exist" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.314051 4939 scope.go:117] "RemoveContainer" containerID="2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174" Mar 18 17:58:30 crc kubenswrapper[4939]: E0318 17:58:30.314425 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174\": container with ID starting with 2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174 not found: ID does not exist" containerID="2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174" Mar 18 17:58:30 crc kubenswrapper[4939]: I0318 17:58:30.314467 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174"} err="failed to get container status \"2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174\": rpc error: code = NotFound desc = could not find container \"2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174\": container with ID starting with 2456189d520de56fcf4fa9796039619c8c2554a8648bfe458c8a61ad6ff62174 not found: ID does not exist" Mar 18 17:58:32 crc kubenswrapper[4939]: I0318 17:58:32.150055 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c677548-f430-4779-bae3-d62315af3dd8" path="/var/lib/kubelet/pods/6c677548-f430-4779-bae3-d62315af3dd8/volumes" Mar 18 17:59:23 crc kubenswrapper[4939]: I0318 17:59:23.688159 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:59:23 crc kubenswrapper[4939]: I0318 17:59:23.689110 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:59:32 crc kubenswrapper[4939]: I0318 17:59:32.870638 4939 generic.go:334] "Generic (PLEG): container finished" podID="fb138a20-d324-472f-abff-090725f661d8" containerID="b09183b9029122c5d328e1869812304b061777bd9445e63e02bc8dd018058806" exitCode=0 Mar 18 17:59:32 crc kubenswrapper[4939]: I0318 17:59:32.870742 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" event={"ID":"fb138a20-d324-472f-abff-090725f661d8","Type":"ContainerDied","Data":"b09183b9029122c5d328e1869812304b061777bd9445e63e02bc8dd018058806"} Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.382287 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.447347 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-0\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.447542 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ssh-key-openstack-cell1\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.447588 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-inventory\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.447616 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-telemetry-combined-ca-bundle\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.447808 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceph\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.447859 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnpbf\" (UniqueName: \"kubernetes.io/projected/fb138a20-d324-472f-abff-090725f661d8-kube-api-access-cnpbf\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.447944 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-2\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.448042 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-1\") pod \"fb138a20-d324-472f-abff-090725f661d8\" (UID: \"fb138a20-d324-472f-abff-090725f661d8\") " Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.453426 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.453936 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceph" (OuterVolumeSpecName: "ceph") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.455256 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb138a20-d324-472f-abff-090725f661d8-kube-api-access-cnpbf" (OuterVolumeSpecName: "kube-api-access-cnpbf") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "kube-api-access-cnpbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.476292 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-inventory" (OuterVolumeSpecName: "inventory") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.481808 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.482696 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.492381 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.508739 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "fb138a20-d324-472f-abff-090725f661d8" (UID: "fb138a20-d324-472f-abff-090725f661d8"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552379 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552756 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnpbf\" (UniqueName: \"kubernetes.io/projected/fb138a20-d324-472f-abff-090725f661d8-kube-api-access-cnpbf\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552773 4939 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552787 4939 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552801 4939 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552813 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552826 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.552839 4939 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb138a20-d324-472f-abff-090725f661d8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.903148 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.904617 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-5jhdk" event={"ID":"fb138a20-d324-472f-abff-090725f661d8","Type":"ContainerDied","Data":"b9972ab95cf16c3b5c290962859eb7a8ff389378e871a78736e2e7fa116b6967"} Mar 18 17:59:34 crc kubenswrapper[4939]: I0318 17:59:34.904676 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9972ab95cf16c3b5c290962859eb7a8ff389378e871a78736e2e7fa116b6967" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.017578 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-r2qr2"] Mar 18 17:59:35 crc kubenswrapper[4939]: E0318 17:59:35.018143 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="extract-content" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.018166 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="extract-content" Mar 18 17:59:35 crc kubenswrapper[4939]: E0318 17:59:35.018208 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="extract-utilities" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.018221 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="extract-utilities" Mar 18 17:59:35 crc kubenswrapper[4939]: E0318 17:59:35.018242 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="registry-server" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.018252 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="registry-server" Mar 18 17:59:35 crc kubenswrapper[4939]: E0318 17:59:35.018296 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb138a20-d324-472f-abff-090725f661d8" containerName="telemetry-openstack-openstack-cell1" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.018308 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb138a20-d324-472f-abff-090725f661d8" containerName="telemetry-openstack-openstack-cell1" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.018610 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb138a20-d324-472f-abff-090725f661d8" containerName="telemetry-openstack-openstack-cell1" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.018636 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c677548-f430-4779-bae3-d62315af3dd8" containerName="registry-server" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.019564 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.022935 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.023296 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.023297 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.023401 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.023442 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.029533 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-r2qr2"] Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.063812 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9tp\" (UniqueName: \"kubernetes.io/projected/330b1db4-a22e-4fe0-a9c5-88ae9458db36-kube-api-access-7r9tp\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.063904 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.063986 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.064083 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.064196 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.064364 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.166352 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.168971 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.169222 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9tp\" (UniqueName: \"kubernetes.io/projected/330b1db4-a22e-4fe0-a9c5-88ae9458db36-kube-api-access-7r9tp\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.169303 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.169406 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.169487 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.172744 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.174308 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.174709 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.183248 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.183763 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.188793 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9tp\" (UniqueName: \"kubernetes.io/projected/330b1db4-a22e-4fe0-a9c5-88ae9458db36-kube-api-access-7r9tp\") pod \"neutron-sriov-openstack-openstack-cell1-r2qr2\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.351562 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 17:59:35 crc kubenswrapper[4939]: W0318 17:59:35.924687 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod330b1db4_a22e_4fe0_a9c5_88ae9458db36.slice/crio-1da94c64df22d78c84cbe081c845994154205540b0e12c17971864bfe5fa1e85 WatchSource:0}: Error finding container 1da94c64df22d78c84cbe081c845994154205540b0e12c17971864bfe5fa1e85: Status 404 returned error can't find the container with id 1da94c64df22d78c84cbe081c845994154205540b0e12c17971864bfe5fa1e85 Mar 18 17:59:35 crc kubenswrapper[4939]: I0318 17:59:35.929467 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-r2qr2"] Mar 18 17:59:36 crc kubenswrapper[4939]: I0318 17:59:36.934037 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" event={"ID":"330b1db4-a22e-4fe0-a9c5-88ae9458db36","Type":"ContainerStarted","Data":"922cee0b62f65e81bc67060f9968242f770e24363d8bb1cb4bda6261e99fa6d6"} Mar 18 17:59:36 crc kubenswrapper[4939]: I0318 17:59:36.934559 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" event={"ID":"330b1db4-a22e-4fe0-a9c5-88ae9458db36","Type":"ContainerStarted","Data":"1da94c64df22d78c84cbe081c845994154205540b0e12c17971864bfe5fa1e85"} Mar 18 17:59:36 crc kubenswrapper[4939]: I0318 17:59:36.967916 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" podStartSLOduration=2.760165281 podStartE2EDuration="2.967887642s" podCreationTimestamp="2026-03-18 17:59:34 +0000 UTC" firstStartedPulling="2026-03-18 17:59:35.928001838 +0000 UTC m=+8540.527189499" lastFinishedPulling="2026-03-18 17:59:36.135724249 +0000 UTC m=+8540.734911860" observedRunningTime="2026-03-18 17:59:36.966357718 +0000 UTC m=+8541.565545349" watchObservedRunningTime="2026-03-18 17:59:36.967887642 +0000 UTC m=+8541.567075303" Mar 18 17:59:53 crc kubenswrapper[4939]: I0318 17:59:53.688191 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:59:53 crc kubenswrapper[4939]: I0318 17:59:53.688850 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.149883 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564280-hq8lt"] Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.152452 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564280-hq8lt" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.152572 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564280-hq8lt"] Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.154661 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.154812 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.157647 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.241745 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m"] Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.243099 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27djn\" (UniqueName: \"kubernetes.io/projected/e5a987df-f080-4aac-bb9a-bcd146d2e35c-kube-api-access-27djn\") pod \"auto-csr-approver-29564280-hq8lt\" (UID: \"e5a987df-f080-4aac-bb9a-bcd146d2e35c\") " pod="openshift-infra/auto-csr-approver-29564280-hq8lt" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.243535 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.245523 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.245577 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.252906 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m"] Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.344587 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5nt\" (UniqueName: \"kubernetes.io/projected/b40560ae-a4c1-423b-b613-546869606d90-kube-api-access-sj5nt\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.345044 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27djn\" (UniqueName: \"kubernetes.io/projected/e5a987df-f080-4aac-bb9a-bcd146d2e35c-kube-api-access-27djn\") pod \"auto-csr-approver-29564280-hq8lt\" (UID: \"e5a987df-f080-4aac-bb9a-bcd146d2e35c\") " pod="openshift-infra/auto-csr-approver-29564280-hq8lt" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.345088 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b40560ae-a4c1-423b-b613-546869606d90-config-volume\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc 
kubenswrapper[4939]: I0318 18:00:00.345169 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b40560ae-a4c1-423b-b613-546869606d90-secret-volume\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.365183 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27djn\" (UniqueName: \"kubernetes.io/projected/e5a987df-f080-4aac-bb9a-bcd146d2e35c-kube-api-access-27djn\") pod \"auto-csr-approver-29564280-hq8lt\" (UID: \"e5a987df-f080-4aac-bb9a-bcd146d2e35c\") " pod="openshift-infra/auto-csr-approver-29564280-hq8lt" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.447764 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b40560ae-a4c1-423b-b613-546869606d90-config-volume\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.447846 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b40560ae-a4c1-423b-b613-546869606d90-secret-volume\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.447909 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5nt\" (UniqueName: \"kubernetes.io/projected/b40560ae-a4c1-423b-b613-546869606d90-kube-api-access-sj5nt\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.448681 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b40560ae-a4c1-423b-b613-546869606d90-config-volume\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.452002 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b40560ae-a4c1-423b-b613-546869606d90-secret-volume\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.468208 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5nt\" (UniqueName: \"kubernetes.io/projected/b40560ae-a4c1-423b-b613-546869606d90-kube-api-access-sj5nt\") pod \"collect-profiles-29564280-l7z5m\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.488041 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564280-hq8lt" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.560365 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:00 crc kubenswrapper[4939]: I0318 18:00:00.957179 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564280-hq8lt"] Mar 18 18:00:01 crc kubenswrapper[4939]: W0318 18:00:01.106210 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb40560ae_a4c1_423b_b613_546869606d90.slice/crio-016c578273f232975a2a6934bea455ee664a4ff0b0644c8d18ffd08c0ba4c01d WatchSource:0}: Error finding container 016c578273f232975a2a6934bea455ee664a4ff0b0644c8d18ffd08c0ba4c01d: Status 404 returned error can't find the container with id 016c578273f232975a2a6934bea455ee664a4ff0b0644c8d18ffd08c0ba4c01d Mar 18 18:00:01 crc kubenswrapper[4939]: I0318 18:00:01.109212 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m"] Mar 18 18:00:01 crc kubenswrapper[4939]: I0318 18:00:01.286285 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564280-hq8lt" event={"ID":"e5a987df-f080-4aac-bb9a-bcd146d2e35c","Type":"ContainerStarted","Data":"557056d5bd9673e60a0dc517402eeb1df5ff32bce038ac9de381d281f6218104"} Mar 18 18:00:01 crc kubenswrapper[4939]: I0318 18:00:01.287992 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" event={"ID":"b40560ae-a4c1-423b-b613-546869606d90","Type":"ContainerStarted","Data":"016c578273f232975a2a6934bea455ee664a4ff0b0644c8d18ffd08c0ba4c01d"} Mar 18 18:00:02 crc kubenswrapper[4939]: I0318 18:00:02.308568 4939 generic.go:334] "Generic (PLEG): container finished" podID="b40560ae-a4c1-423b-b613-546869606d90" containerID="32e21f2fe268cda60b5c351b8dda23501951e5f2f45e30aae4686558db410c75" exitCode=0 Mar 18 18:00:02 crc kubenswrapper[4939]: I0318 18:00:02.308636 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" event={"ID":"b40560ae-a4c1-423b-b613-546869606d90","Type":"ContainerDied","Data":"32e21f2fe268cda60b5c351b8dda23501951e5f2f45e30aae4686558db410c75"} Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.756442 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.856786 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj5nt\" (UniqueName: \"kubernetes.io/projected/b40560ae-a4c1-423b-b613-546869606d90-kube-api-access-sj5nt\") pod \"b40560ae-a4c1-423b-b613-546869606d90\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.856853 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b40560ae-a4c1-423b-b613-546869606d90-secret-volume\") pod \"b40560ae-a4c1-423b-b613-546869606d90\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.856960 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b40560ae-a4c1-423b-b613-546869606d90-config-volume\") pod \"b40560ae-a4c1-423b-b613-546869606d90\" (UID: \"b40560ae-a4c1-423b-b613-546869606d90\") " Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.858364 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40560ae-a4c1-423b-b613-546869606d90-config-volume" (OuterVolumeSpecName: "config-volume") pod "b40560ae-a4c1-423b-b613-546869606d90" (UID: "b40560ae-a4c1-423b-b613-546869606d90"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.863636 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40560ae-a4c1-423b-b613-546869606d90-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b40560ae-a4c1-423b-b613-546869606d90" (UID: "b40560ae-a4c1-423b-b613-546869606d90"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.865577 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40560ae-a4c1-423b-b613-546869606d90-kube-api-access-sj5nt" (OuterVolumeSpecName: "kube-api-access-sj5nt") pod "b40560ae-a4c1-423b-b613-546869606d90" (UID: "b40560ae-a4c1-423b-b613-546869606d90"). InnerVolumeSpecName "kube-api-access-sj5nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.959488 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj5nt\" (UniqueName: \"kubernetes.io/projected/b40560ae-a4c1-423b-b613-546869606d90-kube-api-access-sj5nt\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.959840 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b40560ae-a4c1-423b-b613-546869606d90-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:03 crc kubenswrapper[4939]: I0318 18:00:03.959860 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b40560ae-a4c1-423b-b613-546869606d90-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:04 crc kubenswrapper[4939]: I0318 18:00:04.342383 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" event={"ID":"b40560ae-a4c1-423b-b613-546869606d90","Type":"ContainerDied","Data":"016c578273f232975a2a6934bea455ee664a4ff0b0644c8d18ffd08c0ba4c01d"} Mar 18 18:00:04 crc kubenswrapper[4939]: I0318 18:00:04.342439 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="016c578273f232975a2a6934bea455ee664a4ff0b0644c8d18ffd08c0ba4c01d" Mar 18 18:00:04 crc kubenswrapper[4939]: I0318 18:00:04.342565 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-l7z5m" Mar 18 18:00:04 crc kubenswrapper[4939]: I0318 18:00:04.846400 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2"] Mar 18 18:00:04 crc kubenswrapper[4939]: I0318 18:00:04.858749 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-qckv2"] Mar 18 18:00:05 crc kubenswrapper[4939]: I0318 18:00:05.383007 4939 generic.go:334] "Generic (PLEG): container finished" podID="e5a987df-f080-4aac-bb9a-bcd146d2e35c" containerID="aa0fcdb56a6349370a501ba2a4d8a900eab24d28560efccf95b5399edd0a7334" exitCode=0 Mar 18 18:00:05 crc kubenswrapper[4939]: I0318 18:00:05.383082 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564280-hq8lt" event={"ID":"e5a987df-f080-4aac-bb9a-bcd146d2e35c","Type":"ContainerDied","Data":"aa0fcdb56a6349370a501ba2a4d8a900eab24d28560efccf95b5399edd0a7334"} Mar 18 18:00:06 crc kubenswrapper[4939]: I0318 18:00:06.155420 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d087fd-bcf9-4bac-96c2-5f6d77200b3d" path="/var/lib/kubelet/pods/42d087fd-bcf9-4bac-96c2-5f6d77200b3d/volumes" Mar 18 18:00:06 crc kubenswrapper[4939]: I0318 18:00:06.830361 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564280-hq8lt" Mar 18 18:00:06 crc kubenswrapper[4939]: I0318 18:00:06.933751 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27djn\" (UniqueName: \"kubernetes.io/projected/e5a987df-f080-4aac-bb9a-bcd146d2e35c-kube-api-access-27djn\") pod \"e5a987df-f080-4aac-bb9a-bcd146d2e35c\" (UID: \"e5a987df-f080-4aac-bb9a-bcd146d2e35c\") " Mar 18 18:00:06 crc kubenswrapper[4939]: I0318 18:00:06.939478 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a987df-f080-4aac-bb9a-bcd146d2e35c-kube-api-access-27djn" (OuterVolumeSpecName: "kube-api-access-27djn") pod "e5a987df-f080-4aac-bb9a-bcd146d2e35c" (UID: "e5a987df-f080-4aac-bb9a-bcd146d2e35c"). InnerVolumeSpecName "kube-api-access-27djn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:00:07 crc kubenswrapper[4939]: I0318 18:00:07.037765 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27djn\" (UniqueName: \"kubernetes.io/projected/e5a987df-f080-4aac-bb9a-bcd146d2e35c-kube-api-access-27djn\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:07 crc kubenswrapper[4939]: I0318 18:00:07.417135 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564280-hq8lt" event={"ID":"e5a987df-f080-4aac-bb9a-bcd146d2e35c","Type":"ContainerDied","Data":"557056d5bd9673e60a0dc517402eeb1df5ff32bce038ac9de381d281f6218104"} Mar 18 18:00:07 crc kubenswrapper[4939]: I0318 18:00:07.417196 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="557056d5bd9673e60a0dc517402eeb1df5ff32bce038ac9de381d281f6218104" Mar 18 18:00:07 crc kubenswrapper[4939]: I0318 18:00:07.417227 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564280-hq8lt" Mar 18 18:00:07 crc kubenswrapper[4939]: I0318 18:00:07.899292 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564274-mtz6c"] Mar 18 18:00:07 crc kubenswrapper[4939]: I0318 18:00:07.915406 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564274-mtz6c"] Mar 18 18:00:08 crc kubenswrapper[4939]: I0318 18:00:08.156396 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ccbf094-692b-4efc-89b1-8091f16c6521" path="/var/lib/kubelet/pods/2ccbf094-692b-4efc-89b1-8091f16c6521/volumes" Mar 18 18:00:11 crc kubenswrapper[4939]: I0318 18:00:11.744296 4939 scope.go:117] "RemoveContainer" containerID="6f8c7830d82224d3bc838ad8bd9f3755739b29f39937552280d6b48ae71adccb" Mar 18 18:00:12 crc kubenswrapper[4939]: I0318 18:00:12.423862 4939 scope.go:117] "RemoveContainer" containerID="99c26977dde7e049562192a6d91a202ea8c0a9efa4cd6a9e30c7393fd6c8e997" Mar 18 18:00:23 crc kubenswrapper[4939]: I0318 18:00:23.687778 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:00:23 crc kubenswrapper[4939]: I0318 18:00:23.688286 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:00:23 crc kubenswrapper[4939]: I0318 18:00:23.688322 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 18:00:23 crc kubenswrapper[4939]: I0318 18:00:23.689067 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c559432840b726160db88722665a34bfba0202eb63be885cb2468d050d6fcf6a"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:00:23 crc kubenswrapper[4939]: I0318 18:00:23.689112 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://c559432840b726160db88722665a34bfba0202eb63be885cb2468d050d6fcf6a" gracePeriod=600 Mar 18 18:00:24 crc kubenswrapper[4939]: I0318 18:00:24.632757 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="c559432840b726160db88722665a34bfba0202eb63be885cb2468d050d6fcf6a" exitCode=0 Mar 18 18:00:24 crc kubenswrapper[4939]: I0318 18:00:24.632838 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"c559432840b726160db88722665a34bfba0202eb63be885cb2468d050d6fcf6a"} Mar 18 18:00:24 crc kubenswrapper[4939]: I0318 18:00:24.633371 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"} Mar 18 18:00:24 crc kubenswrapper[4939]: I0318 18:00:24.633396 4939 scope.go:117] "RemoveContainer" containerID="67e9c0e2d77e2ff62ef2b7ff0b1c51d7991da721b9dadb2640973c331291844b" Mar 18 18:00:28 crc kubenswrapper[4939]: I0318 18:00:28.965333 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bpd7g"] Mar 18 18:00:28 crc kubenswrapper[4939]: E0318 18:00:28.966402 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40560ae-a4c1-423b-b613-546869606d90" containerName="collect-profiles" Mar 18 18:00:28 crc kubenswrapper[4939]: I0318 18:00:28.966418 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40560ae-a4c1-423b-b613-546869606d90" containerName="collect-profiles" Mar 18 18:00:28 crc kubenswrapper[4939]: E0318 18:00:28.966440 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a987df-f080-4aac-bb9a-bcd146d2e35c" containerName="oc" Mar 18 18:00:28 crc kubenswrapper[4939]: I0318 18:00:28.966448 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a987df-f080-4aac-bb9a-bcd146d2e35c" containerName="oc" Mar 18 18:00:28 crc kubenswrapper[4939]: I0318 18:00:28.966731 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40560ae-a4c1-423b-b613-546869606d90" containerName="collect-profiles" Mar 18 18:00:28 crc kubenswrapper[4939]: I0318 18:00:28.966755 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a987df-f080-4aac-bb9a-bcd146d2e35c" containerName="oc" Mar 18 18:00:28 crc kubenswrapper[4939]: I0318 18:00:28.968754 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:28 crc kubenswrapper[4939]: I0318 18:00:28.978885 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpd7g"] Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.089734 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-utilities\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.090061 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpb6\" (UniqueName: \"kubernetes.io/projected/9832e6ba-2198-47b1-8e58-999181c49392-kube-api-access-swpb6\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.090305 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-catalog-content\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.192300 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swpb6\" (UniqueName: \"kubernetes.io/projected/9832e6ba-2198-47b1-8e58-999181c49392-kube-api-access-swpb6\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.192368 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-catalog-content\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.192458 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-utilities\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.192935 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-utilities\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.193066 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-catalog-content\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.214821 4939 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-swpb6\" (UniqueName: \"kubernetes.io/projected/9832e6ba-2198-47b1-8e58-999181c49392-kube-api-access-swpb6\") pod \"community-operators-bpd7g\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.338179 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:29 crc kubenswrapper[4939]: I0318 18:00:29.902852 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpd7g"] Mar 18 18:00:30 crc kubenswrapper[4939]: E0318 18:00:30.280219 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9832e6ba_2198_47b1_8e58_999181c49392.slice/crio-conmon-9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570.scope\": RecentStats: unable to find data in memory cache]" Mar 18 18:00:30 crc kubenswrapper[4939]: I0318 18:00:30.718218 4939 generic.go:334] "Generic (PLEG): container finished" podID="9832e6ba-2198-47b1-8e58-999181c49392" containerID="9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570" exitCode=0 Mar 18 18:00:30 crc kubenswrapper[4939]: I0318 18:00:30.718266 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpd7g" event={"ID":"9832e6ba-2198-47b1-8e58-999181c49392","Type":"ContainerDied","Data":"9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570"} Mar 18 18:00:30 crc kubenswrapper[4939]: I0318 18:00:30.718299 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpd7g" event={"ID":"9832e6ba-2198-47b1-8e58-999181c49392","Type":"ContainerStarted","Data":"2bc00f55886fc5e1ad99f18bdbffc503aa0dc411b90011e4799082685c2849ff"} Mar 18 18:00:31 crc kubenswrapper[4939]: I0318 18:00:31.731694 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpd7g" event={"ID":"9832e6ba-2198-47b1-8e58-999181c49392","Type":"ContainerStarted","Data":"c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d"} Mar 18 18:00:33 crc kubenswrapper[4939]: I0318 18:00:33.761940 4939 generic.go:334] "Generic (PLEG): container finished" podID="9832e6ba-2198-47b1-8e58-999181c49392" containerID="c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d" exitCode=0 Mar 18 18:00:33 crc kubenswrapper[4939]: I0318 18:00:33.762052 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpd7g" event={"ID":"9832e6ba-2198-47b1-8e58-999181c49392","Type":"ContainerDied","Data":"c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d"} Mar 18 18:00:34 crc kubenswrapper[4939]: I0318 18:00:34.775765 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpd7g" event={"ID":"9832e6ba-2198-47b1-8e58-999181c49392","Type":"ContainerStarted","Data":"80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e"} Mar 18 18:00:34 crc kubenswrapper[4939]: I0318 18:00:34.804090 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bpd7g" podStartSLOduration=3.245539627 podStartE2EDuration="6.804063489s" podCreationTimestamp="2026-03-18 18:00:28 +0000 UTC" 
firstStartedPulling="2026-03-18 18:00:30.72127557 +0000 UTC m=+8595.320463221" lastFinishedPulling="2026-03-18 18:00:34.279799462 +0000 UTC m=+8598.878987083" observedRunningTime="2026-03-18 18:00:34.797199143 +0000 UTC m=+8599.396386764" watchObservedRunningTime="2026-03-18 18:00:34.804063489 +0000 UTC m=+8599.403251110" Mar 18 18:00:39 crc kubenswrapper[4939]: I0318 18:00:39.339300 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:39 crc kubenswrapper[4939]: I0318 18:00:39.340025 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:39 crc kubenswrapper[4939]: I0318 18:00:39.398495 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:39 crc kubenswrapper[4939]: I0318 18:00:39.930701 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:39 crc kubenswrapper[4939]: I0318 18:00:39.976340 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpd7g"] Mar 18 18:00:41 crc kubenswrapper[4939]: I0318 18:00:41.881692 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bpd7g" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="registry-server" containerID="cri-o://80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e" gracePeriod=2 Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.462365 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.536980 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-utilities\") pod \"9832e6ba-2198-47b1-8e58-999181c49392\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.537073 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swpb6\" (UniqueName: \"kubernetes.io/projected/9832e6ba-2198-47b1-8e58-999181c49392-kube-api-access-swpb6\") pod \"9832e6ba-2198-47b1-8e58-999181c49392\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.537348 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-catalog-content\") pod \"9832e6ba-2198-47b1-8e58-999181c49392\" (UID: \"9832e6ba-2198-47b1-8e58-999181c49392\") " Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.538762 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-utilities" (OuterVolumeSpecName: "utilities") pod "9832e6ba-2198-47b1-8e58-999181c49392" (UID: "9832e6ba-2198-47b1-8e58-999181c49392"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.547099 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9832e6ba-2198-47b1-8e58-999181c49392-kube-api-access-swpb6" (OuterVolumeSpecName: "kube-api-access-swpb6") pod "9832e6ba-2198-47b1-8e58-999181c49392" (UID: "9832e6ba-2198-47b1-8e58-999181c49392"). InnerVolumeSpecName "kube-api-access-swpb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.613540 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9832e6ba-2198-47b1-8e58-999181c49392" (UID: "9832e6ba-2198-47b1-8e58-999181c49392"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.641055 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.641118 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swpb6\" (UniqueName: \"kubernetes.io/projected/9832e6ba-2198-47b1-8e58-999181c49392-kube-api-access-swpb6\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.641135 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9832e6ba-2198-47b1-8e58-999181c49392-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.896057 4939 generic.go:334] "Generic (PLEG): container finished" podID="9832e6ba-2198-47b1-8e58-999181c49392" containerID="80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e" exitCode=0 Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.896122 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bpd7g" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.896157 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpd7g" event={"ID":"9832e6ba-2198-47b1-8e58-999181c49392","Type":"ContainerDied","Data":"80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e"} Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.896553 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpd7g" event={"ID":"9832e6ba-2198-47b1-8e58-999181c49392","Type":"ContainerDied","Data":"2bc00f55886fc5e1ad99f18bdbffc503aa0dc411b90011e4799082685c2849ff"} Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.896575 4939 scope.go:117] "RemoveContainer" containerID="80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.933710 4939 scope.go:117] "RemoveContainer" containerID="c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d" Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.939816 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpd7g"] Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.951340 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bpd7g"] Mar 18 18:00:42 crc kubenswrapper[4939]: I0318 18:00:42.972878 4939 scope.go:117] "RemoveContainer" containerID="9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570" Mar 18 18:00:43 crc kubenswrapper[4939]: I0318 18:00:43.029067 4939 scope.go:117] "RemoveContainer" containerID="80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e" Mar 18 18:00:43 crc kubenswrapper[4939]: E0318 18:00:43.029727 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e\": container with ID starting with 80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e not found: ID does not exist" containerID="80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e" Mar 18 18:00:43 crc kubenswrapper[4939]: I0318 18:00:43.029754 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e"} err="failed to get container status \"80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e\": rpc error: code = NotFound desc = could not find container \"80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e\": container with ID starting with 80a223cc5e0e8cbef9c20ec72828b628d22f42aff8e22f8d82bdf739963b359e not found: ID does not exist" Mar 18 18:00:43 crc kubenswrapper[4939]: I0318 18:00:43.029775 4939 scope.go:117] "RemoveContainer" containerID="c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d" Mar 18 18:00:43 crc kubenswrapper[4939]: E0318 18:00:43.030214 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d\": container with ID starting with c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d not found: ID does not exist" containerID="c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d" Mar 18 18:00:43 crc kubenswrapper[4939]: I0318 18:00:43.030235 4939 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d"} err="failed to get container status \"c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d\": rpc error: code = NotFound desc = could not find container \"c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d\": container with ID starting with c673e7b53af00d23dc3e2690c881877109181f20cb8797c8ce1203f3e2ccca6d not found: ID does not exist" Mar 18 18:00:43 crc kubenswrapper[4939]: I0318 18:00:43.030247 4939 scope.go:117] "RemoveContainer" containerID="9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570" Mar 18 18:00:43 crc kubenswrapper[4939]: E0318 18:00:43.030755 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570\": container with ID starting with 9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570 not found: ID does not exist" containerID="9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570" Mar 18 18:00:43 crc kubenswrapper[4939]: I0318 18:00:43.030956 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570"} err="failed to get container status \"9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570\": rpc error: code = NotFound desc = could not find container \"9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570\": container with ID starting with 9fd734f04b127a8123e2abaa6142b3a1668f32da4886a53c14dec7762e19b570 not found: ID does not exist" Mar 18 18:00:44 crc kubenswrapper[4939]: I0318 18:00:44.155132 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9832e6ba-2198-47b1-8e58-999181c49392" path="/var/lib/kubelet/pods/9832e6ba-2198-47b1-8e58-999181c49392/volumes" Mar 18 18:00:49 crc kubenswrapper[4939]: I0318 18:00:49.994209 4939 generic.go:334] "Generic (PLEG): container finished" podID="330b1db4-a22e-4fe0-a9c5-88ae9458db36" containerID="922cee0b62f65e81bc67060f9968242f770e24363d8bb1cb4bda6261e99fa6d6" exitCode=0 Mar 18 18:00:49 crc kubenswrapper[4939]: I0318 18:00:49.994303 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" event={"ID":"330b1db4-a22e-4fe0-a9c5-88ae9458db36","Type":"ContainerDied","Data":"922cee0b62f65e81bc67060f9968242f770e24363d8bb1cb4bda6261e99fa6d6"} Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.579793 4939 util.go:48] "No ready sandbox for pod can be found. 
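[annotation] The RemoveContainer / NotFound exchanges above are a benign race: the containers were already deleted, so the follow-up status lookups fail with code = NotFound and the kubelet logs the error and moves on. A sketch of the idempotent-delete pattern this implies; NotFoundError and runtime below are stand-ins for illustration, not the kubelet's actual CRI client:

class NotFoundError(Exception):
    """Stand-in for the gRPC NotFound status seen in the entries above."""

def remove_container(runtime, container_id):
    """Idempotent cleanup: NotFound on a repeat delete means the container
    is already gone, so it is logged and ignored rather than retried."""
    try:
        runtime.remove(container_id)
    except NotFoundError:
        print(f"container {container_id[:13]} already removed; nothing to do")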
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.588231 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-agent-neutron-config-0\") pod \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.589090 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ssh-key-openstack-cell1\") pod \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.589157 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r9tp\" (UniqueName: \"kubernetes.io/projected/330b1db4-a22e-4fe0-a9c5-88ae9458db36-kube-api-access-7r9tp\") pod \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.589333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-inventory\") pod \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.589411 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ceph\") pod \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.589593 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-combined-ca-bundle\") pod \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\" (UID: \"330b1db4-a22e-4fe0-a9c5-88ae9458db36\") " Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.600895 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330b1db4-a22e-4fe0-a9c5-88ae9458db36-kube-api-access-7r9tp" (OuterVolumeSpecName: "kube-api-access-7r9tp") pod "330b1db4-a22e-4fe0-a9c5-88ae9458db36" (UID: "330b1db4-a22e-4fe0-a9c5-88ae9458db36"). InnerVolumeSpecName "kube-api-access-7r9tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.601232 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ceph" (OuterVolumeSpecName: "ceph") pod "330b1db4-a22e-4fe0-a9c5-88ae9458db36" (UID: "330b1db4-a22e-4fe0-a9c5-88ae9458db36"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.604830 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "330b1db4-a22e-4fe0-a9c5-88ae9458db36" (UID: "330b1db4-a22e-4fe0-a9c5-88ae9458db36"). 
InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.636554 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-inventory" (OuterVolumeSpecName: "inventory") pod "330b1db4-a22e-4fe0-a9c5-88ae9458db36" (UID: "330b1db4-a22e-4fe0-a9c5-88ae9458db36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.647341 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "330b1db4-a22e-4fe0-a9c5-88ae9458db36" (UID: "330b1db4-a22e-4fe0-a9c5-88ae9458db36"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.649683 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "330b1db4-a22e-4fe0-a9c5-88ae9458db36" (UID: "330b1db4-a22e-4fe0-a9c5-88ae9458db36"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.692069 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.692101 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.692112 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.692138 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.692148 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/330b1db4-a22e-4fe0-a9c5-88ae9458db36-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:51 crc kubenswrapper[4939]: I0318 18:00:51.692157 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r9tp\" (UniqueName: \"kubernetes.io/projected/330b1db4-a22e-4fe0-a9c5-88ae9458db36-kube-api-access-7r9tp\") on node \"crc\" DevicePath \"\"" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.024095 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" event={"ID":"330b1db4-a22e-4fe0-a9c5-88ae9458db36","Type":"ContainerDied","Data":"1da94c64df22d78c84cbe081c845994154205540b0e12c17971864bfe5fa1e85"} Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 
18:00:52.024161 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1da94c64df22d78c84cbe081c845994154205540b0e12c17971864bfe5fa1e85" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.024190 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-r2qr2" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.174437 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4"] Mar 18 18:00:52 crc kubenswrapper[4939]: E0318 18:00:52.175406 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="registry-server" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.175441 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="registry-server" Mar 18 18:00:52 crc kubenswrapper[4939]: E0318 18:00:52.175481 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="extract-utilities" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.175496 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="extract-utilities" Mar 18 18:00:52 crc kubenswrapper[4939]: E0318 18:00:52.175580 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330b1db4-a22e-4fe0-a9c5-88ae9458db36" containerName="neutron-sriov-openstack-openstack-cell1" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.175594 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="330b1db4-a22e-4fe0-a9c5-88ae9458db36" containerName="neutron-sriov-openstack-openstack-cell1" Mar 18 18:00:52 crc kubenswrapper[4939]: E0318 18:00:52.175640 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="extract-content" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.175687 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="extract-content" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.176085 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9832e6ba-2198-47b1-8e58-999181c49392" containerName="registry-server" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.176154 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="330b1db4-a22e-4fe0-a9c5-88ae9458db36" containerName="neutron-sriov-openstack-openstack-cell1" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.177433 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.187826 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.188799 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.188938 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.189008 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.189071 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.194373 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4"] Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.205522 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.205569 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpg9s\" (UniqueName: \"kubernetes.io/projected/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-kube-api-access-hpg9s\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.205613 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.205659 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.205737 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.205921 4939 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.308421 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.308473 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpg9s\" (UniqueName: \"kubernetes.io/projected/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-kube-api-access-hpg9s\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.308522 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.308549 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.308592 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.309442 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.313729 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.314205 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.316305 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.317236 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.321118 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.325005 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpg9s\" (UniqueName: \"kubernetes.io/projected/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-kube-api-access-hpg9s\") pod \"neutron-dhcp-openstack-openstack-cell1-2lrx4\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:52 crc kubenswrapper[4939]: I0318 18:00:52.514560 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" Mar 18 18:00:53 crc kubenswrapper[4939]: I0318 18:00:53.139955 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:00:53 crc kubenswrapper[4939]: I0318 18:00:53.153917 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4"] Mar 18 18:00:54 crc kubenswrapper[4939]: I0318 18:00:54.046109 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" event={"ID":"cbafcc37-e8cb-4cb5-96aa-00a063f4c003","Type":"ContainerStarted","Data":"523af0ab368525b53b71f18d681cabafdc7a777c7616789debdf1b200ea743e6"} Mar 18 18:00:54 crc kubenswrapper[4939]: I0318 18:00:54.046694 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" event={"ID":"cbafcc37-e8cb-4cb5-96aa-00a063f4c003","Type":"ContainerStarted","Data":"02bc126b998f2551e7be1020d155021ced83ae2c772dc39d9ab62f5b6672e857"} Mar 18 18:00:54 crc kubenswrapper[4939]: I0318 18:00:54.074547 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" podStartSLOduration=1.91584384 podStartE2EDuration="2.074492481s" podCreationTimestamp="2026-03-18 18:00:52 +0000 UTC" firstStartedPulling="2026-03-18 18:00:53.139578818 +0000 UTC m=+8617.738766439" lastFinishedPulling="2026-03-18 18:00:53.298227459 +0000 UTC m=+8617.897415080" observedRunningTime="2026-03-18 18:00:54.063484468 +0000 UTC m=+8618.662672129" watchObservedRunningTime="2026-03-18 18:00:54.074492481 +0000 UTC m=+8618.673680142" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.159862 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564281-dvxjc"] Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.162692 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.202773 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564281-dvxjc"] Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.286658 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k2v4\" (UniqueName: \"kubernetes.io/projected/b3681d41-eec6-4744-83d9-25b409e189d6-kube-api-access-7k2v4\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.286910 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-config-data\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.287182 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-combined-ca-bundle\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.287250 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-fernet-keys\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.396310 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-config-data\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.396424 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-combined-ca-bundle\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.396456 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-fernet-keys\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.396619 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k2v4\" (UniqueName: \"kubernetes.io/projected/b3681d41-eec6-4744-83d9-25b409e189d6-kube-api-access-7k2v4\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.406689 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-config-data\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.408985 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-combined-ca-bundle\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.413774 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-fernet-keys\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.415898 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k2v4\" (UniqueName: \"kubernetes.io/projected/b3681d41-eec6-4744-83d9-25b409e189d6-kube-api-access-7k2v4\") pod \"keystone-cron-29564281-dvxjc\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.504586 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:00 crc kubenswrapper[4939]: I0318 18:01:00.989174 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564281-dvxjc"] Mar 18 18:01:01 crc kubenswrapper[4939]: I0318 18:01:01.125122 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564281-dvxjc" event={"ID":"b3681d41-eec6-4744-83d9-25b409e189d6","Type":"ContainerStarted","Data":"01cbabe58bd2539e05949fe8da6e25c7c1609853e42a64c5ab09b4eeb09840b6"} Mar 18 18:01:02 crc kubenswrapper[4939]: I0318 18:01:02.161125 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564281-dvxjc" event={"ID":"b3681d41-eec6-4744-83d9-25b409e189d6","Type":"ContainerStarted","Data":"84d7acbfa744b33f56ee4517d23a8c11067221c816acd2f1f05b53c47dd67cef"} Mar 18 18:01:02 crc kubenswrapper[4939]: I0318 18:01:02.177557 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564281-dvxjc" podStartSLOduration=2.177539083 podStartE2EDuration="2.177539083s" podCreationTimestamp="2026-03-18 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:01:02.172300134 +0000 UTC m=+8626.771487785" watchObservedRunningTime="2026-03-18 18:01:02.177539083 +0000 UTC m=+8626.776726704" Mar 18 18:01:05 crc kubenswrapper[4939]: I0318 18:01:05.240589 4939 generic.go:334] "Generic (PLEG): container finished" podID="b3681d41-eec6-4744-83d9-25b409e189d6" containerID="84d7acbfa744b33f56ee4517d23a8c11067221c816acd2f1f05b53c47dd67cef" exitCode=0 Mar 18 18:01:05 crc kubenswrapper[4939]: I0318 18:01:05.240692 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564281-dvxjc" event={"ID":"b3681d41-eec6-4744-83d9-25b409e189d6","Type":"ContainerDied","Data":"84d7acbfa744b33f56ee4517d23a8c11067221c816acd2f1f05b53c47dd67cef"} Mar 18 18:01:06 crc kubenswrapper[4939]: 
I0318 18:01:06.808953 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:06 crc kubenswrapper[4939]: I0318 18:01:06.981300 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k2v4\" (UniqueName: \"kubernetes.io/projected/b3681d41-eec6-4744-83d9-25b409e189d6-kube-api-access-7k2v4\") pod \"b3681d41-eec6-4744-83d9-25b409e189d6\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " Mar 18 18:01:06 crc kubenswrapper[4939]: I0318 18:01:06.981604 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-config-data\") pod \"b3681d41-eec6-4744-83d9-25b409e189d6\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " Mar 18 18:01:06 crc kubenswrapper[4939]: I0318 18:01:06.981627 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-combined-ca-bundle\") pod \"b3681d41-eec6-4744-83d9-25b409e189d6\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " Mar 18 18:01:06 crc kubenswrapper[4939]: I0318 18:01:06.981690 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-fernet-keys\") pod \"b3681d41-eec6-4744-83d9-25b409e189d6\" (UID: \"b3681d41-eec6-4744-83d9-25b409e189d6\") " Mar 18 18:01:06 crc kubenswrapper[4939]: I0318 18:01:06.988440 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b3681d41-eec6-4744-83d9-25b409e189d6" (UID: "b3681d41-eec6-4744-83d9-25b409e189d6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:01:06 crc kubenswrapper[4939]: I0318 18:01:06.993847 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3681d41-eec6-4744-83d9-25b409e189d6-kube-api-access-7k2v4" (OuterVolumeSpecName: "kube-api-access-7k2v4") pod "b3681d41-eec6-4744-83d9-25b409e189d6" (UID: "b3681d41-eec6-4744-83d9-25b409e189d6"). InnerVolumeSpecName "kube-api-access-7k2v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.029201 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3681d41-eec6-4744-83d9-25b409e189d6" (UID: "b3681d41-eec6-4744-83d9-25b409e189d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.045819 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-config-data" (OuterVolumeSpecName: "config-data") pod "b3681d41-eec6-4744-83d9-25b409e189d6" (UID: "b3681d41-eec6-4744-83d9-25b409e189d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.085554 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k2v4\" (UniqueName: \"kubernetes.io/projected/b3681d41-eec6-4744-83d9-25b409e189d6-kube-api-access-7k2v4\") on node \"crc\" DevicePath \"\"" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.085588 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.085602 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.085613 4939 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3681d41-eec6-4744-83d9-25b409e189d6-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.269892 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564281-dvxjc" event={"ID":"b3681d41-eec6-4744-83d9-25b409e189d6","Type":"ContainerDied","Data":"01cbabe58bd2539e05949fe8da6e25c7c1609853e42a64c5ab09b4eeb09840b6"} Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.269955 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01cbabe58bd2539e05949fe8da6e25c7c1609853e42a64c5ab09b4eeb09840b6" Mar 18 18:01:07 crc kubenswrapper[4939]: I0318 18:01:07.270195 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564281-dvxjc" Mar 18 18:01:27 crc kubenswrapper[4939]: I0318 18:01:27.836641 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5rh4r"] Mar 18 18:01:27 crc kubenswrapper[4939]: E0318 18:01:27.838747 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3681d41-eec6-4744-83d9-25b409e189d6" containerName="keystone-cron" Mar 18 18:01:27 crc kubenswrapper[4939]: I0318 18:01:27.838785 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3681d41-eec6-4744-83d9-25b409e189d6" containerName="keystone-cron" Mar 18 18:01:27 crc kubenswrapper[4939]: I0318 18:01:27.839400 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3681d41-eec6-4744-83d9-25b409e189d6" containerName="keystone-cron" Mar 18 18:01:27 crc kubenswrapper[4939]: I0318 18:01:27.843598 4939 util.go:30] "No sandbox for pod can be found. 
Mar 18 18:01:27 crc kubenswrapper[4939]: I0318 18:01:27.868882 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rh4r"]
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.003674 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7kh\" (UniqueName: \"kubernetes.io/projected/6205c650-5700-4699-884c-c1895ba4cd3d-kube-api-access-mp7kh\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.003781 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-catalog-content\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.003846 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-utilities\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.105400 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7kh\" (UniqueName: \"kubernetes.io/projected/6205c650-5700-4699-884c-c1895ba4cd3d-kube-api-access-mp7kh\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.105528 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-catalog-content\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.105598 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-utilities\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.106373 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-utilities\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.106362 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-catalog-content\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.139125 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7kh\" (UniqueName: \"kubernetes.io/projected/6205c650-5700-4699-884c-c1895ba4cd3d-kube-api-access-mp7kh\") pod \"redhat-marketplace-5rh4r\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") " pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.190584 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:28 crc kubenswrapper[4939]: I0318 18:01:28.697528 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rh4r"]
Mar 18 18:01:28 crc kubenswrapper[4939]: W0318 18:01:28.707965 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6205c650_5700_4699_884c_c1895ba4cd3d.slice/crio-a79f5c87c3bc8a5087612c0a25c9af7f1f0c608ffbe682b99a6401b184f2437d WatchSource:0}: Error finding container a79f5c87c3bc8a5087612c0a25c9af7f1f0c608ffbe682b99a6401b184f2437d: Status 404 returned error can't find the container with id a79f5c87c3bc8a5087612c0a25c9af7f1f0c608ffbe682b99a6401b184f2437d
Mar 18 18:01:29 crc kubenswrapper[4939]: I0318 18:01:29.544267 4939 generic.go:334] "Generic (PLEG): container finished" podID="6205c650-5700-4699-884c-c1895ba4cd3d" containerID="9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b" exitCode=0
Mar 18 18:01:29 crc kubenswrapper[4939]: I0318 18:01:29.544326 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rh4r" event={"ID":"6205c650-5700-4699-884c-c1895ba4cd3d","Type":"ContainerDied","Data":"9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b"}
Mar 18 18:01:29 crc kubenswrapper[4939]: I0318 18:01:29.544565 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rh4r" event={"ID":"6205c650-5700-4699-884c-c1895ba4cd3d","Type":"ContainerStarted","Data":"a79f5c87c3bc8a5087612c0a25c9af7f1f0c608ffbe682b99a6401b184f2437d"}
Mar 18 18:01:30 crc kubenswrapper[4939]: I0318 18:01:30.561971 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rh4r" event={"ID":"6205c650-5700-4699-884c-c1895ba4cd3d","Type":"ContainerStarted","Data":"a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93"}
Mar 18 18:01:32 crc kubenswrapper[4939]: E0318 18:01:32.074075 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6205c650_5700_4699_884c_c1895ba4cd3d.slice/crio-conmon-a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 18:01:32 crc kubenswrapper[4939]: I0318 18:01:32.585763 4939 generic.go:334] "Generic (PLEG): container finished" podID="6205c650-5700-4699-884c-c1895ba4cd3d" containerID="a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93" exitCode=0
Mar 18 18:01:32 crc kubenswrapper[4939]: I0318 18:01:32.585801 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rh4r" event={"ID":"6205c650-5700-4699-884c-c1895ba4cd3d","Type":"ContainerDied","Data":"a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93"}
Mar 18 18:01:33 crc kubenswrapper[4939]: I0318 18:01:33.596558 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rh4r" event={"ID":"6205c650-5700-4699-884c-c1895ba4cd3d","Type":"ContainerStarted","Data":"aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6"}
Mar 18 18:01:33 crc kubenswrapper[4939]: I0318 18:01:33.622455 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5rh4r" podStartSLOduration=3.169663574 podStartE2EDuration="6.62243464s" podCreationTimestamp="2026-03-18 18:01:27 +0000 UTC" firstStartedPulling="2026-03-18 18:01:29.546223788 +0000 UTC m=+8654.145411409" lastFinishedPulling="2026-03-18 18:01:32.998994844 +0000 UTC m=+8657.598182475" observedRunningTime="2026-03-18 18:01:33.613366443 +0000 UTC m=+8658.212554074" watchObservedRunningTime="2026-03-18 18:01:33.62243464 +0000 UTC m=+8658.221622261"
Mar 18 18:01:38 crc kubenswrapper[4939]: I0318 18:01:38.191221 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:38 crc kubenswrapper[4939]: I0318 18:01:38.192645 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:38 crc kubenswrapper[4939]: I0318 18:01:38.298675 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:38 crc kubenswrapper[4939]: I0318 18:01:38.729700 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:39 crc kubenswrapper[4939]: I0318 18:01:39.820924 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rh4r"]
Mar 18 18:01:41 crc kubenswrapper[4939]: I0318 18:01:41.685329 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5rh4r" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="registry-server" containerID="cri-o://aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6" gracePeriod=2
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.165805 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.211976 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-catalog-content\") pod \"6205c650-5700-4699-884c-c1895ba4cd3d\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") "
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.212256 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp7kh\" (UniqueName: \"kubernetes.io/projected/6205c650-5700-4699-884c-c1895ba4cd3d-kube-api-access-mp7kh\") pod \"6205c650-5700-4699-884c-c1895ba4cd3d\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") "
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.212411 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-utilities\") pod \"6205c650-5700-4699-884c-c1895ba4cd3d\" (UID: \"6205c650-5700-4699-884c-c1895ba4cd3d\") "
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.216203 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-utilities" (OuterVolumeSpecName: "utilities") pod "6205c650-5700-4699-884c-c1895ba4cd3d" (UID: "6205c650-5700-4699-884c-c1895ba4cd3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.223626 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6205c650-5700-4699-884c-c1895ba4cd3d-kube-api-access-mp7kh" (OuterVolumeSpecName: "kube-api-access-mp7kh") pod "6205c650-5700-4699-884c-c1895ba4cd3d" (UID: "6205c650-5700-4699-884c-c1895ba4cd3d"). InnerVolumeSpecName "kube-api-access-mp7kh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.249215 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6205c650-5700-4699-884c-c1895ba4cd3d" (UID: "6205c650-5700-4699-884c-c1895ba4cd3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.316702 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.316736 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6205c650-5700-4699-884c-c1895ba4cd3d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.316751 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp7kh\" (UniqueName: \"kubernetes.io/projected/6205c650-5700-4699-884c-c1895ba4cd3d-kube-api-access-mp7kh\") on node \"crc\" DevicePath \"\""
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.707643 4939 generic.go:334] "Generic (PLEG): container finished" podID="6205c650-5700-4699-884c-c1895ba4cd3d" containerID="aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6" exitCode=0
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.707692 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rh4r" event={"ID":"6205c650-5700-4699-884c-c1895ba4cd3d","Type":"ContainerDied","Data":"aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6"}
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.707721 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rh4r" event={"ID":"6205c650-5700-4699-884c-c1895ba4cd3d","Type":"ContainerDied","Data":"a79f5c87c3bc8a5087612c0a25c9af7f1f0c608ffbe682b99a6401b184f2437d"}
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.707742 4939 scope.go:117] "RemoveContainer" containerID="aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.707864 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rh4r"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.763969 4939 scope.go:117] "RemoveContainer" containerID="a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.781851 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rh4r"]
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.796290 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rh4r"]
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.813473 4939 scope.go:117] "RemoveContainer" containerID="9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.881675 4939 scope.go:117] "RemoveContainer" containerID="aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6"
Mar 18 18:01:42 crc kubenswrapper[4939]: E0318 18:01:42.882245 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6\": container with ID starting with aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6 not found: ID does not exist" containerID="aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.882284 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6"} err="failed to get container status \"aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6\": rpc error: code = NotFound desc = could not find container \"aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6\": container with ID starting with aa564847b720842cbee3b57ac8f199a5fdb0aff21fedfdcb57bd22a660c940a6 not found: ID does not exist"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.882310 4939 scope.go:117] "RemoveContainer" containerID="a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93"
Mar 18 18:01:42 crc kubenswrapper[4939]: E0318 18:01:42.882861 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93\": container with ID starting with a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93 not found: ID does not exist" containerID="a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.882918 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93"} err="failed to get container status \"a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93\": rpc error: code = NotFound desc = could not find container \"a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93\": container with ID starting with a53edcd7586cd0236ffe1f2a06c9f7552ac40cadeeefad9ccb29078c95237b93 not found: ID does not exist"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.882955 4939 scope.go:117] "RemoveContainer" containerID="9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b"
Mar 18 18:01:42 crc kubenswrapper[4939]: E0318 18:01:42.883326 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b\": container with ID starting with 9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b not found: ID does not exist" containerID="9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b"
Mar 18 18:01:42 crc kubenswrapper[4939]: I0318 18:01:42.883356 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b"} err="failed to get container status \"9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b\": rpc error: code = NotFound desc = could not find container \"9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b\": container with ID starting with 9744ec521562ab6c5d0e6d4868a655f053845b88862adb356849d1091482631b not found: ID does not exist"
Mar 18 18:01:44 crc kubenswrapper[4939]: I0318 18:01:44.164248 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" path="/var/lib/kubelet/pods/6205c650-5700-4699-884c-c1895ba4cd3d/volumes"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.174791 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564282-5tzll"]
Mar 18 18:02:00 crc kubenswrapper[4939]: E0318 18:02:00.176729 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="extract-utilities"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.176753 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="extract-utilities"
Mar 18 18:02:00 crc kubenswrapper[4939]: E0318 18:02:00.176802 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="registry-server"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.176813 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="registry-server"
Mar 18 18:02:00 crc kubenswrapper[4939]: E0318 18:02:00.176881 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="extract-content"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.176894 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="extract-content"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.177404 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6205c650-5700-4699-884c-c1895ba4cd3d" containerName="registry-server"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.178909 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564282-5tzll"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.183081 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.183117 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.183446 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.186185 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564282-5tzll"]
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.361630 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5jc8\" (UniqueName: \"kubernetes.io/projected/830ccc0c-82b4-43a8-b3f6-39c17cf92d0d-kube-api-access-x5jc8\") pod \"auto-csr-approver-29564282-5tzll\" (UID: \"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d\") " pod="openshift-infra/auto-csr-approver-29564282-5tzll"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.463458 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5jc8\" (UniqueName: \"kubernetes.io/projected/830ccc0c-82b4-43a8-b3f6-39c17cf92d0d-kube-api-access-x5jc8\") pod \"auto-csr-approver-29564282-5tzll\" (UID: \"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d\") " pod="openshift-infra/auto-csr-approver-29564282-5tzll"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.486460 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5jc8\" (UniqueName: \"kubernetes.io/projected/830ccc0c-82b4-43a8-b3f6-39c17cf92d0d-kube-api-access-x5jc8\") pod \"auto-csr-approver-29564282-5tzll\" (UID: \"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d\") " pod="openshift-infra/auto-csr-approver-29564282-5tzll"
Mar 18 18:02:00 crc kubenswrapper[4939]: I0318 18:02:00.520258 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564282-5tzll"
Mar 18 18:02:01 crc kubenswrapper[4939]: I0318 18:02:01.041084 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564282-5tzll"]
Mar 18 18:02:01 crc kubenswrapper[4939]: I0318 18:02:01.969310 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564282-5tzll" event={"ID":"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d","Type":"ContainerStarted","Data":"1093e5888e9da13ebf8e17b74a588d6135912168827edf194ed3b417a1ec92ce"}
Mar 18 18:02:02 crc kubenswrapper[4939]: I0318 18:02:02.995429 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564282-5tzll" event={"ID":"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d","Type":"ContainerStarted","Data":"3219156d1cb508b7a221950fb3a4ddcd6315bd157d0a575855b6542b7644cb46"}
Mar 18 18:02:03 crc kubenswrapper[4939]: I0318 18:02:03.018992 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564282-5tzll" podStartSLOduration=1.420113043 podStartE2EDuration="3.018954475s" podCreationTimestamp="2026-03-18 18:02:00 +0000 UTC" firstStartedPulling="2026-03-18 18:02:01.043136074 +0000 UTC m=+8685.642323705" lastFinishedPulling="2026-03-18 18:02:02.641977516 +0000 UTC m=+8687.241165137" observedRunningTime="2026-03-18 18:02:03.012208923 +0000 UTC m=+8687.611396554" watchObservedRunningTime="2026-03-18 18:02:03.018954475 +0000 UTC m=+8687.618142136"
Mar 18 18:02:04 crc kubenswrapper[4939]: I0318 18:02:04.014938 4939 generic.go:334] "Generic (PLEG): container finished" podID="830ccc0c-82b4-43a8-b3f6-39c17cf92d0d" containerID="3219156d1cb508b7a221950fb3a4ddcd6315bd157d0a575855b6542b7644cb46" exitCode=0
Mar 18 18:02:04 crc kubenswrapper[4939]: I0318 18:02:04.015066 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564282-5tzll" event={"ID":"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d","Type":"ContainerDied","Data":"3219156d1cb508b7a221950fb3a4ddcd6315bd157d0a575855b6542b7644cb46"}
Mar 18 18:02:05 crc kubenswrapper[4939]: I0318 18:02:05.505153 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564282-5tzll"
Mar 18 18:02:05 crc kubenswrapper[4939]: I0318 18:02:05.594043 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5jc8\" (UniqueName: \"kubernetes.io/projected/830ccc0c-82b4-43a8-b3f6-39c17cf92d0d-kube-api-access-x5jc8\") pod \"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d\" (UID: \"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d\") "
Mar 18 18:02:05 crc kubenswrapper[4939]: I0318 18:02:05.602184 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830ccc0c-82b4-43a8-b3f6-39c17cf92d0d-kube-api-access-x5jc8" (OuterVolumeSpecName: "kube-api-access-x5jc8") pod "830ccc0c-82b4-43a8-b3f6-39c17cf92d0d" (UID: "830ccc0c-82b4-43a8-b3f6-39c17cf92d0d"). InnerVolumeSpecName "kube-api-access-x5jc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:02:05 crc kubenswrapper[4939]: I0318 18:02:05.696096 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5jc8\" (UniqueName: \"kubernetes.io/projected/830ccc0c-82b4-43a8-b3f6-39c17cf92d0d-kube-api-access-x5jc8\") on node \"crc\" DevicePath \"\""
Mar 18 18:02:06 crc kubenswrapper[4939]: I0318 18:02:06.042828 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564282-5tzll" event={"ID":"830ccc0c-82b4-43a8-b3f6-39c17cf92d0d","Type":"ContainerDied","Data":"1093e5888e9da13ebf8e17b74a588d6135912168827edf194ed3b417a1ec92ce"}
Mar 18 18:02:06 crc kubenswrapper[4939]: I0318 18:02:06.042893 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1093e5888e9da13ebf8e17b74a588d6135912168827edf194ed3b417a1ec92ce"
Mar 18 18:02:06 crc kubenswrapper[4939]: I0318 18:02:06.042930 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564282-5tzll"
Mar 18 18:02:06 crc kubenswrapper[4939]: I0318 18:02:06.106164 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564276-8vrs5"]
Mar 18 18:02:06 crc kubenswrapper[4939]: I0318 18:02:06.120242 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564276-8vrs5"]
Mar 18 18:02:06 crc kubenswrapper[4939]: I0318 18:02:06.150630 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc96d74b-b0c3-4437-af3c-c030e9a79b15" path="/var/lib/kubelet/pods/dc96d74b-b0c3-4437-af3c-c030e9a79b15/volumes"
Mar 18 18:02:12 crc kubenswrapper[4939]: I0318 18:02:12.578009 4939 scope.go:117] "RemoveContainer" containerID="4efde799316df0183894cb81489b1a2e142c6929c78811f7a0bd6f617e393b73"
Mar 18 18:02:16 crc kubenswrapper[4939]: I0318 18:02:16.151417 4939 generic.go:334] "Generic (PLEG): container finished" podID="cbafcc37-e8cb-4cb5-96aa-00a063f4c003" containerID="523af0ab368525b53b71f18d681cabafdc7a777c7616789debdf1b200ea743e6" exitCode=0
Mar 18 18:02:16 crc kubenswrapper[4939]: I0318 18:02:16.151561 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" event={"ID":"cbafcc37-e8cb-4cb5-96aa-00a063f4c003","Type":"ContainerDied","Data":"523af0ab368525b53b71f18d681cabafdc7a777c7616789debdf1b200ea743e6"}
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.700918 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4"
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.789179 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-agent-neutron-config-0\") pod \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") "
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.789225 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-combined-ca-bundle\") pod \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") "
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.789271 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ssh-key-openstack-cell1\") pod \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") "
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.789355 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-inventory\") pod \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") "
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.789407 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpg9s\" (UniqueName: \"kubernetes.io/projected/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-kube-api-access-hpg9s\") pod \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") "
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.789483 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ceph\") pod \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\" (UID: \"cbafcc37-e8cb-4cb5-96aa-00a063f4c003\") "
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.796667 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "cbafcc37-e8cb-4cb5-96aa-00a063f4c003" (UID: "cbafcc37-e8cb-4cb5-96aa-00a063f4c003"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.798902 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ceph" (OuterVolumeSpecName: "ceph") pod "cbafcc37-e8cb-4cb5-96aa-00a063f4c003" (UID: "cbafcc37-e8cb-4cb5-96aa-00a063f4c003"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.809847 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-kube-api-access-hpg9s" (OuterVolumeSpecName: "kube-api-access-hpg9s") pod "cbafcc37-e8cb-4cb5-96aa-00a063f4c003" (UID: "cbafcc37-e8cb-4cb5-96aa-00a063f4c003"). InnerVolumeSpecName "kube-api-access-hpg9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.834072 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cbafcc37-e8cb-4cb5-96aa-00a063f4c003" (UID: "cbafcc37-e8cb-4cb5-96aa-00a063f4c003"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.835624 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-inventory" (OuterVolumeSpecName: "inventory") pod "cbafcc37-e8cb-4cb5-96aa-00a063f4c003" (UID: "cbafcc37-e8cb-4cb5-96aa-00a063f4c003"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.850038 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "cbafcc37-e8cb-4cb5-96aa-00a063f4c003" (UID: "cbafcc37-e8cb-4cb5-96aa-00a063f4c003"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.892296 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.892329 4939 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.892340 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.892350 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.892360 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpg9s\" (UniqueName: \"kubernetes.io/projected/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-kube-api-access-hpg9s\") on node \"crc\" DevicePath \"\""
Mar 18 18:02:17 crc kubenswrapper[4939]: I0318 18:02:17.892371 4939 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbafcc37-e8cb-4cb5-96aa-00a063f4c003-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 18:02:18 crc kubenswrapper[4939]: I0318 18:02:18.178028 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4" event={"ID":"cbafcc37-e8cb-4cb5-96aa-00a063f4c003","Type":"ContainerDied","Data":"02bc126b998f2551e7be1020d155021ced83ae2c772dc39d9ab62f5b6672e857"}
Mar 18 18:02:18 crc kubenswrapper[4939]: I0318 18:02:18.178081 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02bc126b998f2551e7be1020d155021ced83ae2c772dc39d9ab62f5b6672e857"
Mar 18 18:02:18 crc kubenswrapper[4939]: I0318 18:02:18.178156 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-2lrx4"
Mar 18 18:02:44 crc kubenswrapper[4939]: I0318 18:02:44.943194 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 18:02:44 crc kubenswrapper[4939]: I0318 18:02:44.943832 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="1a5c5d50-0a4d-4b21-a02e-753b33c8a592" containerName="nova-cell0-conductor-conductor" containerID="cri-o://60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf" gracePeriod=30
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.028171 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.028426 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ae755085-923a-4216-8b62-9ec03762749e" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99" gracePeriod=30
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.633688 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"]
Mar 18 18:02:45 crc kubenswrapper[4939]: E0318 18:02:45.634174 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830ccc0c-82b4-43a8-b3f6-39c17cf92d0d" containerName="oc"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.634192 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="830ccc0c-82b4-43a8-b3f6-39c17cf92d0d" containerName="oc"
Mar 18 18:02:45 crc kubenswrapper[4939]: E0318 18:02:45.634242 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbafcc37-e8cb-4cb5-96aa-00a063f4c003" containerName="neutron-dhcp-openstack-openstack-cell1"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.634249 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbafcc37-e8cb-4cb5-96aa-00a063f4c003" containerName="neutron-dhcp-openstack-openstack-cell1"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.634447 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="830ccc0c-82b4-43a8-b3f6-39c17cf92d0d" containerName="oc"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.634473 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbafcc37-e8cb-4cb5-96aa-00a063f4c003" containerName="neutron-dhcp-openstack-openstack-cell1"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.635299 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.638371 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-hrcsr"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.638582 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.638889 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.638929 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.639076 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.639192 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.639197 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.651256 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"]
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.697693 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.697840 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.697929 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.697990 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698073 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698168 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698262 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698328 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698562 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698639 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwnnh\" (UniqueName: \"kubernetes.io/projected/84b96943-32ca-40b7-8139-50e7c64835eb-kube-api-access-bwnnh\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698899 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.698991 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.800571 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.800982 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801013 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801047 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801081 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801130 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801154 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801233 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801264 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwnnh\" (UniqueName: \"kubernetes.io/projected/84b96943-32ca-40b7-8139-50e7c64835eb-kube-api-access-bwnnh\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801316 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801364 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801417 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.801609 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.802614 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.803084 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.809998 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.810395 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.811578 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.811817 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.813904 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.818918 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.819009 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.821052 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.821396 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwnnh\" (UniqueName: \"kubernetes.io/projected/84b96943-32ca-40b7-8139-50e7c64835eb-kube-api-access-bwnnh\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.822230 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.825267 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:45 crc kubenswrapper[4939]: I0318 18:02:45.958517 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.395172 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.395628 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-log" containerID="cri-o://4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad" gracePeriod=30
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.396098 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-api" containerID="cri-o://3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451" gracePeriod=30
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.435163 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.500700 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.501272 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-log" containerID="cri-o://f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e" gracePeriod=30
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.501888 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-metadata" containerID="cri-o://d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397" gracePeriod=30
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.610398 4939 generic.go:334] "Generic (PLEG): container finished" podID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerID="4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad" exitCode=143
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.610617 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" containerName="nova-scheduler-scheduler" containerID="cri-o://d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" gracePeriod=30
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.610942 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1541c8af-014c-447e-9fcc-ba6bfc0e8597","Type":"ContainerDied","Data":"4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad"}
Mar 18 18:02:46 crc kubenswrapper[4939]: I0318 18:02:46.689145 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"]
Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.201932 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.204365 4939 log.go:32] "ExecSync cmd from
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.206011 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.206101 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" containerName="nova-scheduler-scheduler" Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.275992 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.278049 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.281360 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.281402 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="ae755085-923a-4216-8b62-9ec03762749e" containerName="nova-cell1-conductor-conductor" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.579783 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.621024 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9" event={"ID":"84b96943-32ca-40b7-8139-50e7c64835eb","Type":"ContainerStarted","Data":"cca97bf8036082c16b9005062730cdfd59915d6e526d06219770180c7b772f5b"} Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.623018 4939 generic.go:334] "Generic (PLEG): container finished" podID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerID="f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e" exitCode=143 Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.623091 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7adbcc7d-84f9-47ff-85a2-ec56be833188","Type":"ContainerDied","Data":"f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e"} Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.625702 4939 generic.go:334] "Generic (PLEG): container finished" podID="1a5c5d50-0a4d-4b21-a02e-753b33c8a592" containerID="60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf" exitCode=0 Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.625756 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a5c5d50-0a4d-4b21-a02e-753b33c8a592","Type":"ContainerDied","Data":"60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf"} Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.625792 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a5c5d50-0a4d-4b21-a02e-753b33c8a592","Type":"ContainerDied","Data":"2140b85215d74bc94726ac6a50693bc975e66570f9ab47c001670f5eae33f9e5"} Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.625812 4939 scope.go:117] "RemoveContainer" containerID="60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.625867 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.644333 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-config-data\") pod \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.644557 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-combined-ca-bundle\") pod \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.644665 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xwz\" (UniqueName: \"kubernetes.io/projected/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-kube-api-access-l6xwz\") pod \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\" (UID: \"1a5c5d50-0a4d-4b21-a02e-753b33c8a592\") " Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.653791 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-kube-api-access-l6xwz" (OuterVolumeSpecName: "kube-api-access-l6xwz") pod "1a5c5d50-0a4d-4b21-a02e-753b33c8a592" (UID: "1a5c5d50-0a4d-4b21-a02e-753b33c8a592"). InnerVolumeSpecName "kube-api-access-l6xwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.668939 4939 scope.go:117] "RemoveContainer" containerID="60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf" Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.671648 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf\": container with ID starting with 60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf not found: ID does not exist" containerID="60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.671716 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf"} err="failed to get container status \"60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf\": rpc error: code = NotFound desc = could not find container \"60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf\": container with ID starting with 60cd7ab90f729fd4106289662146f9e66b4c1b41929a476ff97f384efe631bdf not found: ID does not exist" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.688276 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a5c5d50-0a4d-4b21-a02e-753b33c8a592" (UID: "1a5c5d50-0a4d-4b21-a02e-753b33c8a592"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.716649 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-config-data" (OuterVolumeSpecName: "config-data") pod "1a5c5d50-0a4d-4b21-a02e-753b33c8a592" (UID: "1a5c5d50-0a4d-4b21-a02e-753b33c8a592"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.747450 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.747498 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.747532 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xwz\" (UniqueName: \"kubernetes.io/projected/1a5c5d50-0a4d-4b21-a02e-753b33c8a592-kube-api-access-l6xwz\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.963639 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.978559 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.996850 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:02:47 crc kubenswrapper[4939]: E0318 18:02:47.997940 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5c5d50-0a4d-4b21-a02e-753b33c8a592" containerName="nova-cell0-conductor-conductor" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.997961 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5c5d50-0a4d-4b21-a02e-753b33c8a592" containerName="nova-cell0-conductor-conductor" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.998713 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5c5d50-0a4d-4b21-a02e-753b33c8a592" containerName="nova-cell0-conductor-conductor" Mar 18 18:02:47 crc kubenswrapper[4939]: I0318 18:02:47.999728 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.007860 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.015657 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.056305 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e1a5e8-cac2-4f80-ad14-f4506492d912-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.057375 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e1a5e8-cac2-4f80-ad14-f4506492d912-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.057520 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79dfk\" (UniqueName: \"kubernetes.io/projected/19e1a5e8-cac2-4f80-ad14-f4506492d912-kube-api-access-79dfk\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.159756 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79dfk\" (UniqueName: \"kubernetes.io/projected/19e1a5e8-cac2-4f80-ad14-f4506492d912-kube-api-access-79dfk\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.160061 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e1a5e8-cac2-4f80-ad14-f4506492d912-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.160160 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e1a5e8-cac2-4f80-ad14-f4506492d912-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.161994 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5c5d50-0a4d-4b21-a02e-753b33c8a592" path="/var/lib/kubelet/pods/1a5c5d50-0a4d-4b21-a02e-753b33c8a592/volumes" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.164362 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e1a5e8-cac2-4f80-ad14-f4506492d912-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.173383 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/19e1a5e8-cac2-4f80-ad14-f4506492d912-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.176206 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79dfk\" (UniqueName: \"kubernetes.io/projected/19e1a5e8-cac2-4f80-ad14-f4506492d912-kube-api-access-79dfk\") pod \"nova-cell0-conductor-0\" (UID: \"19e1a5e8-cac2-4f80-ad14-f4506492d912\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.335214 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.641811 4939 generic.go:334] "Generic (PLEG): container finished" podID="ae755085-923a-4216-8b62-9ec03762749e" containerID="4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99" exitCode=0 Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.641908 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ae755085-923a-4216-8b62-9ec03762749e","Type":"ContainerDied","Data":"4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99"} Mar 18 18:02:48 crc kubenswrapper[4939]: I0318 18:02:48.841234 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.026658 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.081546 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-config-data\") pod \"ae755085-923a-4216-8b62-9ec03762749e\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.081646 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9rvf\" (UniqueName: \"kubernetes.io/projected/ae755085-923a-4216-8b62-9ec03762749e-kube-api-access-n9rvf\") pod \"ae755085-923a-4216-8b62-9ec03762749e\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.081677 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-combined-ca-bundle\") pod \"ae755085-923a-4216-8b62-9ec03762749e\" (UID: \"ae755085-923a-4216-8b62-9ec03762749e\") " Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.088673 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae755085-923a-4216-8b62-9ec03762749e-kube-api-access-n9rvf" (OuterVolumeSpecName: "kube-api-access-n9rvf") pod "ae755085-923a-4216-8b62-9ec03762749e" (UID: "ae755085-923a-4216-8b62-9ec03762749e"). InnerVolumeSpecName "kube-api-access-n9rvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.120642 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-config-data" (OuterVolumeSpecName: "config-data") pod "ae755085-923a-4216-8b62-9ec03762749e" (UID: "ae755085-923a-4216-8b62-9ec03762749e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.123803 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae755085-923a-4216-8b62-9ec03762749e" (UID: "ae755085-923a-4216-8b62-9ec03762749e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.185032 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.185563 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9rvf\" (UniqueName: \"kubernetes.io/projected/ae755085-923a-4216-8b62-9ec03762749e-kube-api-access-n9rvf\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.185649 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae755085-923a-4216-8b62-9ec03762749e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.668996 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ae755085-923a-4216-8b62-9ec03762749e","Type":"ContainerDied","Data":"91459c8f432d64649b3f3ed3fcf805371610d6fbf1600ea8c404fb1ead35ac92"} Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.669061 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.669082 4939 scope.go:117] "RemoveContainer" containerID="4d6de8dcf925f0c2d93f7978880139513afd9476a8101aa8b12c11467b1dbb99" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.671652 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"19e1a5e8-cac2-4f80-ad14-f4506492d912","Type":"ContainerStarted","Data":"afff0431d87e99c383b4e66776dab94d5048fa9a5deba881b9df2eb401fc6bd0"} Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.671706 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"19e1a5e8-cac2-4f80-ad14-f4506492d912","Type":"ContainerStarted","Data":"dfb54288a3c0ef947d92ecd71301688569ed2a726ac27759cf34a3bd514e439b"} Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.671816 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.725545 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.7254880200000002 podStartE2EDuration="2.72548802s" podCreationTimestamp="2026-03-18 18:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:02:49.70578141 +0000 UTC m=+8734.304969071" watchObservedRunningTime="2026-03-18 18:02:49.72548802 +0000 UTC m=+8734.324675681" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.753453 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:02:49 crc 
kubenswrapper[4939]: I0318 18:02:49.772889 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.785550 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:02:49 crc kubenswrapper[4939]: E0318 18:02:49.786062 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae755085-923a-4216-8b62-9ec03762749e" containerName="nova-cell1-conductor-conductor" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.786084 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae755085-923a-4216-8b62-9ec03762749e" containerName="nova-cell1-conductor-conductor" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.786372 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae755085-923a-4216-8b62-9ec03762749e" containerName="nova-cell1-conductor-conductor" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.787187 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.789439 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.795607 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.900786 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350a8616-a19e-465c-a58c-489b897b6bf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.900970 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qdg\" (UniqueName: \"kubernetes.io/projected/350a8616-a19e-465c-a58c-489b897b6bf5-kube-api-access-58qdg\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:49 crc kubenswrapper[4939]: I0318 18:02:49.901064 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350a8616-a19e-465c-a58c-489b897b6bf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.003059 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qdg\" (UniqueName: \"kubernetes.io/projected/350a8616-a19e-465c-a58c-489b897b6bf5-kube-api-access-58qdg\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.003146 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350a8616-a19e-465c-a58c-489b897b6bf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.003226 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350a8616-a19e-465c-a58c-489b897b6bf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.011010 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350a8616-a19e-465c-a58c-489b897b6bf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.011010 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350a8616-a19e-465c-a58c-489b897b6bf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.027825 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qdg\" (UniqueName: \"kubernetes.io/projected/350a8616-a19e-465c-a58c-489b897b6bf5-kube-api-access-58qdg\") pod \"nova-cell1-conductor-0\" (UID: \"350a8616-a19e-465c-a58c-489b897b6bf5\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.108683 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.149530 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae755085-923a-4216-8b62-9ec03762749e" path="/var/lib/kubelet/pods/ae755085-923a-4216-8b62-9ec03762749e/volumes" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.434056 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.505454 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.513000 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-combined-ca-bundle\") pod \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.513200 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq2vk\" (UniqueName: \"kubernetes.io/projected/1541c8af-014c-447e-9fcc-ba6bfc0e8597-kube-api-access-cq2vk\") pod \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.513397 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-config-data\") pod \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.513436 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1541c8af-014c-447e-9fcc-ba6bfc0e8597-logs\") pod \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\" (UID: \"1541c8af-014c-447e-9fcc-ba6bfc0e8597\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.517186 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1541c8af-014c-447e-9fcc-ba6bfc0e8597-logs" (OuterVolumeSpecName: "logs") pod "1541c8af-014c-447e-9fcc-ba6bfc0e8597" (UID: "1541c8af-014c-447e-9fcc-ba6bfc0e8597"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.519075 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1541c8af-014c-447e-9fcc-ba6bfc0e8597-kube-api-access-cq2vk" (OuterVolumeSpecName: "kube-api-access-cq2vk") pod "1541c8af-014c-447e-9fcc-ba6bfc0e8597" (UID: "1541c8af-014c-447e-9fcc-ba6bfc0e8597"). InnerVolumeSpecName "kube-api-access-cq2vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.565863 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1541c8af-014c-447e-9fcc-ba6bfc0e8597" (UID: "1541c8af-014c-447e-9fcc-ba6bfc0e8597"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.566732 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-config-data" (OuterVolumeSpecName: "config-data") pod "1541c8af-014c-447e-9fcc-ba6bfc0e8597" (UID: "1541c8af-014c-447e-9fcc-ba6bfc0e8597"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.615668 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjh4h\" (UniqueName: \"kubernetes.io/projected/7adbcc7d-84f9-47ff-85a2-ec56be833188-kube-api-access-pjh4h\") pod \"7adbcc7d-84f9-47ff-85a2-ec56be833188\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.615783 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-combined-ca-bundle\") pod \"7adbcc7d-84f9-47ff-85a2-ec56be833188\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.615849 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7adbcc7d-84f9-47ff-85a2-ec56be833188-logs\") pod \"7adbcc7d-84f9-47ff-85a2-ec56be833188\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.615900 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-config-data\") pod \"7adbcc7d-84f9-47ff-85a2-ec56be833188\" (UID: \"7adbcc7d-84f9-47ff-85a2-ec56be833188\") " Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.616429 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq2vk\" (UniqueName: \"kubernetes.io/projected/1541c8af-014c-447e-9fcc-ba6bfc0e8597-kube-api-access-cq2vk\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.616450 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.616461 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1541c8af-014c-447e-9fcc-ba6bfc0e8597-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.616475 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1541c8af-014c-447e-9fcc-ba6bfc0e8597-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.624013 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7adbcc7d-84f9-47ff-85a2-ec56be833188-logs" (OuterVolumeSpecName: "logs") pod "7adbcc7d-84f9-47ff-85a2-ec56be833188" (UID: "7adbcc7d-84f9-47ff-85a2-ec56be833188"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.630005 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7adbcc7d-84f9-47ff-85a2-ec56be833188-kube-api-access-pjh4h" (OuterVolumeSpecName: "kube-api-access-pjh4h") pod "7adbcc7d-84f9-47ff-85a2-ec56be833188" (UID: "7adbcc7d-84f9-47ff-85a2-ec56be833188"). InnerVolumeSpecName "kube-api-access-pjh4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.647908 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7adbcc7d-84f9-47ff-85a2-ec56be833188" (UID: "7adbcc7d-84f9-47ff-85a2-ec56be833188"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.651842 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-config-data" (OuterVolumeSpecName: "config-data") pod "7adbcc7d-84f9-47ff-85a2-ec56be833188" (UID: "7adbcc7d-84f9-47ff-85a2-ec56be833188"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.682767 4939 generic.go:334] "Generic (PLEG): container finished" podID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerID="3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451" exitCode=0 Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.682825 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1541c8af-014c-447e-9fcc-ba6bfc0e8597","Type":"ContainerDied","Data":"3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451"} Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.682850 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1541c8af-014c-447e-9fcc-ba6bfc0e8597","Type":"ContainerDied","Data":"460894799a4ee49e0ab7856f7975a2136fe9ce175e90e2589093705a1270e906"} Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.682867 4939 scope.go:117] "RemoveContainer" containerID="3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.682960 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.692186 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9" event={"ID":"84b96943-32ca-40b7-8139-50e7c64835eb","Type":"ContainerStarted","Data":"4841f54b7e54ef8c3dfd0a5c58525bd2a7556ebe176cf4f0e3797ded21426609"} Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.695963 4939 generic.go:334] "Generic (PLEG): container finished" podID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerID="d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397" exitCode=0 Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.697145 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.698006 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7adbcc7d-84f9-47ff-85a2-ec56be833188","Type":"ContainerDied","Data":"d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397"} Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.698067 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7adbcc7d-84f9-47ff-85a2-ec56be833188","Type":"ContainerDied","Data":"dc91d16ed7b1142ce2dddf3168f1155e379a527c5bd95b2903a83017b9843dd4"} Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.718111 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9" podStartSLOduration=2.594049526 podStartE2EDuration="5.718090574s" podCreationTimestamp="2026-03-18 18:02:45 +0000 UTC" firstStartedPulling="2026-03-18 18:02:46.694440176 +0000 UTC m=+8731.293627797" lastFinishedPulling="2026-03-18 18:02:49.818481234 +0000 UTC m=+8734.417668845" observedRunningTime="2026-03-18 18:02:50.715578092 +0000 UTC m=+8735.314765713" watchObservedRunningTime="2026-03-18 18:02:50.718090574 +0000 UTC m=+8735.317278195" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.724875 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjh4h\" (UniqueName: \"kubernetes.io/projected/7adbcc7d-84f9-47ff-85a2-ec56be833188-kube-api-access-pjh4h\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.730545 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.730562 4939 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7adbcc7d-84f9-47ff-85a2-ec56be833188-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.730571 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7adbcc7d-84f9-47ff-85a2-ec56be833188-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.759695 4939 scope.go:117] "RemoveContainer" containerID="4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.767543 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.783667 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.796423 4939 scope.go:117] "RemoveContainer" containerID="3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451" Mar 18 18:02:50 crc kubenswrapper[4939]: E0318 18:02:50.800064 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451\": container with ID starting with 3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451 not found: ID does not exist" containerID="3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 
18:02:50.800102 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451"} err="failed to get container status \"3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451\": rpc error: code = NotFound desc = could not find container \"3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451\": container with ID starting with 3d5347f469ac400aeba5d1c9eabbc2c0a735df2c090f2bd48dfa07ea23844451 not found: ID does not exist" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.800122 4939 scope.go:117] "RemoveContainer" containerID="4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad" Mar 18 18:02:50 crc kubenswrapper[4939]: E0318 18:02:50.801968 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad\": container with ID starting with 4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad not found: ID does not exist" containerID="4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.802004 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad"} err="failed to get container status \"4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad\": rpc error: code = NotFound desc = could not find container \"4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad\": container with ID starting with 4e734bf75255f742e355643a4aa46eb928203381f771ae3c612e0344a8b01fad not found: ID does not exist" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.802021 4939 scope.go:117] "RemoveContainer" containerID="d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.819890 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 18:02:50 crc kubenswrapper[4939]: E0318 18:02:50.820457 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-metadata" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.820475 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-metadata" Mar 18 18:02:50 crc kubenswrapper[4939]: E0318 18:02:50.820514 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-log" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.820524 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-log" Mar 18 18:02:50 crc kubenswrapper[4939]: E0318 18:02:50.820552 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-api" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.820559 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-api" Mar 18 18:02:50 crc kubenswrapper[4939]: E0318 18:02:50.820576 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-log" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 
18:02:50.820583 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-log" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.820803 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-log" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.820817 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-log" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.820830 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" containerName="nova-api-api" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.820843 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" containerName="nova-metadata-metadata" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.822101 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.824859 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.902763 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.935373 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b524a-2440-403d-ad85-8060e5df7c74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.935453 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241b524a-2440-403d-ad85-8060e5df7c74-logs\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.935531 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241b524a-2440-403d-ad85-8060e5df7c74-config-data\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.935702 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhkkr\" (UniqueName: \"kubernetes.io/projected/241b524a-2440-403d-ad85-8060e5df7c74-kube-api-access-zhkkr\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.942139 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.957890 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.972313 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.992220 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:02:50 crc 
kubenswrapper[4939]: I0318 18:02:50.997410 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:02:50 crc kubenswrapper[4939]: I0318 18:02:50.999024 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.001761 4939 scope.go:117] "RemoveContainer" containerID="f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.013371 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.029846 4939 scope.go:117] "RemoveContainer" containerID="d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397" Mar 18 18:02:51 crc kubenswrapper[4939]: E0318 18:02:51.030335 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397\": container with ID starting with d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397 not found: ID does not exist" containerID="d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.030478 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397"} err="failed to get container status \"d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397\": rpc error: code = NotFound desc = could not find container \"d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397\": container with ID starting with d2e03c5889a1e44670bfdd18ac4d8b8cdf2f22b8697793985ff999d6f7d4b397 not found: ID does not exist" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.030606 4939 scope.go:117] "RemoveContainer" containerID="f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e" Mar 18 18:02:51 crc kubenswrapper[4939]: E0318 18:02:51.031016 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e\": container with ID starting with f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e not found: ID does not exist" containerID="f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.031056 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e"} err="failed to get container status \"f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e\": rpc error: code = NotFound desc = could not find container \"f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e\": container with ID starting with f31c9b53b42e052e6c08518e1e8b6d99b79058fe7f34179cc25f20be78f2b67e not found: ID does not exist" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.038272 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86dm6\" (UniqueName: \"kubernetes.io/projected/ac312cd1-c869-4b32-834f-a18a634531e9-kube-api-access-86dm6\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc 
kubenswrapper[4939]: I0318 18:02:51.038449 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhkkr\" (UniqueName: \"kubernetes.io/projected/241b524a-2440-403d-ad85-8060e5df7c74-kube-api-access-zhkkr\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.038591 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac312cd1-c869-4b32-834f-a18a634531e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.038702 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac312cd1-c869-4b32-834f-a18a634531e9-config-data\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.038831 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac312cd1-c869-4b32-834f-a18a634531e9-logs\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.039064 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b524a-2440-403d-ad85-8060e5df7c74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.039192 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241b524a-2440-403d-ad85-8060e5df7c74-logs\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.039308 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241b524a-2440-403d-ad85-8060e5df7c74-config-data\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.039611 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241b524a-2440-403d-ad85-8060e5df7c74-logs\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.044111 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b524a-2440-403d-ad85-8060e5df7c74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.052041 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241b524a-2440-403d-ad85-8060e5df7c74-config-data\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc 
kubenswrapper[4939]: I0318 18:02:51.062564 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhkkr\" (UniqueName: \"kubernetes.io/projected/241b524a-2440-403d-ad85-8060e5df7c74-kube-api-access-zhkkr\") pod \"nova-api-0\" (UID: \"241b524a-2440-403d-ad85-8060e5df7c74\") " pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.140677 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86dm6\" (UniqueName: \"kubernetes.io/projected/ac312cd1-c869-4b32-834f-a18a634531e9-kube-api-access-86dm6\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.141314 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac312cd1-c869-4b32-834f-a18a634531e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.148022 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac312cd1-c869-4b32-834f-a18a634531e9-config-data\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.148361 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac312cd1-c869-4b32-834f-a18a634531e9-logs\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.148805 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac312cd1-c869-4b32-834f-a18a634531e9-logs\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.149326 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac312cd1-c869-4b32-834f-a18a634531e9-config-data\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.154107 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac312cd1-c869-4b32-834f-a18a634531e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.157334 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86dm6\" (UniqueName: \"kubernetes.io/projected/ac312cd1-c869-4b32-834f-a18a634531e9-kube-api-access-86dm6\") pod \"nova-metadata-0\" (UID: \"ac312cd1-c869-4b32-834f-a18a634531e9\") " pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.162651 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.327073 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.654454 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:02:51 crc kubenswrapper[4939]: W0318 18:02:51.664207 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac312cd1_c869_4b32_834f_a18a634531e9.slice/crio-a73baf8adc5666d8582402878b60308fde01c4e6511d7171f8400ef9bf1882d8 WatchSource:0}: Error finding container a73baf8adc5666d8582402878b60308fde01c4e6511d7171f8400ef9bf1882d8: Status 404 returned error can't find the container with id a73baf8adc5666d8582402878b60308fde01c4e6511d7171f8400ef9bf1882d8 Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.712293 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"350a8616-a19e-465c-a58c-489b897b6bf5","Type":"ContainerStarted","Data":"47ca0b0bba13f0628598299c8bd05fa4788847bb7b539352513c1115acb01b29"} Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.712328 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"350a8616-a19e-465c-a58c-489b897b6bf5","Type":"ContainerStarted","Data":"091c09c6a4de33d5861085fec706560b5fc9be7572fa32415fa82938144e9491"} Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.713809 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.716225 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac312cd1-c869-4b32-834f-a18a634531e9","Type":"ContainerStarted","Data":"a73baf8adc5666d8582402878b60308fde01c4e6511d7171f8400ef9bf1882d8"} Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.743957 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.743937652 podStartE2EDuration="2.743937652s" podCreationTimestamp="2026-03-18 18:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:02:51.732759974 +0000 UTC m=+8736.331947585" watchObservedRunningTime="2026-03-18 18:02:51.743937652 +0000 UTC m=+8736.343125273" Mar 18 18:02:51 crc kubenswrapper[4939]: I0318 18:02:51.817104 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:02:51 crc kubenswrapper[4939]: W0318 18:02:51.820753 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241b524a_2440_403d_ad85_8060e5df7c74.slice/crio-1ae8d81fdcd9bdf5c06b24345ba841a69e2b2244c123e12417518f22d47b658b WatchSource:0}: Error finding container 1ae8d81fdcd9bdf5c06b24345ba841a69e2b2244c123e12417518f22d47b658b: Status 404 returned error can't find the container with id 1ae8d81fdcd9bdf5c06b24345ba841a69e2b2244c123e12417518f22d47b658b Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.154976 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1541c8af-014c-447e-9fcc-ba6bfc0e8597" path="/var/lib/kubelet/pods/1541c8af-014c-447e-9fcc-ba6bfc0e8597/volumes" Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.157906 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7adbcc7d-84f9-47ff-85a2-ec56be833188" 
path="/var/lib/kubelet/pods/7adbcc7d-84f9-47ff-85a2-ec56be833188/volumes" Mar 18 18:02:52 crc kubenswrapper[4939]: E0318 18:02:52.201157 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:02:52 crc kubenswrapper[4939]: E0318 18:02:52.202897 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:02:52 crc kubenswrapper[4939]: E0318 18:02:52.203919 4939 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:02:52 crc kubenswrapper[4939]: E0318 18:02:52.203957 4939 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" containerName="nova-scheduler-scheduler" Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.730554 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac312cd1-c869-4b32-834f-a18a634531e9","Type":"ContainerStarted","Data":"0125f540022a41d15cf91a98fb11171839b4b97c2c7e200d113b8342fdc999a4"} Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.730838 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac312cd1-c869-4b32-834f-a18a634531e9","Type":"ContainerStarted","Data":"13473e124d73ea2774f4244b202b59cc363bcd64d59ee68df663f125b8a66467"} Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.734237 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241b524a-2440-403d-ad85-8060e5df7c74","Type":"ContainerStarted","Data":"8344521d7463c6f4302ade83b6306e68249230d31df1cbe885435979f26d22d7"} Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.734282 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241b524a-2440-403d-ad85-8060e5df7c74","Type":"ContainerStarted","Data":"87779071b68e34c5ea54ab280aeea4d1412372b431193b836b7c3b8500d3b6a4"} Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.734300 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241b524a-2440-403d-ad85-8060e5df7c74","Type":"ContainerStarted","Data":"1ae8d81fdcd9bdf5c06b24345ba841a69e2b2244c123e12417518f22d47b658b"} Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.787235 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.787212456 podStartE2EDuration="2.787212456s" podCreationTimestamp="2026-03-18 18:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
18:02:52.756888684 +0000 UTC m=+8737.356076345" watchObservedRunningTime="2026-03-18 18:02:52.787212456 +0000 UTC m=+8737.386400087" Mar 18 18:02:52 crc kubenswrapper[4939]: I0318 18:02:52.790580 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7905634519999998 podStartE2EDuration="2.790563452s" podCreationTimestamp="2026-03-18 18:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:02:52.781251327 +0000 UTC m=+8737.380438958" watchObservedRunningTime="2026-03-18 18:02:52.790563452 +0000 UTC m=+8737.389751083" Mar 18 18:02:53 crc kubenswrapper[4939]: I0318 18:02:53.687357 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:02:53 crc kubenswrapper[4939]: I0318 18:02:53.687891 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.782752 4939 generic.go:334] "Generic (PLEG): container finished" podID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" exitCode=0 Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.782848 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9","Type":"ContainerDied","Data":"d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74"} Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.783420 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9","Type":"ContainerDied","Data":"80fb9f982fc60231dd237397c67c03f2edcfdcde88c70823a38db0d25e2c0716"} Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.783452 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80fb9f982fc60231dd237397c67c03f2edcfdcde88c70823a38db0d25e2c0716" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.838816 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.890424 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-combined-ca-bundle\") pod \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.890671 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-config-data\") pod \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.890958 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8nq4\" (UniqueName: \"kubernetes.io/projected/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-kube-api-access-z8nq4\") pod \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\" (UID: \"c5cd2a16-79b9-45c8-8310-66fa4bf04ce9\") " Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.901334 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-kube-api-access-z8nq4" (OuterVolumeSpecName: "kube-api-access-z8nq4") pod "c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" (UID: "c5cd2a16-79b9-45c8-8310-66fa4bf04ce9"). InnerVolumeSpecName "kube-api-access-z8nq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.951812 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" (UID: "c5cd2a16-79b9-45c8-8310-66fa4bf04ce9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.953004 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-config-data" (OuterVolumeSpecName: "config-data") pod "c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" (UID: "c5cd2a16-79b9-45c8-8310-66fa4bf04ce9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.992823 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8nq4\" (UniqueName: \"kubernetes.io/projected/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-kube-api-access-z8nq4\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.992859 4939 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:56 crc kubenswrapper[4939]: I0318 18:02:56.992873 4939 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.797117 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.861735 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.883033 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.894612 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:02:57 crc kubenswrapper[4939]: E0318 18:02:57.895532 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" containerName="nova-scheduler-scheduler" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.895573 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" containerName="nova-scheduler-scheduler" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.896094 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" containerName="nova-scheduler-scheduler" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.897713 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.901399 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.916351 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.922060 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f57bf0-1b74-4877-81b6-dfcbc355da4d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.922221 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75f57bf0-1b74-4877-81b6-dfcbc355da4d-config-data\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:57 crc kubenswrapper[4939]: I0318 18:02:57.922301 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x97d8\" (UniqueName: \"kubernetes.io/projected/75f57bf0-1b74-4877-81b6-dfcbc355da4d-kube-api-access-x97d8\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.024614 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f57bf0-1b74-4877-81b6-dfcbc355da4d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.024852 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75f57bf0-1b74-4877-81b6-dfcbc355da4d-config-data\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc 
kubenswrapper[4939]: I0318 18:02:58.024949 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x97d8\" (UniqueName: \"kubernetes.io/projected/75f57bf0-1b74-4877-81b6-dfcbc355da4d-kube-api-access-x97d8\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.032976 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75f57bf0-1b74-4877-81b6-dfcbc355da4d-config-data\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.032986 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f57bf0-1b74-4877-81b6-dfcbc355da4d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.057636 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x97d8\" (UniqueName: \"kubernetes.io/projected/75f57bf0-1b74-4877-81b6-dfcbc355da4d-kube-api-access-x97d8\") pod \"nova-scheduler-0\" (UID: \"75f57bf0-1b74-4877-81b6-dfcbc355da4d\") " pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.156344 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cd2a16-79b9-45c8-8310-66fa4bf04ce9" path="/var/lib/kubelet/pods/c5cd2a16-79b9-45c8-8310-66fa4bf04ce9/volumes" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.217938 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.370228 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 18:02:58 crc kubenswrapper[4939]: W0318 18:02:58.752652 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75f57bf0_1b74_4877_81b6_dfcbc355da4d.slice/crio-92aa5ffa6658080bd2cdbeb99685db3280a85f23cfb4662c3578b884de53b208 WatchSource:0}: Error finding container 92aa5ffa6658080bd2cdbeb99685db3280a85f23cfb4662c3578b884de53b208: Status 404 returned error can't find the container with id 92aa5ffa6658080bd2cdbeb99685db3280a85f23cfb4662c3578b884de53b208 Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.773917 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:02:58 crc kubenswrapper[4939]: I0318 18:02:58.811681 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75f57bf0-1b74-4877-81b6-dfcbc355da4d","Type":"ContainerStarted","Data":"92aa5ffa6658080bd2cdbeb99685db3280a85f23cfb4662c3578b884de53b208"} Mar 18 18:02:59 crc kubenswrapper[4939]: I0318 18:02:59.825790 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75f57bf0-1b74-4877-81b6-dfcbc355da4d","Type":"ContainerStarted","Data":"20789071a1b8966ded92159c4b088fab53269c076a2c93a08b842e7b288fd00a"} Mar 18 18:02:59 crc kubenswrapper[4939]: I0318 18:02:59.857400 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.857376038 podStartE2EDuration="2.857376038s" podCreationTimestamp="2026-03-18 18:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:02:59.849914126 +0000 UTC m=+8744.449101777" watchObservedRunningTime="2026-03-18 18:02:59.857376038 +0000 UTC m=+8744.456563679" Mar 18 18:03:00 crc kubenswrapper[4939]: I0318 18:03:00.174789 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 18:03:01 crc kubenswrapper[4939]: I0318 18:03:01.163208 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 18:03:01 crc kubenswrapper[4939]: I0318 18:03:01.163678 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 18:03:01 crc kubenswrapper[4939]: I0318 18:03:01.327577 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:03:01 crc kubenswrapper[4939]: I0318 18:03:01.327644 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:03:02 crc kubenswrapper[4939]: I0318 18:03:02.247689 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac312cd1-c869-4b32-834f-a18a634531e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.0.22:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:03:02 crc kubenswrapper[4939]: I0318 18:03:02.247730 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac312cd1-c869-4b32-834f-a18a634531e9" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"http://10.217.0.22:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:03:02 crc kubenswrapper[4939]: I0318 18:03:02.409917 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="241b524a-2440-403d-ad85-8060e5df7c74" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.13:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:03:02 crc kubenswrapper[4939]: I0318 18:03:02.409927 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="241b524a-2440-403d-ad85-8060e5df7c74" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.13:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:03:03 crc kubenswrapper[4939]: I0318 18:03:03.218431 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 18:03:08 crc kubenswrapper[4939]: I0318 18:03:08.219012 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 18:03:08 crc kubenswrapper[4939]: I0318 18:03:08.268795 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 18:03:08 crc kubenswrapper[4939]: I0318 18:03:08.966688 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 18:03:09 crc kubenswrapper[4939]: I0318 18:03:09.163922 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 18:03:09 crc kubenswrapper[4939]: I0318 18:03:09.163997 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 18:03:09 crc kubenswrapper[4939]: I0318 18:03:09.327719 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 18:03:09 crc kubenswrapper[4939]: I0318 18:03:09.327763 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.166908 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.167466 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.174064 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.174140 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.332133 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.334594 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.347796 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 18:03:11 crc kubenswrapper[4939]: I0318 18:03:11.976491 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 18:03:12 crc 
kubenswrapper[4939]: I0318 18:03:12.687941 4939 scope.go:117] "RemoveContainer" containerID="d195d664d35a35bf4dad7a6d83b45bb4ef2f4d736f451df939acfee5d0bc0c74" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.316838 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c79c8"] Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.323579 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.355330 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c79c8"] Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.447137 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-utilities\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.447244 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-catalog-content\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.447357 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlt8z\" (UniqueName: \"kubernetes.io/projected/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-kube-api-access-wlt8z\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.550127 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlt8z\" (UniqueName: \"kubernetes.io/projected/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-kube-api-access-wlt8z\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.550320 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-utilities\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.550461 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-catalog-content\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.551037 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-catalog-content\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.551334 4939 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-utilities\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.585279 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlt8z\" (UniqueName: \"kubernetes.io/projected/66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f-kube-api-access-wlt8z\") pod \"redhat-operators-c79c8\" (UID: \"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f\") " pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:20 crc kubenswrapper[4939]: I0318 18:03:20.658944 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:21 crc kubenswrapper[4939]: I0318 18:03:21.154944 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c79c8"] Mar 18 18:03:21 crc kubenswrapper[4939]: W0318 18:03:21.160084 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66fe6e33_3cf9_44d8_afcc_d9b6497e0d5f.slice/crio-4cedd64a7bde0e1a4e6322897bb1fb9afcc9328f5251eddb258746ff33de5861 WatchSource:0}: Error finding container 4cedd64a7bde0e1a4e6322897bb1fb9afcc9328f5251eddb258746ff33de5861: Status 404 returned error can't find the container with id 4cedd64a7bde0e1a4e6322897bb1fb9afcc9328f5251eddb258746ff33de5861 Mar 18 18:03:22 crc kubenswrapper[4939]: I0318 18:03:22.081744 4939 generic.go:334] "Generic (PLEG): container finished" podID="66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f" containerID="a18c3ba9ee617630da58b6e794b490292543cb35fb4a8a69e67e8fe0773174ac" exitCode=0 Mar 18 18:03:22 crc kubenswrapper[4939]: I0318 18:03:22.081852 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c79c8" event={"ID":"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f","Type":"ContainerDied","Data":"a18c3ba9ee617630da58b6e794b490292543cb35fb4a8a69e67e8fe0773174ac"} Mar 18 18:03:22 crc kubenswrapper[4939]: I0318 18:03:22.082139 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c79c8" event={"ID":"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f","Type":"ContainerStarted","Data":"4cedd64a7bde0e1a4e6322897bb1fb9afcc9328f5251eddb258746ff33de5861"} Mar 18 18:03:23 crc kubenswrapper[4939]: I0318 18:03:23.688154 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:03:23 crc kubenswrapper[4939]: I0318 18:03:23.689229 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:03:35 crc kubenswrapper[4939]: I0318 18:03:35.234280 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c79c8" 
event={"ID":"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f","Type":"ContainerStarted","Data":"b517d4db4986b036ff0b3f8479bdadabf577c576331dd87b7e61e97fca41c287"} Mar 18 18:03:36 crc kubenswrapper[4939]: I0318 18:03:36.260037 4939 generic.go:334] "Generic (PLEG): container finished" podID="66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f" containerID="b517d4db4986b036ff0b3f8479bdadabf577c576331dd87b7e61e97fca41c287" exitCode=0 Mar 18 18:03:36 crc kubenswrapper[4939]: I0318 18:03:36.260154 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c79c8" event={"ID":"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f","Type":"ContainerDied","Data":"b517d4db4986b036ff0b3f8479bdadabf577c576331dd87b7e61e97fca41c287"} Mar 18 18:03:37 crc kubenswrapper[4939]: I0318 18:03:37.277084 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c79c8" event={"ID":"66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f","Type":"ContainerStarted","Data":"9d0d38c90a2ee5ba3dda2fdbd37be55ed5d813fc69e8c4eb0f23b1e82bd3446b"} Mar 18 18:03:37 crc kubenswrapper[4939]: I0318 18:03:37.306242 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c79c8" podStartSLOduration=2.416530672 podStartE2EDuration="17.306223371s" podCreationTimestamp="2026-03-18 18:03:20 +0000 UTC" firstStartedPulling="2026-03-18 18:03:22.084919443 +0000 UTC m=+8766.684107064" lastFinishedPulling="2026-03-18 18:03:36.974612112 +0000 UTC m=+8781.573799763" observedRunningTime="2026-03-18 18:03:37.300632952 +0000 UTC m=+8781.899820663" watchObservedRunningTime="2026-03-18 18:03:37.306223371 +0000 UTC m=+8781.905410992" Mar 18 18:03:40 crc kubenswrapper[4939]: I0318 18:03:40.661799 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:40 crc kubenswrapper[4939]: I0318 18:03:40.662861 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:41 crc kubenswrapper[4939]: I0318 18:03:41.720806 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c79c8" podUID="66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f" containerName="registry-server" probeResult="failure" output=< Mar 18 18:03:41 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:03:41 crc kubenswrapper[4939]: > Mar 18 18:03:50 crc kubenswrapper[4939]: I0318 18:03:50.732833 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:50 crc kubenswrapper[4939]: I0318 18:03:50.825238 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c79c8" Mar 18 18:03:51 crc kubenswrapper[4939]: I0318 18:03:51.345777 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c79c8"] Mar 18 18:03:51 crc kubenswrapper[4939]: I0318 18:03:51.528756 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-krc58"] Mar 18 18:03:51 crc kubenswrapper[4939]: I0318 18:03:51.529080 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-krc58" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="registry-server" containerID="cri-o://10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9" gracePeriod=2 
Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.084736 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.249708 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m975z\" (UniqueName: \"kubernetes.io/projected/4d966f61-7e4c-486e-923e-0afceff94e7a-kube-api-access-m975z\") pod \"4d966f61-7e4c-486e-923e-0afceff94e7a\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.250257 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-catalog-content\") pod \"4d966f61-7e4c-486e-923e-0afceff94e7a\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.250338 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-utilities\") pod \"4d966f61-7e4c-486e-923e-0afceff94e7a\" (UID: \"4d966f61-7e4c-486e-923e-0afceff94e7a\") " Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.251260 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-utilities" (OuterVolumeSpecName: "utilities") pod "4d966f61-7e4c-486e-923e-0afceff94e7a" (UID: "4d966f61-7e4c-486e-923e-0afceff94e7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.262475 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d966f61-7e4c-486e-923e-0afceff94e7a-kube-api-access-m975z" (OuterVolumeSpecName: "kube-api-access-m975z") pod "4d966f61-7e4c-486e-923e-0afceff94e7a" (UID: "4d966f61-7e4c-486e-923e-0afceff94e7a"). InnerVolumeSpecName "kube-api-access-m975z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.353669 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m975z\" (UniqueName: \"kubernetes.io/projected/4d966f61-7e4c-486e-923e-0afceff94e7a-kube-api-access-m975z\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.353703 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.368235 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d966f61-7e4c-486e-923e-0afceff94e7a" (UID: "4d966f61-7e4c-486e-923e-0afceff94e7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.452492 4939 generic.go:334] "Generic (PLEG): container finished" podID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerID="10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9" exitCode=0 Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.452585 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krc58" event={"ID":"4d966f61-7e4c-486e-923e-0afceff94e7a","Type":"ContainerDied","Data":"10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9"} Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.452633 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-krc58" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.452667 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-krc58" event={"ID":"4d966f61-7e4c-486e-923e-0afceff94e7a","Type":"ContainerDied","Data":"1721f863992c2d871bd16fa86374d542c41a2109ce33908f3eacae5be62aef91"} Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.452702 4939 scope.go:117] "RemoveContainer" containerID="10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.456021 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d966f61-7e4c-486e-923e-0afceff94e7a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.475162 4939 scope.go:117] "RemoveContainer" containerID="030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.493598 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-krc58"] Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.500284 4939 scope.go:117] "RemoveContainer" containerID="d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.515342 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-krc58"] Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.547352 4939 scope.go:117] "RemoveContainer" containerID="10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9" Mar 18 18:03:52 crc kubenswrapper[4939]: E0318 18:03:52.547744 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9\": container with ID starting with 10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9 not found: ID does not exist" containerID="10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.547771 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9"} err="failed to get container status \"10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9\": rpc error: code = NotFound desc = could not find container \"10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9\": container with ID starting with 10cae50b5d5e485a3bf097d5ab1df5f896659b4489e63242aa35c1b287641cb9 not found: ID does not exist" Mar 18 18:03:52 crc 
kubenswrapper[4939]: I0318 18:03:52.547790 4939 scope.go:117] "RemoveContainer" containerID="030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379" Mar 18 18:03:52 crc kubenswrapper[4939]: E0318 18:03:52.548161 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379\": container with ID starting with 030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379 not found: ID does not exist" containerID="030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.548178 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379"} err="failed to get container status \"030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379\": rpc error: code = NotFound desc = could not find container \"030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379\": container with ID starting with 030badddaee3e77d9a1c8362f3eb5dc1cf27fec4bcf49762ff2e9511a2545379 not found: ID does not exist" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.548195 4939 scope.go:117] "RemoveContainer" containerID="d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df" Mar 18 18:03:52 crc kubenswrapper[4939]: E0318 18:03:52.548450 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df\": container with ID starting with d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df not found: ID does not exist" containerID="d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df" Mar 18 18:03:52 crc kubenswrapper[4939]: I0318 18:03:52.548472 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df"} err="failed to get container status \"d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df\": rpc error: code = NotFound desc = could not find container \"d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df\": container with ID starting with d94ffaf3212e943454513e4e2d180e21e498383765819a0139e715986fc114df not found: ID does not exist" Mar 18 18:03:53 crc kubenswrapper[4939]: I0318 18:03:53.687702 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:03:53 crc kubenswrapper[4939]: I0318 18:03:53.688046 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:03:53 crc kubenswrapper[4939]: I0318 18:03:53.688094 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 18:03:53 crc kubenswrapper[4939]: I0318 18:03:53.688648 4939 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:03:53 crc kubenswrapper[4939]: I0318 18:03:53.688691 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" gracePeriod=600 Mar 18 18:03:53 crc kubenswrapper[4939]: E0318 18:03:53.818353 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:03:54 crc kubenswrapper[4939]: I0318 18:03:54.144553 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" path="/var/lib/kubelet/pods/4d966f61-7e4c-486e-923e-0afceff94e7a/volumes" Mar 18 18:03:54 crc kubenswrapper[4939]: I0318 18:03:54.474877 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" exitCode=0 Mar 18 18:03:54 crc kubenswrapper[4939]: I0318 18:03:54.474958 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"} Mar 18 18:03:54 crc kubenswrapper[4939]: I0318 18:03:54.475212 4939 scope.go:117] "RemoveContainer" containerID="c559432840b726160db88722665a34bfba0202eb63be885cb2468d050d6fcf6a" Mar 18 18:03:54 crc kubenswrapper[4939]: I0318 18:03:54.475885 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:03:54 crc kubenswrapper[4939]: E0318 18:03:54.476168 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.178040 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564284-q59pt"] Mar 18 18:04:00 crc kubenswrapper[4939]: E0318 18:04:00.179255 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="registry-server" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.179267 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="registry-server" Mar 18 18:04:00 crc kubenswrapper[4939]: E0318 18:04:00.179290 4939 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="extract-utilities" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.179297 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="extract-utilities" Mar 18 18:04:00 crc kubenswrapper[4939]: E0318 18:04:00.179325 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="extract-content" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.179331 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="extract-content" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.179558 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d966f61-7e4c-486e-923e-0afceff94e7a" containerName="registry-server" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.180336 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564284-q59pt" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.183280 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.183558 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.183679 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.195236 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-q59pt"] Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.348040 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf8vb\" (UniqueName: \"kubernetes.io/projected/ec60c192-b8b8-43c4-9b5b-774bbe78cf83-kube-api-access-hf8vb\") pod \"auto-csr-approver-29564284-q59pt\" (UID: \"ec60c192-b8b8-43c4-9b5b-774bbe78cf83\") " pod="openshift-infra/auto-csr-approver-29564284-q59pt" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.450831 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf8vb\" (UniqueName: \"kubernetes.io/projected/ec60c192-b8b8-43c4-9b5b-774bbe78cf83-kube-api-access-hf8vb\") pod \"auto-csr-approver-29564284-q59pt\" (UID: \"ec60c192-b8b8-43c4-9b5b-774bbe78cf83\") " pod="openshift-infra/auto-csr-approver-29564284-q59pt" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.479018 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf8vb\" (UniqueName: \"kubernetes.io/projected/ec60c192-b8b8-43c4-9b5b-774bbe78cf83-kube-api-access-hf8vb\") pod \"auto-csr-approver-29564284-q59pt\" (UID: \"ec60c192-b8b8-43c4-9b5b-774bbe78cf83\") " pod="openshift-infra/auto-csr-approver-29564284-q59pt" Mar 18 18:04:00 crc kubenswrapper[4939]: I0318 18:04:00.547469 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564284-q59pt" Mar 18 18:04:01 crc kubenswrapper[4939]: I0318 18:04:01.209484 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-q59pt"] Mar 18 18:04:01 crc kubenswrapper[4939]: I0318 18:04:01.618826 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564284-q59pt" event={"ID":"ec60c192-b8b8-43c4-9b5b-774bbe78cf83","Type":"ContainerStarted","Data":"56512f856a0569f1d5193f3dfaff7deb3b523a7b788e71aa4f5a4e0bb405d110"} Mar 18 18:04:03 crc kubenswrapper[4939]: I0318 18:04:03.646885 4939 generic.go:334] "Generic (PLEG): container finished" podID="ec60c192-b8b8-43c4-9b5b-774bbe78cf83" containerID="766f379c2f61a0ea2e7b941a0b60cfe1acdd2e49a8ed3aea9b4ac2a1152b5042" exitCode=0 Mar 18 18:04:03 crc kubenswrapper[4939]: I0318 18:04:03.646952 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564284-q59pt" event={"ID":"ec60c192-b8b8-43c4-9b5b-774bbe78cf83","Type":"ContainerDied","Data":"766f379c2f61a0ea2e7b941a0b60cfe1acdd2e49a8ed3aea9b4ac2a1152b5042"} Mar 18 18:04:05 crc kubenswrapper[4939]: I0318 18:04:05.082381 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564284-q59pt" Mar 18 18:04:05 crc kubenswrapper[4939]: I0318 18:04:05.233115 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf8vb\" (UniqueName: \"kubernetes.io/projected/ec60c192-b8b8-43c4-9b5b-774bbe78cf83-kube-api-access-hf8vb\") pod \"ec60c192-b8b8-43c4-9b5b-774bbe78cf83\" (UID: \"ec60c192-b8b8-43c4-9b5b-774bbe78cf83\") " Mar 18 18:04:05 crc kubenswrapper[4939]: I0318 18:04:05.239646 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec60c192-b8b8-43c4-9b5b-774bbe78cf83-kube-api-access-hf8vb" (OuterVolumeSpecName: "kube-api-access-hf8vb") pod "ec60c192-b8b8-43c4-9b5b-774bbe78cf83" (UID: "ec60c192-b8b8-43c4-9b5b-774bbe78cf83"). InnerVolumeSpecName "kube-api-access-hf8vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:04:05 crc kubenswrapper[4939]: I0318 18:04:05.339999 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf8vb\" (UniqueName: \"kubernetes.io/projected/ec60c192-b8b8-43c4-9b5b-774bbe78cf83-kube-api-access-hf8vb\") on node \"crc\" DevicePath \"\"" Mar 18 18:04:05 crc kubenswrapper[4939]: I0318 18:04:05.675899 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564284-q59pt" event={"ID":"ec60c192-b8b8-43c4-9b5b-774bbe78cf83","Type":"ContainerDied","Data":"56512f856a0569f1d5193f3dfaff7deb3b523a7b788e71aa4f5a4e0bb405d110"} Mar 18 18:04:05 crc kubenswrapper[4939]: I0318 18:04:05.676377 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56512f856a0569f1d5193f3dfaff7deb3b523a7b788e71aa4f5a4e0bb405d110" Mar 18 18:04:05 crc kubenswrapper[4939]: I0318 18:04:05.676133 4939 util.go:48] "No ready sandbox for pod can be found. 
Mar 18 18:04:06 crc kubenswrapper[4939]: I0318 18:04:06.142780 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:04:06 crc kubenswrapper[4939]: E0318 18:04:06.143594 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:04:06 crc kubenswrapper[4939]: I0318 18:04:06.192301 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564278-ffn99"]
Mar 18 18:04:06 crc kubenswrapper[4939]: I0318 18:04:06.203051 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564278-ffn99"]
Mar 18 18:04:08 crc kubenswrapper[4939]: I0318 18:04:08.153400 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6210c6a-418b-404a-8cf3-6e7c916c73db" path="/var/lib/kubelet/pods/a6210c6a-418b-404a-8cf3-6e7c916c73db/volumes"
Mar 18 18:04:12 crc kubenswrapper[4939]: I0318 18:04:12.864354 4939 scope.go:117] "RemoveContainer" containerID="b54f93a3df000c260084afd90d0b6989108ccda487ae1cde601c11b204a28d11"
Mar 18 18:04:20 crc kubenswrapper[4939]: I0318 18:04:20.134390 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:04:20 crc kubenswrapper[4939]: E0318 18:04:20.135452 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:04:34 crc kubenswrapper[4939]: I0318 18:04:34.134147 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:04:34 crc kubenswrapper[4939]: E0318 18:04:34.137227 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:04:46 crc kubenswrapper[4939]: I0318 18:04:46.150762 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:04:46 crc kubenswrapper[4939]: E0318 18:04:46.151730 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:05:01 crc kubenswrapper[4939]: I0318 18:05:01.176753 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:05:01 crc kubenswrapper[4939]: E0318 18:05:01.177780 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:05:15 crc kubenswrapper[4939]: I0318 18:05:15.133833 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:05:15 crc kubenswrapper[4939]: E0318 18:05:15.134911 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:05:30 crc kubenswrapper[4939]: I0318 18:05:30.133701 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:05:30 crc kubenswrapper[4939]: E0318 18:05:30.134793 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:05:42 crc kubenswrapper[4939]: I0318 18:05:42.134243 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:05:42 crc kubenswrapper[4939]: E0318 18:05:42.135634 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:05:54 crc kubenswrapper[4939]: I0318 18:05:54.133717 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:05:54 crc kubenswrapper[4939]: E0318 18:05:54.134727 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.163802 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564286-6dbl2"]
Mar 18 18:06:00 crc kubenswrapper[4939]: E0318 18:06:00.164911 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec60c192-b8b8-43c4-9b5b-774bbe78cf83" containerName="oc"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.164926 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec60c192-b8b8-43c4-9b5b-774bbe78cf83" containerName="oc"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.165198 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec60c192-b8b8-43c4-9b5b-774bbe78cf83" containerName="oc"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.166193 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-6dbl2"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.169395 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.169478 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.169532 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.175257 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-6dbl2"]
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.358911 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pss2p\" (UniqueName: \"kubernetes.io/projected/44ef1f08-489e-41a1-b8eb-75b2af95a166-kube-api-access-pss2p\") pod \"auto-csr-approver-29564286-6dbl2\" (UID: \"44ef1f08-489e-41a1-b8eb-75b2af95a166\") " pod="openshift-infra/auto-csr-approver-29564286-6dbl2"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.461758 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pss2p\" (UniqueName: \"kubernetes.io/projected/44ef1f08-489e-41a1-b8eb-75b2af95a166-kube-api-access-pss2p\") pod \"auto-csr-approver-29564286-6dbl2\" (UID: \"44ef1f08-489e-41a1-b8eb-75b2af95a166\") " pod="openshift-infra/auto-csr-approver-29564286-6dbl2"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.489742 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pss2p\" (UniqueName: \"kubernetes.io/projected/44ef1f08-489e-41a1-b8eb-75b2af95a166-kube-api-access-pss2p\") pod \"auto-csr-approver-29564286-6dbl2\" (UID: \"44ef1f08-489e-41a1-b8eb-75b2af95a166\") " pod="openshift-infra/auto-csr-approver-29564286-6dbl2"
Mar 18 18:06:00 crc kubenswrapper[4939]: I0318 18:06:00.787079 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-6dbl2"
Mar 18 18:06:01 crc kubenswrapper[4939]: I0318 18:06:01.267094 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-6dbl2"]
Mar 18 18:06:01 crc kubenswrapper[4939]: I0318 18:06:01.274010 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 18:06:02 crc kubenswrapper[4939]: I0318 18:06:02.054216 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-6dbl2" event={"ID":"44ef1f08-489e-41a1-b8eb-75b2af95a166","Type":"ContainerStarted","Data":"baa863c860fcad4e83a1ae07fa25ebc252762ab9c778220aaa6b644d9d02f581"}
Mar 18 18:06:04 crc kubenswrapper[4939]: I0318 18:06:04.080790 4939 generic.go:334] "Generic (PLEG): container finished" podID="44ef1f08-489e-41a1-b8eb-75b2af95a166" containerID="260472d574bc8f94211fbc2bad8b486c8535e161ddb26d8a40db0ff302dd4efa" exitCode=0
Mar 18 18:06:04 crc kubenswrapper[4939]: I0318 18:06:04.080842 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-6dbl2" event={"ID":"44ef1f08-489e-41a1-b8eb-75b2af95a166","Type":"ContainerDied","Data":"260472d574bc8f94211fbc2bad8b486c8535e161ddb26d8a40db0ff302dd4efa"}
Mar 18 18:06:05 crc kubenswrapper[4939]: I0318 18:06:05.557634 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-6dbl2"
Mar 18 18:06:05 crc kubenswrapper[4939]: I0318 18:06:05.678948 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pss2p\" (UniqueName: \"kubernetes.io/projected/44ef1f08-489e-41a1-b8eb-75b2af95a166-kube-api-access-pss2p\") pod \"44ef1f08-489e-41a1-b8eb-75b2af95a166\" (UID: \"44ef1f08-489e-41a1-b8eb-75b2af95a166\") "
Mar 18 18:06:05 crc kubenswrapper[4939]: I0318 18:06:05.690347 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ef1f08-489e-41a1-b8eb-75b2af95a166-kube-api-access-pss2p" (OuterVolumeSpecName: "kube-api-access-pss2p") pod "44ef1f08-489e-41a1-b8eb-75b2af95a166" (UID: "44ef1f08-489e-41a1-b8eb-75b2af95a166"). InnerVolumeSpecName "kube-api-access-pss2p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:06:05 crc kubenswrapper[4939]: I0318 18:06:05.781610 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pss2p\" (UniqueName: \"kubernetes.io/projected/44ef1f08-489e-41a1-b8eb-75b2af95a166-kube-api-access-pss2p\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:06 crc kubenswrapper[4939]: I0318 18:06:06.099461 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-6dbl2" event={"ID":"44ef1f08-489e-41a1-b8eb-75b2af95a166","Type":"ContainerDied","Data":"baa863c860fcad4e83a1ae07fa25ebc252762ab9c778220aaa6b644d9d02f581"}
Mar 18 18:06:06 crc kubenswrapper[4939]: I0318 18:06:06.099733 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa863c860fcad4e83a1ae07fa25ebc252762ab9c778220aaa6b644d9d02f581"
Mar 18 18:06:06 crc kubenswrapper[4939]: I0318 18:06:06.099495 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-6dbl2"
Mar 18 18:06:06 crc kubenswrapper[4939]: I0318 18:06:06.674201 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564280-hq8lt"]
Mar 18 18:06:06 crc kubenswrapper[4939]: I0318 18:06:06.686099 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564280-hq8lt"]
Mar 18 18:06:08 crc kubenswrapper[4939]: I0318 18:06:08.133863 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:06:08 crc kubenswrapper[4939]: E0318 18:06:08.134866 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:06:08 crc kubenswrapper[4939]: I0318 18:06:08.144715 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a987df-f080-4aac-bb9a-bcd146d2e35c" path="/var/lib/kubelet/pods/e5a987df-f080-4aac-bb9a-bcd146d2e35c/volumes"
Mar 18 18:06:13 crc kubenswrapper[4939]: I0318 18:06:13.133672 4939 scope.go:117] "RemoveContainer" containerID="aa0fcdb56a6349370a501ba2a4d8a900eab24d28560efccf95b5399edd0a7334"
Mar 18 18:06:23 crc kubenswrapper[4939]: I0318 18:06:23.133972 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:06:23 crc kubenswrapper[4939]: E0318 18:06:23.135312 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:06:38 crc kubenswrapper[4939]: I0318 18:06:38.133995 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:06:38 crc kubenswrapper[4939]: E0318 18:06:38.135818 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:06:39 crc kubenswrapper[4939]: I0318 18:06:39.503570 4939 generic.go:334] "Generic (PLEG): container finished" podID="84b96943-32ca-40b7-8139-50e7c64835eb" containerID="4841f54b7e54ef8c3dfd0a5c58525bd2a7556ebe176cf4f0e3797ded21426609" exitCode=0
Mar 18 18:06:39 crc kubenswrapper[4939]: I0318 18:06:39.503667 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9" event={"ID":"84b96943-32ca-40b7-8139-50e7c64835eb","Type":"ContainerDied","Data":"4841f54b7e54ef8c3dfd0a5c58525bd2a7556ebe176cf4f0e3797ded21426609"}
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.153172 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.249937 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ssh-key-openstack-cell1\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250087 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-2\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250148 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-1\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250200 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-0\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250226 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-inventory\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250253 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-0\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250335 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-1\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250371 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-3\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250395 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwnnh\" (UniqueName: \"kubernetes.io/projected/84b96943-32ca-40b7-8139-50e7c64835eb-kube-api-access-bwnnh\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") " Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250427 
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250451 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ceph\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") "
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250470 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-1\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") "
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.250521 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-0\") pod \"84b96943-32ca-40b7-8139-50e7c64835eb\" (UID: \"84b96943-32ca-40b7-8139-50e7c64835eb\") "
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.268607 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b96943-32ca-40b7-8139-50e7c64835eb-kube-api-access-bwnnh" (OuterVolumeSpecName: "kube-api-access-bwnnh") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "kube-api-access-bwnnh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.276835 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ceph" (OuterVolumeSpecName: "ceph") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.279814 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.282962 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.283446 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.283571 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.289249 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.291708 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.292705 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.295939 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.297463 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.314480 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-inventory" (OuterVolumeSpecName: "inventory") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.318711 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "84b96943-32ca-40b7-8139-50e7c64835eb" (UID: "84b96943-32ca-40b7-8139-50e7c64835eb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353466 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353496 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353523 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353535 4939 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353543 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353553 4939 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353563 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353572 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwnnh\" (UniqueName: \"kubernetes.io/projected/84b96943-32ca-40b7-8139-50e7c64835eb-kube-api-access-bwnnh\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353581 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:41 crc kubenswrapper[4939]: 
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353598 4939 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/84b96943-32ca-40b7-8139-50e7c64835eb-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353607 4939 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.353616 4939 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/84b96943-32ca-40b7-8139-50e7c64835eb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.526862 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9" event={"ID":"84b96943-32ca-40b7-8139-50e7c64835eb","Type":"ContainerDied","Data":"cca97bf8036082c16b9005062730cdfd59915d6e526d06219770180c7b772f5b"}
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.527212 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca97bf8036082c16b9005062730cdfd59915d6e526d06219770180c7b772f5b"
Mar 18 18:06:41 crc kubenswrapper[4939]: I0318 18:06:41.526912 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9"
Mar 18 18:06:51 crc kubenswrapper[4939]: I0318 18:06:51.133906 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:06:51 crc kubenswrapper[4939]: E0318 18:06:51.135051 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:07:04 crc kubenswrapper[4939]: I0318 18:07:04.134030 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:07:04 crc kubenswrapper[4939]: E0318 18:07:04.134931 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:07:18 crc kubenswrapper[4939]: I0318 18:07:18.133469 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:07:18 crc kubenswrapper[4939]: E0318 18:07:18.134288 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:07:30 crc kubenswrapper[4939]: I0318 18:07:30.134156 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:07:30 crc kubenswrapper[4939]: E0318 18:07:30.135163 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:07:43 crc kubenswrapper[4939]: I0318 18:07:43.134001 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:07:43 crc kubenswrapper[4939]: E0318 18:07:43.134734 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:07:54 crc kubenswrapper[4939]: I0318 18:07:54.133464 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:07:54 crc kubenswrapper[4939]: E0318 18:07:54.134178 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.233541 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564288-6k2lz"] Mar 18 18:08:00 crc kubenswrapper[4939]: E0318 18:08:00.234313 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ef1f08-489e-41a1-b8eb-75b2af95a166" containerName="oc" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.234326 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ef1f08-489e-41a1-b8eb-75b2af95a166" containerName="oc" Mar 18 18:08:00 crc kubenswrapper[4939]: E0318 18:08:00.234372 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b96943-32ca-40b7-8139-50e7c64835eb" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.234379 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b96943-32ca-40b7-8139-50e7c64835eb" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.234612 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ef1f08-489e-41a1-b8eb-75b2af95a166" 
containerName="oc" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.234625 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b96943-32ca-40b7-8139-50e7c64835eb" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.235299 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-6k2lz" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.238799 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.239311 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.240017 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.264229 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-6k2lz"] Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.332915 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgvh\" (UniqueName: \"kubernetes.io/projected/df6fea9c-aa9c-4300-859f-087606188a84-kube-api-access-8qgvh\") pod \"auto-csr-approver-29564288-6k2lz\" (UID: \"df6fea9c-aa9c-4300-859f-087606188a84\") " pod="openshift-infra/auto-csr-approver-29564288-6k2lz" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.435427 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgvh\" (UniqueName: \"kubernetes.io/projected/df6fea9c-aa9c-4300-859f-087606188a84-kube-api-access-8qgvh\") pod \"auto-csr-approver-29564288-6k2lz\" (UID: \"df6fea9c-aa9c-4300-859f-087606188a84\") " pod="openshift-infra/auto-csr-approver-29564288-6k2lz" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.452809 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgvh\" (UniqueName: \"kubernetes.io/projected/df6fea9c-aa9c-4300-859f-087606188a84-kube-api-access-8qgvh\") pod \"auto-csr-approver-29564288-6k2lz\" (UID: \"df6fea9c-aa9c-4300-859f-087606188a84\") " pod="openshift-infra/auto-csr-approver-29564288-6k2lz" Mar 18 18:08:00 crc kubenswrapper[4939]: I0318 18:08:00.577327 4939 util.go:30] "No sandbox for pod can be found. 
Mar 18 18:08:01 crc kubenswrapper[4939]: W0318 18:08:01.040556 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6fea9c_aa9c_4300_859f_087606188a84.slice/crio-2c7feae8b7e56bd6916361daea5e385e60d10617a90611269697a3785dfa30ac WatchSource:0}: Error finding container 2c7feae8b7e56bd6916361daea5e385e60d10617a90611269697a3785dfa30ac: Status 404 returned error can't find the container with id 2c7feae8b7e56bd6916361daea5e385e60d10617a90611269697a3785dfa30ac
Mar 18 18:08:01 crc kubenswrapper[4939]: I0318 18:08:01.045271 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-6k2lz"]
Mar 18 18:08:01 crc kubenswrapper[4939]: I0318 18:08:01.459727 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-6k2lz" event={"ID":"df6fea9c-aa9c-4300-859f-087606188a84","Type":"ContainerStarted","Data":"2c7feae8b7e56bd6916361daea5e385e60d10617a90611269697a3785dfa30ac"}
Mar 18 18:08:03 crc kubenswrapper[4939]: I0318 18:08:03.491996 4939 generic.go:334] "Generic (PLEG): container finished" podID="df6fea9c-aa9c-4300-859f-087606188a84" containerID="9ffbd8fcd4290427a8b9cd19a94a4d9b22ba4845e6c5fc2ce4edb9d8bb9beb51" exitCode=0
Mar 18 18:08:03 crc kubenswrapper[4939]: I0318 18:08:03.492268 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-6k2lz" event={"ID":"df6fea9c-aa9c-4300-859f-087606188a84","Type":"ContainerDied","Data":"9ffbd8fcd4290427a8b9cd19a94a4d9b22ba4845e6c5fc2ce4edb9d8bb9beb51"}
Mar 18 18:08:05 crc kubenswrapper[4939]: I0318 18:08:05.518396 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-6k2lz" event={"ID":"df6fea9c-aa9c-4300-859f-087606188a84","Type":"ContainerDied","Data":"2c7feae8b7e56bd6916361daea5e385e60d10617a90611269697a3785dfa30ac"}
Mar 18 18:08:05 crc kubenswrapper[4939]: I0318 18:08:05.519016 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c7feae8b7e56bd6916361daea5e385e60d10617a90611269697a3785dfa30ac"
Mar 18 18:08:05 crc kubenswrapper[4939]: I0318 18:08:05.847129 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-6k2lz"
Mar 18 18:08:06 crc kubenswrapper[4939]: I0318 18:08:06.002364 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qgvh\" (UniqueName: \"kubernetes.io/projected/df6fea9c-aa9c-4300-859f-087606188a84-kube-api-access-8qgvh\") pod \"df6fea9c-aa9c-4300-859f-087606188a84\" (UID: \"df6fea9c-aa9c-4300-859f-087606188a84\") "
Mar 18 18:08:06 crc kubenswrapper[4939]: I0318 18:08:06.014610 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6fea9c-aa9c-4300-859f-087606188a84-kube-api-access-8qgvh" (OuterVolumeSpecName: "kube-api-access-8qgvh") pod "df6fea9c-aa9c-4300-859f-087606188a84" (UID: "df6fea9c-aa9c-4300-859f-087606188a84"). InnerVolumeSpecName "kube-api-access-8qgvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:08:06 crc kubenswrapper[4939]: I0318 18:08:06.105907 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qgvh\" (UniqueName: \"kubernetes.io/projected/df6fea9c-aa9c-4300-859f-087606188a84-kube-api-access-8qgvh\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:06 crc kubenswrapper[4939]: I0318 18:08:06.527854 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-6k2lz" Mar 18 18:08:06 crc kubenswrapper[4939]: I0318 18:08:06.948002 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564282-5tzll"] Mar 18 18:08:06 crc kubenswrapper[4939]: I0318 18:08:06.959477 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564282-5tzll"] Mar 18 18:08:08 crc kubenswrapper[4939]: I0318 18:08:08.135308 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:08:08 crc kubenswrapper[4939]: E0318 18:08:08.135783 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:08:08 crc kubenswrapper[4939]: I0318 18:08:08.158674 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830ccc0c-82b4-43a8-b3f6-39c17cf92d0d" path="/var/lib/kubelet/pods/830ccc0c-82b4-43a8-b3f6-39c17cf92d0d/volumes" Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.127323 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qsdz"] Mar 18 18:08:11 crc kubenswrapper[4939]: E0318 18:08:11.129205 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6fea9c-aa9c-4300-859f-087606188a84" containerName="oc" Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.129244 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6fea9c-aa9c-4300-859f-087606188a84" containerName="oc" Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.129793 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6fea9c-aa9c-4300-859f-087606188a84" containerName="oc" Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.133632 4939 util.go:30] "No sandbox for pod can be found. 
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.152063 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qsdz"]
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.254028 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-catalog-content\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.255788 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp8j4\" (UniqueName: \"kubernetes.io/projected/2de76e75-2523-42c4-a365-432d5bfacfe2-kube-api-access-xp8j4\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.256149 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-utilities\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.360984 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-utilities\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.361112 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-catalog-content\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.361267 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp8j4\" (UniqueName: \"kubernetes.io/projected/2de76e75-2523-42c4-a365-432d5bfacfe2-kube-api-access-xp8j4\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.361871 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-utilities\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.362285 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-catalog-content\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.386211 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp8j4\" (UniqueName: \"kubernetes.io/projected/2de76e75-2523-42c4-a365-432d5bfacfe2-kube-api-access-xp8j4\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz"
"MountVolume.SetUp succeeded for volume \"kube-api-access-xp8j4\" (UniqueName: \"kubernetes.io/projected/2de76e75-2523-42c4-a365-432d5bfacfe2-kube-api-access-xp8j4\") pod \"certified-operators-9qsdz\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") " pod="openshift-marketplace/certified-operators-9qsdz" Mar 18 18:08:11 crc kubenswrapper[4939]: I0318 18:08:11.473447 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qsdz" Mar 18 18:08:12 crc kubenswrapper[4939]: I0318 18:08:12.107414 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qsdz"] Mar 18 18:08:12 crc kubenswrapper[4939]: I0318 18:08:12.605878 4939 generic.go:334] "Generic (PLEG): container finished" podID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerID="5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024" exitCode=0 Mar 18 18:08:12 crc kubenswrapper[4939]: I0318 18:08:12.605952 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qsdz" event={"ID":"2de76e75-2523-42c4-a365-432d5bfacfe2","Type":"ContainerDied","Data":"5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024"} Mar 18 18:08:12 crc kubenswrapper[4939]: I0318 18:08:12.605993 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qsdz" event={"ID":"2de76e75-2523-42c4-a365-432d5bfacfe2","Type":"ContainerStarted","Data":"50d8bc8c3a564ae57bb4458e1d5ccac65acc93e5f2f0414fbfa47db0f29e2ed0"} Mar 18 18:08:13 crc kubenswrapper[4939]: I0318 18:08:13.304232 4939 scope.go:117] "RemoveContainer" containerID="3219156d1cb508b7a221950fb3a4ddcd6315bd157d0a575855b6542b7644cb46" Mar 18 18:08:14 crc kubenswrapper[4939]: I0318 18:08:14.640449 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qsdz" event={"ID":"2de76e75-2523-42c4-a365-432d5bfacfe2","Type":"ContainerStarted","Data":"a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c"} Mar 18 18:08:15 crc kubenswrapper[4939]: I0318 18:08:15.663593 4939 generic.go:334] "Generic (PLEG): container finished" podID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerID="a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c" exitCode=0 Mar 18 18:08:15 crc kubenswrapper[4939]: I0318 18:08:15.663725 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qsdz" event={"ID":"2de76e75-2523-42c4-a365-432d5bfacfe2","Type":"ContainerDied","Data":"a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c"} Mar 18 18:08:16 crc kubenswrapper[4939]: I0318 18:08:16.681780 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qsdz" event={"ID":"2de76e75-2523-42c4-a365-432d5bfacfe2","Type":"ContainerStarted","Data":"36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd"} Mar 18 18:08:16 crc kubenswrapper[4939]: I0318 18:08:16.713845 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qsdz" podStartSLOduration=2.13410805 podStartE2EDuration="5.713826777s" podCreationTimestamp="2026-03-18 18:08:11 +0000 UTC" firstStartedPulling="2026-03-18 18:08:12.609414113 +0000 UTC m=+9057.208601774" lastFinishedPulling="2026-03-18 18:08:16.18913285 +0000 UTC m=+9060.788320501" observedRunningTime="2026-03-18 18:08:16.702634609 +0000 UTC m=+9061.301822240" 
Mar 18 18:08:19 crc kubenswrapper[4939]: I0318 18:08:19.133811 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15"
Mar 18 18:08:19 crc kubenswrapper[4939]: E0318 18:08:19.135076 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:08:21 crc kubenswrapper[4939]: I0318 18:08:21.474100 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:21 crc kubenswrapper[4939]: I0318 18:08:21.474157 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:21 crc kubenswrapper[4939]: I0318 18:08:21.556445 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:21 crc kubenswrapper[4939]: I0318 18:08:21.823564 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:21 crc kubenswrapper[4939]: I0318 18:08:21.893411 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qsdz"]
Mar 18 18:08:23 crc kubenswrapper[4939]: I0318 18:08:23.764587 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qsdz" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="registry-server" containerID="cri-o://36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd" gracePeriod=2
Mar 18 18:08:23 crc kubenswrapper[4939]: E0318 18:08:23.987203 4939 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de76e75_2523_42c4_a365_432d5bfacfe2.slice/crio-conmon-36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.316012 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qsdz"
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.420682 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-catalog-content\") pod \"2de76e75-2523-42c4-a365-432d5bfacfe2\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") "
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.421180 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp8j4\" (UniqueName: \"kubernetes.io/projected/2de76e75-2523-42c4-a365-432d5bfacfe2-kube-api-access-xp8j4\") pod \"2de76e75-2523-42c4-a365-432d5bfacfe2\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") "
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.421344 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-utilities\") pod \"2de76e75-2523-42c4-a365-432d5bfacfe2\" (UID: \"2de76e75-2523-42c4-a365-432d5bfacfe2\") "
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.422479 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-utilities" (OuterVolumeSpecName: "utilities") pod "2de76e75-2523-42c4-a365-432d5bfacfe2" (UID: "2de76e75-2523-42c4-a365-432d5bfacfe2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.432147 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de76e75-2523-42c4-a365-432d5bfacfe2-kube-api-access-xp8j4" (OuterVolumeSpecName: "kube-api-access-xp8j4") pod "2de76e75-2523-42c4-a365-432d5bfacfe2" (UID: "2de76e75-2523-42c4-a365-432d5bfacfe2"). InnerVolumeSpecName "kube-api-access-xp8j4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.523499 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp8j4\" (UniqueName: \"kubernetes.io/projected/2de76e75-2523-42c4-a365-432d5bfacfe2-kube-api-access-xp8j4\") on node \"crc\" DevicePath \"\""
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.523548 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.573212 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2de76e75-2523-42c4-a365-432d5bfacfe2" (UID: "2de76e75-2523-42c4-a365-432d5bfacfe2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.626200 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de76e75-2523-42c4-a365-432d5bfacfe2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.780351 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qsdz" event={"ID":"2de76e75-2523-42c4-a365-432d5bfacfe2","Type":"ContainerDied","Data":"36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd"} Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.780365 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qsdz" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.780314 4939 generic.go:334] "Generic (PLEG): container finished" podID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerID="36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd" exitCode=0 Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.780472 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qsdz" event={"ID":"2de76e75-2523-42c4-a365-432d5bfacfe2","Type":"ContainerDied","Data":"50d8bc8c3a564ae57bb4458e1d5ccac65acc93e5f2f0414fbfa47db0f29e2ed0"} Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.780427 4939 scope.go:117] "RemoveContainer" containerID="36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.810198 4939 scope.go:117] "RemoveContainer" containerID="a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.842284 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qsdz"] Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.858806 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qsdz"] Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.875462 4939 scope.go:117] "RemoveContainer" containerID="5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.899201 4939 scope.go:117] "RemoveContainer" containerID="36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd" Mar 18 18:08:24 crc kubenswrapper[4939]: E0318 18:08:24.899797 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd\": container with ID starting with 36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd not found: ID does not exist" containerID="36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.899873 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd"} err="failed to get container status \"36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd\": rpc error: code = NotFound desc = could not find container \"36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd\": container with ID starting with 36cd1b30f13f355e076d484899f7481d436c9d9ae7453ede528061cacf017dcd not found: ID does not exist" Mar 18 
18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.899906 4939 scope.go:117] "RemoveContainer" containerID="a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c" Mar 18 18:08:24 crc kubenswrapper[4939]: E0318 18:08:24.900276 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c\": container with ID starting with a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c not found: ID does not exist" containerID="a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.900334 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c"} err="failed to get container status \"a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c\": rpc error: code = NotFound desc = could not find container \"a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c\": container with ID starting with a652e1ef1a1deae0b96068ff4e4a5c5d489c8900fa6ab79bf738c3f07eda557c not found: ID does not exist" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.900374 4939 scope.go:117] "RemoveContainer" containerID="5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024" Mar 18 18:08:24 crc kubenswrapper[4939]: E0318 18:08:24.901024 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024\": container with ID starting with 5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024 not found: ID does not exist" containerID="5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024" Mar 18 18:08:24 crc kubenswrapper[4939]: I0318 18:08:24.901121 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024"} err="failed to get container status \"5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024\": rpc error: code = NotFound desc = could not find container \"5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024\": container with ID starting with 5068f729bb6f5f54fce5949db01ade3ef9c674fd3bc498c28ec9225a395ea024 not found: ID does not exist" Mar 18 18:08:26 crc kubenswrapper[4939]: I0318 18:08:26.150208 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" path="/var/lib/kubelet/pods/2de76e75-2523-42c4-a365-432d5bfacfe2/volumes" Mar 18 18:08:32 crc kubenswrapper[4939]: I0318 18:08:32.134301 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:08:32 crc kubenswrapper[4939]: E0318 18:08:32.135475 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:08:47 crc kubenswrapper[4939]: I0318 18:08:47.134799 4939 scope.go:117] "RemoveContainer" 
containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:08:47 crc kubenswrapper[4939]: E0318 18:08:47.135986 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:09:01 crc kubenswrapper[4939]: I0318 18:09:01.134104 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:09:02 crc kubenswrapper[4939]: I0318 18:09:02.263106 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"baf26336179a9aa7048c63b1536f6ff0057993e0969fa3d21d19f7183d08db2c"} Mar 18 18:09:46 crc kubenswrapper[4939]: I0318 18:09:46.831032 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 18:09:46 crc kubenswrapper[4939]: I0318 18:09:46.831948 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="3ac0c304-d5b1-4255-8ff6-6902662b9e37" containerName="adoption" containerID="cri-o://a68da5733d5ce5776bf61c00c581efe98130866281825ae58c5d7aa958dd897f" gracePeriod=30 Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.178160 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564290-697pj"] Mar 18 18:10:00 crc kubenswrapper[4939]: E0318 18:10:00.179293 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="registry-server" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.179308 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="registry-server" Mar 18 18:10:00 crc kubenswrapper[4939]: E0318 18:10:00.179331 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="extract-utilities" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.179338 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="extract-utilities" Mar 18 18:10:00 crc kubenswrapper[4939]: E0318 18:10:00.179347 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="extract-content" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.179354 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="extract-content" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.179580 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de76e75-2523-42c4-a365-432d5bfacfe2" containerName="registry-server" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.206525 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-697pj"] Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.206637 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-697pj" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.209429 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.209801 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.209858 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.225569 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d4xz\" (UniqueName: \"kubernetes.io/projected/cb7ac233-85fa-46ee-adfe-bda53b65d98e-kube-api-access-5d4xz\") pod \"auto-csr-approver-29564290-697pj\" (UID: \"cb7ac233-85fa-46ee-adfe-bda53b65d98e\") " pod="openshift-infra/auto-csr-approver-29564290-697pj" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.328474 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d4xz\" (UniqueName: \"kubernetes.io/projected/cb7ac233-85fa-46ee-adfe-bda53b65d98e-kube-api-access-5d4xz\") pod \"auto-csr-approver-29564290-697pj\" (UID: \"cb7ac233-85fa-46ee-adfe-bda53b65d98e\") " pod="openshift-infra/auto-csr-approver-29564290-697pj" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.356322 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d4xz\" (UniqueName: \"kubernetes.io/projected/cb7ac233-85fa-46ee-adfe-bda53b65d98e-kube-api-access-5d4xz\") pod \"auto-csr-approver-29564290-697pj\" (UID: \"cb7ac233-85fa-46ee-adfe-bda53b65d98e\") " pod="openshift-infra/auto-csr-approver-29564290-697pj" Mar 18 18:10:00 crc kubenswrapper[4939]: I0318 18:10:00.534626 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-697pj" Mar 18 18:10:01 crc kubenswrapper[4939]: I0318 18:10:01.093849 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-697pj"] Mar 18 18:10:02 crc kubenswrapper[4939]: I0318 18:10:02.033456 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-697pj" event={"ID":"cb7ac233-85fa-46ee-adfe-bda53b65d98e","Type":"ContainerStarted","Data":"b9c3792e0532e2a201f095e4bc84c69d41935bed7c6378c92dd825514923e365"} Mar 18 18:10:04 crc kubenswrapper[4939]: I0318 18:10:04.059767 4939 generic.go:334] "Generic (PLEG): container finished" podID="cb7ac233-85fa-46ee-adfe-bda53b65d98e" containerID="cef53e9cfafdcea948f33e89b2f55b261322c4f2d9f5eb6ec4fbe81bbf5344c3" exitCode=0 Mar 18 18:10:04 crc kubenswrapper[4939]: I0318 18:10:04.059833 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-697pj" event={"ID":"cb7ac233-85fa-46ee-adfe-bda53b65d98e","Type":"ContainerDied","Data":"cef53e9cfafdcea948f33e89b2f55b261322c4f2d9f5eb6ec4fbe81bbf5344c3"} Mar 18 18:10:05 crc kubenswrapper[4939]: I0318 18:10:05.462940 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-697pj" Mar 18 18:10:05 crc kubenswrapper[4939]: I0318 18:10:05.587126 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d4xz\" (UniqueName: \"kubernetes.io/projected/cb7ac233-85fa-46ee-adfe-bda53b65d98e-kube-api-access-5d4xz\") pod \"cb7ac233-85fa-46ee-adfe-bda53b65d98e\" (UID: \"cb7ac233-85fa-46ee-adfe-bda53b65d98e\") " Mar 18 18:10:05 crc kubenswrapper[4939]: I0318 18:10:05.596569 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7ac233-85fa-46ee-adfe-bda53b65d98e-kube-api-access-5d4xz" (OuterVolumeSpecName: "kube-api-access-5d4xz") pod "cb7ac233-85fa-46ee-adfe-bda53b65d98e" (UID: "cb7ac233-85fa-46ee-adfe-bda53b65d98e"). InnerVolumeSpecName "kube-api-access-5d4xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:05 crc kubenswrapper[4939]: I0318 18:10:05.690816 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d4xz\" (UniqueName: \"kubernetes.io/projected/cb7ac233-85fa-46ee-adfe-bda53b65d98e-kube-api-access-5d4xz\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:06 crc kubenswrapper[4939]: I0318 18:10:06.085300 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-697pj" event={"ID":"cb7ac233-85fa-46ee-adfe-bda53b65d98e","Type":"ContainerDied","Data":"b9c3792e0532e2a201f095e4bc84c69d41935bed7c6378c92dd825514923e365"} Mar 18 18:10:06 crc kubenswrapper[4939]: I0318 18:10:06.085346 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c3792e0532e2a201f095e4bc84c69d41935bed7c6378c92dd825514923e365" Mar 18 18:10:06 crc kubenswrapper[4939]: I0318 18:10:06.086121 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-697pj" Mar 18 18:10:06 crc kubenswrapper[4939]: I0318 18:10:06.540832 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-q59pt"] Mar 18 18:10:06 crc kubenswrapper[4939]: I0318 18:10:06.550719 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-q59pt"] Mar 18 18:10:08 crc kubenswrapper[4939]: I0318 18:10:08.156190 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec60c192-b8b8-43c4-9b5b-774bbe78cf83" path="/var/lib/kubelet/pods/ec60c192-b8b8-43c4-9b5b-774bbe78cf83/volumes" Mar 18 18:10:13 crc kubenswrapper[4939]: I0318 18:10:13.405787 4939 scope.go:117] "RemoveContainer" containerID="766f379c2f61a0ea2e7b941a0b60cfe1acdd2e49a8ed3aea9b4ac2a1152b5042" Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.241519 4939 generic.go:334] "Generic (PLEG): container finished" podID="3ac0c304-d5b1-4255-8ff6-6902662b9e37" containerID="a68da5733d5ce5776bf61c00c581efe98130866281825ae58c5d7aa958dd897f" exitCode=137 Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.241616 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"3ac0c304-d5b1-4255-8ff6-6902662b9e37","Type":"ContainerDied","Data":"a68da5733d5ce5776bf61c00c581efe98130866281825ae58c5d7aa958dd897f"} Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.486440 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.611652 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\") pod \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.612005 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wswcx\" (UniqueName: \"kubernetes.io/projected/3ac0c304-d5b1-4255-8ff6-6902662b9e37-kube-api-access-wswcx\") pod \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\" (UID: \"3ac0c304-d5b1-4255-8ff6-6902662b9e37\") " Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.663831 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac0c304-d5b1-4255-8ff6-6902662b9e37-kube-api-access-wswcx" (OuterVolumeSpecName: "kube-api-access-wswcx") pod "3ac0c304-d5b1-4255-8ff6-6902662b9e37" (UID: "3ac0c304-d5b1-4255-8ff6-6902662b9e37"). InnerVolumeSpecName "kube-api-access-wswcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.714376 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wswcx\" (UniqueName: \"kubernetes.io/projected/3ac0c304-d5b1-4255-8ff6-6902662b9e37-kube-api-access-wswcx\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.761086 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc" (OuterVolumeSpecName: "mariadb-data") pod "3ac0c304-d5b1-4255-8ff6-6902662b9e37" (UID: "3ac0c304-d5b1-4255-8ff6-6902662b9e37"). InnerVolumeSpecName "pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.817642 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\") on node \"crc\" " Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.848724 4939 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.849725 4939 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc") on node "crc" Mar 18 18:10:17 crc kubenswrapper[4939]: I0318 18:10:17.918744 4939 reconciler_common.go:293] "Volume detached for volume \"pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64d9bdd8-7ffa-43c2-aac0-1be95e6adddc\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:18 crc kubenswrapper[4939]: I0318 18:10:18.257376 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"3ac0c304-d5b1-4255-8ff6-6902662b9e37","Type":"ContainerDied","Data":"99a6c8197d43a324ce61f025880961ae18bad2d3b43a9ec61ca5b8f5bee64ebf"} Mar 18 18:10:18 crc kubenswrapper[4939]: I0318 18:10:18.257785 4939 scope.go:117] "RemoveContainer" containerID="a68da5733d5ce5776bf61c00c581efe98130866281825ae58c5d7aa958dd897f" Mar 18 18:10:18 crc kubenswrapper[4939]: I0318 18:10:18.257551 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 18:10:18 crc kubenswrapper[4939]: I0318 18:10:18.287253 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 18:10:18 crc kubenswrapper[4939]: I0318 18:10:18.298438 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 18:10:18 crc kubenswrapper[4939]: I0318 18:10:18.867527 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 18:10:18 crc kubenswrapper[4939]: I0318 18:10:18.867752 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="7152a687-3421-4f86-9844-74fd521bff9c" containerName="adoption" containerID="cri-o://05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69" gracePeriod=30 Mar 18 18:10:20 crc kubenswrapper[4939]: I0318 18:10:20.153492 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac0c304-d5b1-4255-8ff6-6902662b9e37" path="/var/lib/kubelet/pods/3ac0c304-d5b1-4255-8ff6-6902662b9e37/volumes" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.029005 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gjct2"] Mar 18 18:10:40 crc kubenswrapper[4939]: E0318 18:10:40.030220 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7ac233-85fa-46ee-adfe-bda53b65d98e" containerName="oc" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.030237 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7ac233-85fa-46ee-adfe-bda53b65d98e" containerName="oc" Mar 18 18:10:40 crc kubenswrapper[4939]: E0318 18:10:40.030289 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac0c304-d5b1-4255-8ff6-6902662b9e37" containerName="adoption" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.030298 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac0c304-d5b1-4255-8ff6-6902662b9e37" containerName="adoption" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.030568 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac0c304-d5b1-4255-8ff6-6902662b9e37" containerName="adoption" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.030698 4939 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="cb7ac233-85fa-46ee-adfe-bda53b65d98e" containerName="oc" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.032694 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.065478 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjct2"] Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.197854 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-catalog-content\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.197985 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcvct\" (UniqueName: \"kubernetes.io/projected/02a048c5-fd57-44be-a4ae-12c87a9d042b-kube-api-access-zcvct\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.198026 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-utilities\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.299686 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-catalog-content\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.299868 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcvct\" (UniqueName: \"kubernetes.io/projected/02a048c5-fd57-44be-a4ae-12c87a9d042b-kube-api-access-zcvct\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.299904 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-utilities\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.300298 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-catalog-content\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.300430 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-utilities\") pod \"community-operators-gjct2\" (UID: 
\"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.771383 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcvct\" (UniqueName: \"kubernetes.io/projected/02a048c5-fd57-44be-a4ae-12c87a9d042b-kube-api-access-zcvct\") pod \"community-operators-gjct2\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:40 crc kubenswrapper[4939]: I0318 18:10:40.965326 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:41 crc kubenswrapper[4939]: I0318 18:10:41.576962 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjct2"] Mar 18 18:10:41 crc kubenswrapper[4939]: W0318 18:10:41.586671 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a048c5_fd57_44be_a4ae_12c87a9d042b.slice/crio-19ae10d583d66274c9552d1fd5d21b13867076b11debcd4a66083ebaf9764f19 WatchSource:0}: Error finding container 19ae10d583d66274c9552d1fd5d21b13867076b11debcd4a66083ebaf9764f19: Status 404 returned error can't find the container with id 19ae10d583d66274c9552d1fd5d21b13867076b11debcd4a66083ebaf9764f19 Mar 18 18:10:42 crc kubenswrapper[4939]: I0318 18:10:42.612602 4939 generic.go:334] "Generic (PLEG): container finished" podID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerID="22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b" exitCode=0 Mar 18 18:10:42 crc kubenswrapper[4939]: I0318 18:10:42.612677 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjct2" event={"ID":"02a048c5-fd57-44be-a4ae-12c87a9d042b","Type":"ContainerDied","Data":"22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b"} Mar 18 18:10:42 crc kubenswrapper[4939]: I0318 18:10:42.615554 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjct2" event={"ID":"02a048c5-fd57-44be-a4ae-12c87a9d042b","Type":"ContainerStarted","Data":"19ae10d583d66274c9552d1fd5d21b13867076b11debcd4a66083ebaf9764f19"} Mar 18 18:10:44 crc kubenswrapper[4939]: I0318 18:10:44.643932 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjct2" event={"ID":"02a048c5-fd57-44be-a4ae-12c87a9d042b","Type":"ContainerStarted","Data":"742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce"} Mar 18 18:10:45 crc kubenswrapper[4939]: I0318 18:10:45.657337 4939 generic.go:334] "Generic (PLEG): container finished" podID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerID="742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce" exitCode=0 Mar 18 18:10:45 crc kubenswrapper[4939]: I0318 18:10:45.657416 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjct2" event={"ID":"02a048c5-fd57-44be-a4ae-12c87a9d042b","Type":"ContainerDied","Data":"742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce"} Mar 18 18:10:46 crc kubenswrapper[4939]: I0318 18:10:46.672781 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjct2" event={"ID":"02a048c5-fd57-44be-a4ae-12c87a9d042b","Type":"ContainerStarted","Data":"28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c"} Mar 18 18:10:46 crc 
kubenswrapper[4939]: I0318 18:10:46.694971 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gjct2" podStartSLOduration=2.999376597 podStartE2EDuration="6.694951563s" podCreationTimestamp="2026-03-18 18:10:40 +0000 UTC" firstStartedPulling="2026-03-18 18:10:42.617850787 +0000 UTC m=+9207.217038448" lastFinishedPulling="2026-03-18 18:10:46.313425793 +0000 UTC m=+9210.912613414" observedRunningTime="2026-03-18 18:10:46.690253619 +0000 UTC m=+9211.289441240" watchObservedRunningTime="2026-03-18 18:10:46.694951563 +0000 UTC m=+9211.294139184" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.369459 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.517695 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/7152a687-3421-4f86-9844-74fd521bff9c-ovn-data-cert\") pod \"7152a687-3421-4f86-9844-74fd521bff9c\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.519075 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\") pod \"7152a687-3421-4f86-9844-74fd521bff9c\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.519192 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfwq\" (UniqueName: \"kubernetes.io/projected/7152a687-3421-4f86-9844-74fd521bff9c-kube-api-access-vhfwq\") pod \"7152a687-3421-4f86-9844-74fd521bff9c\" (UID: \"7152a687-3421-4f86-9844-74fd521bff9c\") " Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.526115 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7152a687-3421-4f86-9844-74fd521bff9c-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "7152a687-3421-4f86-9844-74fd521bff9c" (UID: "7152a687-3421-4f86-9844-74fd521bff9c"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.526486 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7152a687-3421-4f86-9844-74fd521bff9c-kube-api-access-vhfwq" (OuterVolumeSpecName: "kube-api-access-vhfwq") pod "7152a687-3421-4f86-9844-74fd521bff9c" (UID: "7152a687-3421-4f86-9844-74fd521bff9c"). InnerVolumeSpecName "kube-api-access-vhfwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.543485 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18" (OuterVolumeSpecName: "ovn-data") pod "7152a687-3421-4f86-9844-74fd521bff9c" (UID: "7152a687-3421-4f86-9844-74fd521bff9c"). InnerVolumeSpecName "pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.622712 4939 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/7152a687-3421-4f86-9844-74fd521bff9c-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.622802 4939 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\") on node \"crc\" " Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.622826 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfwq\" (UniqueName: \"kubernetes.io/projected/7152a687-3421-4f86-9844-74fd521bff9c-kube-api-access-vhfwq\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.658342 4939 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.658572 4939 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18") on node "crc" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.707316 4939 generic.go:334] "Generic (PLEG): container finished" podID="7152a687-3421-4f86-9844-74fd521bff9c" containerID="05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69" exitCode=137 Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.707358 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.707361 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"7152a687-3421-4f86-9844-74fd521bff9c","Type":"ContainerDied","Data":"05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69"} Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.707532 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"7152a687-3421-4f86-9844-74fd521bff9c","Type":"ContainerDied","Data":"5f730db387bab7eba004ac5a2716eda6a3a858793c1fc8c697a2b8b146d0e805"} Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.708561 4939 scope.go:117] "RemoveContainer" containerID="05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.728379 4939 reconciler_common.go:293] "Volume detached for volume \"pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c08441f-c5f2-42cd-a740-e9ca9d5dbb18\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.769420 4939 scope.go:117] "RemoveContainer" containerID="05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69" Mar 18 18:10:49 crc kubenswrapper[4939]: E0318 18:10:49.770994 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69\": container with ID starting with 05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69 not found: ID does not exist" containerID="05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.771034 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69"} err="failed to get container status \"05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69\": rpc error: code = NotFound desc = could not find container \"05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69\": container with ID starting with 05000d7942db328f5cd0c7f700fefc183dc4013682907aae9c42169ff6a88a69 not found: ID does not exist" Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.773056 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 18:10:49 crc kubenswrapper[4939]: I0318 18:10:49.781482 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 18:10:50 crc kubenswrapper[4939]: I0318 18:10:50.151657 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7152a687-3421-4f86-9844-74fd521bff9c" path="/var/lib/kubelet/pods/7152a687-3421-4f86-9844-74fd521bff9c/volumes" Mar 18 18:10:50 crc kubenswrapper[4939]: I0318 18:10:50.965708 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:50 crc kubenswrapper[4939]: I0318 18:10:50.966047 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:51 crc kubenswrapper[4939]: I0318 18:10:51.045547 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:51 crc kubenswrapper[4939]: I0318 
18:10:51.782210 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:51 crc kubenswrapper[4939]: I0318 18:10:51.863138 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjct2"] Mar 18 18:10:53 crc kubenswrapper[4939]: I0318 18:10:53.756829 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gjct2" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="registry-server" containerID="cri-o://28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c" gracePeriod=2 Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.275652 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.453310 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcvct\" (UniqueName: \"kubernetes.io/projected/02a048c5-fd57-44be-a4ae-12c87a9d042b-kube-api-access-zcvct\") pod \"02a048c5-fd57-44be-a4ae-12c87a9d042b\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.453389 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-catalog-content\") pod \"02a048c5-fd57-44be-a4ae-12c87a9d042b\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.453493 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-utilities\") pod \"02a048c5-fd57-44be-a4ae-12c87a9d042b\" (UID: \"02a048c5-fd57-44be-a4ae-12c87a9d042b\") " Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.461598 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-utilities" (OuterVolumeSpecName: "utilities") pod "02a048c5-fd57-44be-a4ae-12c87a9d042b" (UID: "02a048c5-fd57-44be-a4ae-12c87a9d042b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.480966 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a048c5-fd57-44be-a4ae-12c87a9d042b-kube-api-access-zcvct" (OuterVolumeSpecName: "kube-api-access-zcvct") pod "02a048c5-fd57-44be-a4ae-12c87a9d042b" (UID: "02a048c5-fd57-44be-a4ae-12c87a9d042b"). InnerVolumeSpecName "kube-api-access-zcvct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.508638 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02a048c5-fd57-44be-a4ae-12c87a9d042b" (UID: "02a048c5-fd57-44be-a4ae-12c87a9d042b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.556435 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcvct\" (UniqueName: \"kubernetes.io/projected/02a048c5-fd57-44be-a4ae-12c87a9d042b-kube-api-access-zcvct\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.556489 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.556549 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a048c5-fd57-44be-a4ae-12c87a9d042b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.775197 4939 generic.go:334] "Generic (PLEG): container finished" podID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerID="28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c" exitCode=0 Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.775256 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjct2" event={"ID":"02a048c5-fd57-44be-a4ae-12c87a9d042b","Type":"ContainerDied","Data":"28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c"} Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.775297 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjct2" event={"ID":"02a048c5-fd57-44be-a4ae-12c87a9d042b","Type":"ContainerDied","Data":"19ae10d583d66274c9552d1fd5d21b13867076b11debcd4a66083ebaf9764f19"} Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.775324 4939 scope.go:117] "RemoveContainer" containerID="28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.775361 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjct2" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.821794 4939 scope.go:117] "RemoveContainer" containerID="742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.853003 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjct2"] Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.867859 4939 scope.go:117] "RemoveContainer" containerID="22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.873877 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gjct2"] Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.931993 4939 scope.go:117] "RemoveContainer" containerID="28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c" Mar 18 18:10:54 crc kubenswrapper[4939]: E0318 18:10:54.932612 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c\": container with ID starting with 28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c not found: ID does not exist" containerID="28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.932697 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c"} err="failed to get container status \"28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c\": rpc error: code = NotFound desc = could not find container \"28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c\": container with ID starting with 28d52d40464ef2bc9f363b3e5713fa09ca59b22f949097cf87263a5a1eac697c not found: ID does not exist" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.932749 4939 scope.go:117] "RemoveContainer" containerID="742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce" Mar 18 18:10:54 crc kubenswrapper[4939]: E0318 18:10:54.933969 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce\": container with ID starting with 742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce not found: ID does not exist" containerID="742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.934014 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce"} err="failed to get container status \"742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce\": rpc error: code = NotFound desc = could not find container \"742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce\": container with ID starting with 742cc314e93bd20bf0453b78966032a277835c1b5e5e5f6d665ebf9b0f190bce not found: ID does not exist" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.934041 4939 scope.go:117] "RemoveContainer" containerID="22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b" Mar 18 18:10:54 crc kubenswrapper[4939]: E0318 18:10:54.934485 4939 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b\": container with ID starting with 22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b not found: ID does not exist" containerID="22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b" Mar 18 18:10:54 crc kubenswrapper[4939]: I0318 18:10:54.934551 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b"} err="failed to get container status \"22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b\": rpc error: code = NotFound desc = could not find container \"22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b\": container with ID starting with 22c360e73306be3c66540e654daf87f912f387a6db0932b7a3b7f18ee4f3cd8b not found: ID does not exist" Mar 18 18:10:56 crc kubenswrapper[4939]: I0318 18:10:56.155978 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" path="/var/lib/kubelet/pods/02a048c5-fd57-44be-a4ae-12c87a9d042b/volumes" Mar 18 18:11:23 crc kubenswrapper[4939]: I0318 18:11:23.687580 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:11:23 crc kubenswrapper[4939]: I0318 18:11:23.688289 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.101353 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfhwl/must-gather-r8cqx"] Mar 18 18:11:44 crc kubenswrapper[4939]: E0318 18:11:44.102750 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7152a687-3421-4f86-9844-74fd521bff9c" containerName="adoption" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.102769 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7152a687-3421-4f86-9844-74fd521bff9c" containerName="adoption" Mar 18 18:11:44 crc kubenswrapper[4939]: E0318 18:11:44.102805 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="extract-content" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.102812 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="extract-content" Mar 18 18:11:44 crc kubenswrapper[4939]: E0318 18:11:44.102845 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="registry-server" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.102854 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="registry-server" Mar 18 18:11:44 crc kubenswrapper[4939]: E0318 18:11:44.102867 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="extract-utilities" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.102874 
4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="extract-utilities" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.103097 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a048c5-fd57-44be-a4ae-12c87a9d042b" containerName="registry-server" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.103118 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7152a687-3421-4f86-9844-74fd521bff9c" containerName="adoption" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.104684 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.113862 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gfhwl"/"openshift-service-ca.crt" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.114237 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gfhwl"/"kube-root-ca.crt" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.121937 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-must-gather-output\") pod \"must-gather-r8cqx\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.122039 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-kube-api-access-k8flm\") pod \"must-gather-r8cqx\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.162682 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gfhwl/must-gather-r8cqx"] Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.224166 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-must-gather-output\") pod \"must-gather-r8cqx\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.224251 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-kube-api-access-k8flm\") pod \"must-gather-r8cqx\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.225082 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-must-gather-output\") pod \"must-gather-r8cqx\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.246824 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-kube-api-access-k8flm\") pod 
\"must-gather-r8cqx\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.368781 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nf67s"] Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.377372 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.385909 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf67s"] Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.425077 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.432280 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sph8\" (UniqueName: \"kubernetes.io/projected/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-kube-api-access-7sph8\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.432352 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-utilities\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.432526 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-catalog-content\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.534927 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-catalog-content\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.535233 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sph8\" (UniqueName: \"kubernetes.io/projected/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-kube-api-access-7sph8\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.535264 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-utilities\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.535776 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-utilities\") pod \"redhat-marketplace-nf67s\" (UID: 
\"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.535986 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-catalog-content\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.576295 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sph8\" (UniqueName: \"kubernetes.io/projected/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-kube-api-access-7sph8\") pod \"redhat-marketplace-nf67s\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.697698 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:44 crc kubenswrapper[4939]: I0318 18:11:44.991712 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gfhwl/must-gather-r8cqx"] Mar 18 18:11:45 crc kubenswrapper[4939]: I0318 18:11:45.002316 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:11:45 crc kubenswrapper[4939]: I0318 18:11:45.004882 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf67s"] Mar 18 18:11:45 crc kubenswrapper[4939]: W0318 18:11:45.005188 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b5b97e_bf7f_4ecc_93f2_8dad1013570d.slice/crio-3cb3dcb43399718ce7362e71f283c3023578afcd824be907d8ca3a3a5b0e55c2 WatchSource:0}: Error finding container 3cb3dcb43399718ce7362e71f283c3023578afcd824be907d8ca3a3a5b0e55c2: Status 404 returned error can't find the container with id 3cb3dcb43399718ce7362e71f283c3023578afcd824be907d8ca3a3a5b0e55c2 Mar 18 18:11:45 crc kubenswrapper[4939]: I0318 18:11:45.509486 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" event={"ID":"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0","Type":"ContainerStarted","Data":"63c3326cf79d645cf947ef311ef70f6a4262ab9f89d54f83414042e3a49c0020"} Mar 18 18:11:45 crc kubenswrapper[4939]: I0318 18:11:45.512040 4939 generic.go:334] "Generic (PLEG): container finished" podID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerID="95d41101abd5ffa625e47c09ddc0bb33657b09ea63b755cd6bafae5008152e76" exitCode=0 Mar 18 18:11:45 crc kubenswrapper[4939]: I0318 18:11:45.512085 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf67s" event={"ID":"78b5b97e-bf7f-4ecc-93f2-8dad1013570d","Type":"ContainerDied","Data":"95d41101abd5ffa625e47c09ddc0bb33657b09ea63b755cd6bafae5008152e76"} Mar 18 18:11:45 crc kubenswrapper[4939]: I0318 18:11:45.512117 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf67s" event={"ID":"78b5b97e-bf7f-4ecc-93f2-8dad1013570d","Type":"ContainerStarted","Data":"3cb3dcb43399718ce7362e71f283c3023578afcd824be907d8ca3a3a5b0e55c2"} Mar 18 18:11:47 crc kubenswrapper[4939]: I0318 18:11:47.542159 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf67s" 
event={"ID":"78b5b97e-bf7f-4ecc-93f2-8dad1013570d","Type":"ContainerStarted","Data":"d56aef4e9e925ccca0cbf779912fcc2103fce8d5e27b55d0917f8bfea6086229"} Mar 18 18:11:49 crc kubenswrapper[4939]: I0318 18:11:49.565249 4939 generic.go:334] "Generic (PLEG): container finished" podID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerID="d56aef4e9e925ccca0cbf779912fcc2103fce8d5e27b55d0917f8bfea6086229" exitCode=0 Mar 18 18:11:49 crc kubenswrapper[4939]: I0318 18:11:49.565782 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf67s" event={"ID":"78b5b97e-bf7f-4ecc-93f2-8dad1013570d","Type":"ContainerDied","Data":"d56aef4e9e925ccca0cbf779912fcc2103fce8d5e27b55d0917f8bfea6086229"} Mar 18 18:11:53 crc kubenswrapper[4939]: I0318 18:11:53.608954 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" event={"ID":"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0","Type":"ContainerStarted","Data":"bb26d3a6ff8353baa6ac13d1aaee17cb90f07ec66b9a6b7c2ae768afd4dd2775"} Mar 18 18:11:53 crc kubenswrapper[4939]: I0318 18:11:53.610327 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" event={"ID":"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0","Type":"ContainerStarted","Data":"bbe8217c6ce60546263025f35f06d14756fd9a5c6579e75bfc91a15ebfc16ae0"} Mar 18 18:11:53 crc kubenswrapper[4939]: I0318 18:11:53.612357 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf67s" event={"ID":"78b5b97e-bf7f-4ecc-93f2-8dad1013570d","Type":"ContainerStarted","Data":"240cfe12ded1435ee27cee1e0d979066ccef43902704931f59f663d2456d2ce0"} Mar 18 18:11:53 crc kubenswrapper[4939]: I0318 18:11:53.635070 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" podStartSLOduration=2.256680124 podStartE2EDuration="9.635049891s" podCreationTimestamp="2026-03-18 18:11:44 +0000 UTC" firstStartedPulling="2026-03-18 18:11:45.002283923 +0000 UTC m=+9269.601471544" lastFinishedPulling="2026-03-18 18:11:52.38065368 +0000 UTC m=+9276.979841311" observedRunningTime="2026-03-18 18:11:53.625959002 +0000 UTC m=+9278.225146623" watchObservedRunningTime="2026-03-18 18:11:53.635049891 +0000 UTC m=+9278.234237522" Mar 18 18:11:53 crc kubenswrapper[4939]: I0318 18:11:53.680224 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nf67s" podStartSLOduration=2.8384918409999997 podStartE2EDuration="9.680201057s" podCreationTimestamp="2026-03-18 18:11:44 +0000 UTC" firstStartedPulling="2026-03-18 18:11:45.514010783 +0000 UTC m=+9270.113198454" lastFinishedPulling="2026-03-18 18:11:52.355720049 +0000 UTC m=+9276.954907670" observedRunningTime="2026-03-18 18:11:53.663236714 +0000 UTC m=+9278.262424345" watchObservedRunningTime="2026-03-18 18:11:53.680201057 +0000 UTC m=+9278.279388688" Mar 18 18:11:53 crc kubenswrapper[4939]: I0318 18:11:53.688201 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:11:53 crc kubenswrapper[4939]: I0318 18:11:53.688267 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:11:54 crc kubenswrapper[4939]: I0318 18:11:54.698369 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:54 crc kubenswrapper[4939]: I0318 18:11:54.700002 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:11:55 crc kubenswrapper[4939]: E0318 18:11:55.640721 4939 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.227:53390->38.102.83.227:41597: write tcp 38.102.83.227:53390->38.102.83.227:41597: write: broken pipe Mar 18 18:11:55 crc kubenswrapper[4939]: I0318 18:11:55.754199 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nf67s" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="registry-server" probeResult="failure" output=< Mar 18 18:11:55 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:11:55 crc kubenswrapper[4939]: > Mar 18 18:11:56 crc kubenswrapper[4939]: I0318 18:11:56.921267 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-9zllf"] Mar 18 18:11:56 crc kubenswrapper[4939]: I0318 18:11:56.923086 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:56 crc kubenswrapper[4939]: I0318 18:11:56.924992 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gfhwl"/"default-dockercfg-xrzg5" Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.042197 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8172615b-b80a-48d1-aa3b-56095dc5dbf7-host\") pod \"crc-debug-9zllf\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.043358 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kph69\" (UniqueName: \"kubernetes.io/projected/8172615b-b80a-48d1-aa3b-56095dc5dbf7-kube-api-access-kph69\") pod \"crc-debug-9zllf\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.145651 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kph69\" (UniqueName: \"kubernetes.io/projected/8172615b-b80a-48d1-aa3b-56095dc5dbf7-kube-api-access-kph69\") pod \"crc-debug-9zllf\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.145754 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8172615b-b80a-48d1-aa3b-56095dc5dbf7-host\") pod \"crc-debug-9zllf\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.145958 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8172615b-b80a-48d1-aa3b-56095dc5dbf7-host\") pod 
\"crc-debug-9zllf\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.165763 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kph69\" (UniqueName: \"kubernetes.io/projected/8172615b-b80a-48d1-aa3b-56095dc5dbf7-kube-api-access-kph69\") pod \"crc-debug-9zllf\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.241382 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:11:57 crc kubenswrapper[4939]: W0318 18:11:57.272157 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8172615b_b80a_48d1_aa3b_56095dc5dbf7.slice/crio-058a8fb79be24c722633c2a7b449d6cfef4f491060ad707c284c1817a419cde1 WatchSource:0}: Error finding container 058a8fb79be24c722633c2a7b449d6cfef4f491060ad707c284c1817a419cde1: Status 404 returned error can't find the container with id 058a8fb79be24c722633c2a7b449d6cfef4f491060ad707c284c1817a419cde1 Mar 18 18:11:57 crc kubenswrapper[4939]: I0318 18:11:57.659558 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" event={"ID":"8172615b-b80a-48d1-aa3b-56095dc5dbf7","Type":"ContainerStarted","Data":"058a8fb79be24c722633c2a7b449d6cfef4f491060ad707c284c1817a419cde1"} Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.149720 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564292-nx8kc"] Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.151636 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.158876 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-nx8kc"] Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.162619 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.162694 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.162789 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.211816 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtvht\" (UniqueName: \"kubernetes.io/projected/24ca94c0-ce00-476e-8366-611fe2dffc9a-kube-api-access-wtvht\") pod \"auto-csr-approver-29564292-nx8kc\" (UID: \"24ca94c0-ce00-476e-8366-611fe2dffc9a\") " pod="openshift-infra/auto-csr-approver-29564292-nx8kc" Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.312837 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtvht\" (UniqueName: \"kubernetes.io/projected/24ca94c0-ce00-476e-8366-611fe2dffc9a-kube-api-access-wtvht\") pod \"auto-csr-approver-29564292-nx8kc\" (UID: \"24ca94c0-ce00-476e-8366-611fe2dffc9a\") " pod="openshift-infra/auto-csr-approver-29564292-nx8kc" Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.330453 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtvht\" (UniqueName: \"kubernetes.io/projected/24ca94c0-ce00-476e-8366-611fe2dffc9a-kube-api-access-wtvht\") pod \"auto-csr-approver-29564292-nx8kc\" (UID: \"24ca94c0-ce00-476e-8366-611fe2dffc9a\") " pod="openshift-infra/auto-csr-approver-29564292-nx8kc" Mar 18 18:12:00 crc kubenswrapper[4939]: I0318 18:12:00.537027 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" Mar 18 18:12:01 crc kubenswrapper[4939]: I0318 18:12:01.199424 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-nx8kc"] Mar 18 18:12:01 crc kubenswrapper[4939]: I0318 18:12:01.714228 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" event={"ID":"24ca94c0-ce00-476e-8366-611fe2dffc9a","Type":"ContainerStarted","Data":"12febf9ac68f6750b31339ed07578f395b8393e8ab0fbcebda4a1ff0595f2abe"} Mar 18 18:12:02 crc kubenswrapper[4939]: I0318 18:12:02.730805 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" event={"ID":"24ca94c0-ce00-476e-8366-611fe2dffc9a","Type":"ContainerStarted","Data":"84bb5c39340f8682adb4c38a0319351f06caa278d28fccf084da6db955899c10"} Mar 18 18:12:02 crc kubenswrapper[4939]: I0318 18:12:02.786139 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" podStartSLOduration=1.6329017380000002 podStartE2EDuration="2.786116785s" podCreationTimestamp="2026-03-18 18:12:00 +0000 UTC" firstStartedPulling="2026-03-18 18:12:01.205279624 +0000 UTC m=+9285.804467245" lastFinishedPulling="2026-03-18 18:12:02.358494661 +0000 UTC m=+9286.957682292" observedRunningTime="2026-03-18 18:12:02.756432979 +0000 UTC m=+9287.355620600" watchObservedRunningTime="2026-03-18 18:12:02.786116785 +0000 UTC m=+9287.385304406" Mar 18 18:12:03 crc kubenswrapper[4939]: I0318 18:12:03.740478 4939 generic.go:334] "Generic (PLEG): container finished" podID="24ca94c0-ce00-476e-8366-611fe2dffc9a" containerID="84bb5c39340f8682adb4c38a0319351f06caa278d28fccf084da6db955899c10" exitCode=0 Mar 18 18:12:03 crc kubenswrapper[4939]: I0318 18:12:03.740798 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" event={"ID":"24ca94c0-ce00-476e-8366-611fe2dffc9a","Type":"ContainerDied","Data":"84bb5c39340f8682adb4c38a0319351f06caa278d28fccf084da6db955899c10"} Mar 18 18:12:04 crc kubenswrapper[4939]: I0318 18:12:04.746550 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:12:04 crc kubenswrapper[4939]: I0318 18:12:04.823086 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:12:04 crc kubenswrapper[4939]: I0318 18:12:04.982597 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf67s"] Mar 18 18:12:06 crc kubenswrapper[4939]: I0318 18:12:06.765835 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nf67s" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="registry-server" containerID="cri-o://240cfe12ded1435ee27cee1e0d979066ccef43902704931f59f663d2456d2ce0" gracePeriod=2 Mar 18 18:12:07 crc kubenswrapper[4939]: I0318 18:12:07.780602 4939 generic.go:334] "Generic (PLEG): container finished" podID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerID="240cfe12ded1435ee27cee1e0d979066ccef43902704931f59f663d2456d2ce0" exitCode=0 Mar 18 18:12:07 crc kubenswrapper[4939]: I0318 18:12:07.780664 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf67s" 
event={"ID":"78b5b97e-bf7f-4ecc-93f2-8dad1013570d","Type":"ContainerDied","Data":"240cfe12ded1435ee27cee1e0d979066ccef43902704931f59f663d2456d2ce0"} Mar 18 18:12:09 crc kubenswrapper[4939]: I0318 18:12:09.983765 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.138197 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtvht\" (UniqueName: \"kubernetes.io/projected/24ca94c0-ce00-476e-8366-611fe2dffc9a-kube-api-access-wtvht\") pod \"24ca94c0-ce00-476e-8366-611fe2dffc9a\" (UID: \"24ca94c0-ce00-476e-8366-611fe2dffc9a\") " Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.281716 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ca94c0-ce00-476e-8366-611fe2dffc9a-kube-api-access-wtvht" (OuterVolumeSpecName: "kube-api-access-wtvht") pod "24ca94c0-ce00-476e-8366-611fe2dffc9a" (UID: "24ca94c0-ce00-476e-8366-611fe2dffc9a"). InnerVolumeSpecName "kube-api-access-wtvht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.347032 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtvht\" (UniqueName: \"kubernetes.io/projected/24ca94c0-ce00-476e-8366-611fe2dffc9a-kube-api-access-wtvht\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.414647 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.583376 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-catalog-content\") pod \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.583557 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sph8\" (UniqueName: \"kubernetes.io/projected/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-kube-api-access-7sph8\") pod \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.583583 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-utilities\") pod \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\" (UID: \"78b5b97e-bf7f-4ecc-93f2-8dad1013570d\") " Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.584222 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-utilities" (OuterVolumeSpecName: "utilities") pod "78b5b97e-bf7f-4ecc-93f2-8dad1013570d" (UID: "78b5b97e-bf7f-4ecc-93f2-8dad1013570d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.588917 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-kube-api-access-7sph8" (OuterVolumeSpecName: "kube-api-access-7sph8") pod "78b5b97e-bf7f-4ecc-93f2-8dad1013570d" (UID: "78b5b97e-bf7f-4ecc-93f2-8dad1013570d"). 
InnerVolumeSpecName "kube-api-access-7sph8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.600331 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78b5b97e-bf7f-4ecc-93f2-8dad1013570d" (UID: "78b5b97e-bf7f-4ecc-93f2-8dad1013570d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.687419 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.687593 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sph8\" (UniqueName: \"kubernetes.io/projected/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-kube-api-access-7sph8\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.687607 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b5b97e-bf7f-4ecc-93f2-8dad1013570d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.807721 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" event={"ID":"24ca94c0-ce00-476e-8366-611fe2dffc9a","Type":"ContainerDied","Data":"12febf9ac68f6750b31339ed07578f395b8393e8ab0fbcebda4a1ff0595f2abe"} Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.807768 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12febf9ac68f6750b31339ed07578f395b8393e8ab0fbcebda4a1ff0595f2abe" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.807854 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-nx8kc" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.814971 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf67s" event={"ID":"78b5b97e-bf7f-4ecc-93f2-8dad1013570d","Type":"ContainerDied","Data":"3cb3dcb43399718ce7362e71f283c3023578afcd824be907d8ca3a3a5b0e55c2"} Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.815022 4939 scope.go:117] "RemoveContainer" containerID="240cfe12ded1435ee27cee1e0d979066ccef43902704931f59f663d2456d2ce0" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.815144 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf67s" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.817804 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" event={"ID":"8172615b-b80a-48d1-aa3b-56095dc5dbf7","Type":"ContainerStarted","Data":"9076a9355c90fb5d341997b7c2cd1afaadbdb2a8e259d0e10a03f51052a1b683"} Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.840053 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" podStartSLOduration=2.17378398 podStartE2EDuration="14.84003729s" podCreationTimestamp="2026-03-18 18:11:56 +0000 UTC" firstStartedPulling="2026-03-18 18:11:57.276030151 +0000 UTC m=+9281.875217772" lastFinishedPulling="2026-03-18 18:12:09.942283461 +0000 UTC m=+9294.541471082" observedRunningTime="2026-03-18 18:12:10.830433026 +0000 UTC m=+9295.429620647" watchObservedRunningTime="2026-03-18 18:12:10.84003729 +0000 UTC m=+9295.439224911" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.843925 4939 scope.go:117] "RemoveContainer" containerID="d56aef4e9e925ccca0cbf779912fcc2103fce8d5e27b55d0917f8bfea6086229" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.862559 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf67s"] Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.871238 4939 scope.go:117] "RemoveContainer" containerID="95d41101abd5ffa625e47c09ddc0bb33657b09ea63b755cd6bafae5008152e76" Mar 18 18:12:10 crc kubenswrapper[4939]: I0318 18:12:10.871397 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf67s"] Mar 18 18:12:11 crc kubenswrapper[4939]: I0318 18:12:11.064948 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-6dbl2"] Mar 18 18:12:11 crc kubenswrapper[4939]: I0318 18:12:11.073708 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-6dbl2"] Mar 18 18:12:12 crc kubenswrapper[4939]: I0318 18:12:12.147752 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ef1f08-489e-41a1-b8eb-75b2af95a166" path="/var/lib/kubelet/pods/44ef1f08-489e-41a1-b8eb-75b2af95a166/volumes" Mar 18 18:12:12 crc kubenswrapper[4939]: I0318 18:12:12.148957 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" path="/var/lib/kubelet/pods/78b5b97e-bf7f-4ecc-93f2-8dad1013570d/volumes" Mar 18 18:12:13 crc kubenswrapper[4939]: I0318 18:12:13.633276 4939 scope.go:117] "RemoveContainer" containerID="260472d574bc8f94211fbc2bad8b486c8535e161ddb26d8a40db0ff302dd4efa" Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 18:12:23.687048 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 18:12:23.687526 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 
18:12:23.687561 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 18:12:23.688302 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baf26336179a9aa7048c63b1536f6ff0057993e0969fa3d21d19f7183d08db2c"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 18:12:23.688351 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://baf26336179a9aa7048c63b1536f6ff0057993e0969fa3d21d19f7183d08db2c" gracePeriod=600 Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 18:12:23.958847 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="baf26336179a9aa7048c63b1536f6ff0057993e0969fa3d21d19f7183d08db2c" exitCode=0 Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 18:12:23.958900 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"baf26336179a9aa7048c63b1536f6ff0057993e0969fa3d21d19f7183d08db2c"} Mar 18 18:12:23 crc kubenswrapper[4939]: I0318 18:12:23.958941 4939 scope.go:117] "RemoveContainer" containerID="67753ecd11ce91741304d6ec94bbc5d0215b7baba6a3a37bbef45b10b1017b15" Mar 18 18:12:24 crc kubenswrapper[4939]: I0318 18:12:24.971741 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd"} Mar 18 18:12:37 crc kubenswrapper[4939]: I0318 18:12:37.090949 4939 generic.go:334] "Generic (PLEG): container finished" podID="8172615b-b80a-48d1-aa3b-56095dc5dbf7" containerID="9076a9355c90fb5d341997b7c2cd1afaadbdb2a8e259d0e10a03f51052a1b683" exitCode=0 Mar 18 18:12:37 crc kubenswrapper[4939]: I0318 18:12:37.090991 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" event={"ID":"8172615b-b80a-48d1-aa3b-56095dc5dbf7","Type":"ContainerDied","Data":"9076a9355c90fb5d341997b7c2cd1afaadbdb2a8e259d0e10a03f51052a1b683"} Mar 18 18:12:38 crc kubenswrapper[4939]: I0318 18:12:38.918025 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:12:38 crc kubenswrapper[4939]: I0318 18:12:38.948935 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-9zllf"] Mar 18 18:12:38 crc kubenswrapper[4939]: I0318 18:12:38.957393 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-9zllf"] Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.026408 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kph69\" (UniqueName: \"kubernetes.io/projected/8172615b-b80a-48d1-aa3b-56095dc5dbf7-kube-api-access-kph69\") pod \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.026833 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8172615b-b80a-48d1-aa3b-56095dc5dbf7-host\") pod \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\" (UID: \"8172615b-b80a-48d1-aa3b-56095dc5dbf7\") " Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.026959 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8172615b-b80a-48d1-aa3b-56095dc5dbf7-host" (OuterVolumeSpecName: "host") pod "8172615b-b80a-48d1-aa3b-56095dc5dbf7" (UID: "8172615b-b80a-48d1-aa3b-56095dc5dbf7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.027465 4939 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8172615b-b80a-48d1-aa3b-56095dc5dbf7-host\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.033737 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8172615b-b80a-48d1-aa3b-56095dc5dbf7-kube-api-access-kph69" (OuterVolumeSpecName: "kube-api-access-kph69") pod "8172615b-b80a-48d1-aa3b-56095dc5dbf7" (UID: "8172615b-b80a-48d1-aa3b-56095dc5dbf7"). InnerVolumeSpecName "kube-api-access-kph69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.113784 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="058a8fb79be24c722633c2a7b449d6cfef4f491060ad707c284c1817a419cde1" Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.113839 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-9zllf" Mar 18 18:12:39 crc kubenswrapper[4939]: I0318 18:12:39.130061 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kph69\" (UniqueName: \"kubernetes.io/projected/8172615b-b80a-48d1-aa3b-56095dc5dbf7-kube-api-access-kph69\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.149692 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8172615b-b80a-48d1-aa3b-56095dc5dbf7" path="/var/lib/kubelet/pods/8172615b-b80a-48d1-aa3b-56095dc5dbf7/volumes" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.266403 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-x5kjs"] Mar 18 18:12:40 crc kubenswrapper[4939]: E0318 18:12:40.266962 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="extract-utilities" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.266980 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="extract-utilities" Mar 18 18:12:40 crc kubenswrapper[4939]: E0318 18:12:40.267004 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ca94c0-ce00-476e-8366-611fe2dffc9a" containerName="oc" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.267012 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ca94c0-ce00-476e-8366-611fe2dffc9a" containerName="oc" Mar 18 18:12:40 crc kubenswrapper[4939]: E0318 18:12:40.267027 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8172615b-b80a-48d1-aa3b-56095dc5dbf7" containerName="container-00" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.267036 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="8172615b-b80a-48d1-aa3b-56095dc5dbf7" containerName="container-00" Mar 18 18:12:40 crc kubenswrapper[4939]: E0318 18:12:40.267049 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="registry-server" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.267058 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="registry-server" Mar 18 18:12:40 crc kubenswrapper[4939]: E0318 18:12:40.267080 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="extract-content" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.267088 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="extract-content" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.267338 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b5b97e-bf7f-4ecc-93f2-8dad1013570d" containerName="registry-server" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.267363 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="8172615b-b80a-48d1-aa3b-56095dc5dbf7" containerName="container-00" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.267374 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ca94c0-ce00-476e-8366-611fe2dffc9a" containerName="oc" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.268295 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.303608 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gfhwl"/"default-dockercfg-xrzg5" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.359428 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-host\") pod \"crc-debug-x5kjs\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.359929 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88rm\" (UniqueName: \"kubernetes.io/projected/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-kube-api-access-h88rm\") pod \"crc-debug-x5kjs\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.462174 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-host\") pod \"crc-debug-x5kjs\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.462301 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-host\") pod \"crc-debug-x5kjs\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.462439 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h88rm\" (UniqueName: \"kubernetes.io/projected/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-kube-api-access-h88rm\") pod \"crc-debug-x5kjs\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.782121 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88rm\" (UniqueName: \"kubernetes.io/projected/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-kube-api-access-h88rm\") pod \"crc-debug-x5kjs\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:40 crc kubenswrapper[4939]: I0318 18:12:40.931793 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:41 crc kubenswrapper[4939]: I0318 18:12:41.138660 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" event={"ID":"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde","Type":"ContainerStarted","Data":"f264fe0829b4afd6eeb5ba7560518762371eb1d8b6cc855bd3c8ec830c57dde5"} Mar 18 18:12:42 crc kubenswrapper[4939]: I0318 18:12:42.150258 4939 generic.go:334] "Generic (PLEG): container finished" podID="e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde" containerID="64175fe1337bd5c1040c1c01a792348cc4aefe46d138a4ee8f57677b2a588e8b" exitCode=0 Mar 18 18:12:42 crc kubenswrapper[4939]: I0318 18:12:42.150716 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" event={"ID":"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde","Type":"ContainerDied","Data":"64175fe1337bd5c1040c1c01a792348cc4aefe46d138a4ee8f57677b2a588e8b"} Mar 18 18:12:42 crc kubenswrapper[4939]: I0318 18:12:42.335140 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-x5kjs"] Mar 18 18:12:42 crc kubenswrapper[4939]: I0318 18:12:42.348540 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-x5kjs"] Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.276130 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.323584 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h88rm\" (UniqueName: \"kubernetes.io/projected/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-kube-api-access-h88rm\") pod \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.323890 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-host\") pod \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\" (UID: \"e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde\") " Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.323947 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-host" (OuterVolumeSpecName: "host") pod "e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde" (UID: "e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.324525 4939 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-host\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.336079 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-kube-api-access-h88rm" (OuterVolumeSpecName: "kube-api-access-h88rm") pod "e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde" (UID: "e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde"). InnerVolumeSpecName "kube-api-access-h88rm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.426864 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h88rm\" (UniqueName: \"kubernetes.io/projected/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde-kube-api-access-h88rm\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.535782 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-dhsxv"] Mar 18 18:12:43 crc kubenswrapper[4939]: E0318 18:12:43.536154 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde" containerName="container-00" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.536170 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde" containerName="container-00" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.536389 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde" containerName="container-00" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.537075 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.629985 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvzss\" (UniqueName: \"kubernetes.io/projected/f6269377-3c07-465b-9c52-daeab7031e8e-kube-api-access-mvzss\") pod \"crc-debug-dhsxv\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.630072 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6269377-3c07-465b-9c52-daeab7031e8e-host\") pod \"crc-debug-dhsxv\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.732682 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvzss\" (UniqueName: \"kubernetes.io/projected/f6269377-3c07-465b-9c52-daeab7031e8e-kube-api-access-mvzss\") pod \"crc-debug-dhsxv\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.732777 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6269377-3c07-465b-9c52-daeab7031e8e-host\") pod \"crc-debug-dhsxv\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.732964 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6269377-3c07-465b-9c52-daeab7031e8e-host\") pod \"crc-debug-dhsxv\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.750754 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvzss\" (UniqueName: \"kubernetes.io/projected/f6269377-3c07-465b-9c52-daeab7031e8e-kube-api-access-mvzss\") pod \"crc-debug-dhsxv\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " 
pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: I0318 18:12:43.867313 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:43 crc kubenswrapper[4939]: W0318 18:12:43.893130 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6269377_3c07_465b_9c52_daeab7031e8e.slice/crio-1a8ebef688efe5cf745519bf829dd67366b8ba03205734f1d9cb2eb8a2a028b5 WatchSource:0}: Error finding container 1a8ebef688efe5cf745519bf829dd67366b8ba03205734f1d9cb2eb8a2a028b5: Status 404 returned error can't find the container with id 1a8ebef688efe5cf745519bf829dd67366b8ba03205734f1d9cb2eb8a2a028b5 Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.145628 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde" path="/var/lib/kubelet/pods/e76f1bd1-1a71-44fd-91e8-a7c5dcb48cde/volumes" Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.179611 4939 scope.go:117] "RemoveContainer" containerID="64175fe1337bd5c1040c1c01a792348cc4aefe46d138a4ee8f57677b2a588e8b" Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.179721 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-x5kjs" Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.181547 4939 generic.go:334] "Generic (PLEG): container finished" podID="f6269377-3c07-465b-9c52-daeab7031e8e" containerID="19321b9cfaa1edb901dbb6b349386d69e67d730b6e40a760576865c2e207e573" exitCode=0 Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.181589 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" event={"ID":"f6269377-3c07-465b-9c52-daeab7031e8e","Type":"ContainerDied","Data":"19321b9cfaa1edb901dbb6b349386d69e67d730b6e40a760576865c2e207e573"} Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.181623 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" event={"ID":"f6269377-3c07-465b-9c52-daeab7031e8e","Type":"ContainerStarted","Data":"1a8ebef688efe5cf745519bf829dd67366b8ba03205734f1d9cb2eb8a2a028b5"} Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.232967 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-dhsxv"] Mar 18 18:12:44 crc kubenswrapper[4939]: I0318 18:12:44.246057 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfhwl/crc-debug-dhsxv"] Mar 18 18:12:45 crc kubenswrapper[4939]: I0318 18:12:45.293482 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:12:45 crc kubenswrapper[4939]: I0318 18:12:45.363263 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6269377-3c07-465b-9c52-daeab7031e8e-host\") pod \"f6269377-3c07-465b-9c52-daeab7031e8e\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " Mar 18 18:12:45 crc kubenswrapper[4939]: I0318 18:12:45.363321 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvzss\" (UniqueName: \"kubernetes.io/projected/f6269377-3c07-465b-9c52-daeab7031e8e-kube-api-access-mvzss\") pod \"f6269377-3c07-465b-9c52-daeab7031e8e\" (UID: \"f6269377-3c07-465b-9c52-daeab7031e8e\") " Mar 18 18:12:45 crc kubenswrapper[4939]: I0318 18:12:45.363351 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6269377-3c07-465b-9c52-daeab7031e8e-host" (OuterVolumeSpecName: "host") pod "f6269377-3c07-465b-9c52-daeab7031e8e" (UID: "f6269377-3c07-465b-9c52-daeab7031e8e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:12:45 crc kubenswrapper[4939]: I0318 18:12:45.364750 4939 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6269377-3c07-465b-9c52-daeab7031e8e-host\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:45 crc kubenswrapper[4939]: I0318 18:12:45.372204 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6269377-3c07-465b-9c52-daeab7031e8e-kube-api-access-mvzss" (OuterVolumeSpecName: "kube-api-access-mvzss") pod "f6269377-3c07-465b-9c52-daeab7031e8e" (UID: "f6269377-3c07-465b-9c52-daeab7031e8e"). InnerVolumeSpecName "kube-api-access-mvzss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:12:45 crc kubenswrapper[4939]: I0318 18:12:45.466434 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvzss\" (UniqueName: \"kubernetes.io/projected/f6269377-3c07-465b-9c52-daeab7031e8e-kube-api-access-mvzss\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:46 crc kubenswrapper[4939]: I0318 18:12:46.143990 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6269377-3c07-465b-9c52-daeab7031e8e" path="/var/lib/kubelet/pods/f6269377-3c07-465b-9c52-daeab7031e8e/volumes" Mar 18 18:12:46 crc kubenswrapper[4939]: I0318 18:12:46.201373 4939 scope.go:117] "RemoveContainer" containerID="19321b9cfaa1edb901dbb6b349386d69e67d730b6e40a760576865c2e207e573" Mar 18 18:12:46 crc kubenswrapper[4939]: I0318 18:12:46.201456 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/crc-debug-dhsxv" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.771444 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pcw6z"] Mar 18 18:13:37 crc kubenswrapper[4939]: E0318 18:13:37.772291 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6269377-3c07-465b-9c52-daeab7031e8e" containerName="container-00" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.772303 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6269377-3c07-465b-9c52-daeab7031e8e" containerName="container-00" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.772538 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6269377-3c07-465b-9c52-daeab7031e8e" containerName="container-00" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.773927 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.798894 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-catalog-content\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.798982 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-utilities\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.799726 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gxdp\" (UniqueName: \"kubernetes.io/projected/723c5c2f-0bee-4647-afa1-06b3250293e8-kube-api-access-8gxdp\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.806747 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pcw6z"] Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.902757 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gxdp\" (UniqueName: \"kubernetes.io/projected/723c5c2f-0bee-4647-afa1-06b3250293e8-kube-api-access-8gxdp\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.902856 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-catalog-content\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.902888 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-utilities\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " 
pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.903403 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-utilities\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.903948 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-catalog-content\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:37 crc kubenswrapper[4939]: I0318 18:13:37.931270 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gxdp\" (UniqueName: \"kubernetes.io/projected/723c5c2f-0bee-4647-afa1-06b3250293e8-kube-api-access-8gxdp\") pod \"redhat-operators-pcw6z\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:38 crc kubenswrapper[4939]: I0318 18:13:38.093365 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:39 crc kubenswrapper[4939]: I0318 18:13:39.109517 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pcw6z"] Mar 18 18:13:39 crc kubenswrapper[4939]: I0318 18:13:39.892320 4939 generic.go:334] "Generic (PLEG): container finished" podID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerID="d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3" exitCode=0 Mar 18 18:13:39 crc kubenswrapper[4939]: I0318 18:13:39.892618 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcw6z" event={"ID":"723c5c2f-0bee-4647-afa1-06b3250293e8","Type":"ContainerDied","Data":"d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3"} Mar 18 18:13:39 crc kubenswrapper[4939]: I0318 18:13:39.893347 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcw6z" event={"ID":"723c5c2f-0bee-4647-afa1-06b3250293e8","Type":"ContainerStarted","Data":"0ff062f212d0c5d3a81e759e8849237be6dc012699a928869040099dbc9573f0"} Mar 18 18:13:41 crc kubenswrapper[4939]: I0318 18:13:41.920725 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcw6z" event={"ID":"723c5c2f-0bee-4647-afa1-06b3250293e8","Type":"ContainerStarted","Data":"562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595"} Mar 18 18:13:46 crc kubenswrapper[4939]: I0318 18:13:46.011615 4939 generic.go:334] "Generic (PLEG): container finished" podID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerID="562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595" exitCode=0 Mar 18 18:13:46 crc kubenswrapper[4939]: I0318 18:13:46.011742 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcw6z" event={"ID":"723c5c2f-0bee-4647-afa1-06b3250293e8","Type":"ContainerDied","Data":"562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595"} Mar 18 18:13:47 crc kubenswrapper[4939]: I0318 18:13:47.028075 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcw6z" 
event={"ID":"723c5c2f-0bee-4647-afa1-06b3250293e8","Type":"ContainerStarted","Data":"7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785"} Mar 18 18:13:47 crc kubenswrapper[4939]: I0318 18:13:47.092740 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pcw6z" podStartSLOduration=3.538410008 podStartE2EDuration="10.092718306s" podCreationTimestamp="2026-03-18 18:13:37 +0000 UTC" firstStartedPulling="2026-03-18 18:13:39.897058457 +0000 UTC m=+9384.496246078" lastFinishedPulling="2026-03-18 18:13:46.451366715 +0000 UTC m=+9391.050554376" observedRunningTime="2026-03-18 18:13:47.079013096 +0000 UTC m=+9391.678200747" watchObservedRunningTime="2026-03-18 18:13:47.092718306 +0000 UTC m=+9391.691905937" Mar 18 18:13:48 crc kubenswrapper[4939]: I0318 18:13:48.094183 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:48 crc kubenswrapper[4939]: I0318 18:13:48.094648 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:13:49 crc kubenswrapper[4939]: I0318 18:13:49.146745 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pcw6z" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="registry-server" probeResult="failure" output=< Mar 18 18:13:49 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:13:49 crc kubenswrapper[4939]: > Mar 18 18:13:59 crc kubenswrapper[4939]: I0318 18:13:59.609733 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pcw6z" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="registry-server" probeResult="failure" output=< Mar 18 18:13:59 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:13:59 crc kubenswrapper[4939]: > Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.208324 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564294-cc5xj"] Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.210050 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-cc5xj" Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.212617 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.213183 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.213317 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.219708 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-cc5xj"] Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.338789 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwb6\" (UniqueName: \"kubernetes.io/projected/3715f5b6-8ba1-4df8-ac62-f6d70af480b7-kube-api-access-htwb6\") pod \"auto-csr-approver-29564294-cc5xj\" (UID: \"3715f5b6-8ba1-4df8-ac62-f6d70af480b7\") " pod="openshift-infra/auto-csr-approver-29564294-cc5xj" Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.441119 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwb6\" (UniqueName: \"kubernetes.io/projected/3715f5b6-8ba1-4df8-ac62-f6d70af480b7-kube-api-access-htwb6\") pod \"auto-csr-approver-29564294-cc5xj\" (UID: \"3715f5b6-8ba1-4df8-ac62-f6d70af480b7\") " pod="openshift-infra/auto-csr-approver-29564294-cc5xj" Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.678806 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwb6\" (UniqueName: \"kubernetes.io/projected/3715f5b6-8ba1-4df8-ac62-f6d70af480b7-kube-api-access-htwb6\") pod \"auto-csr-approver-29564294-cc5xj\" (UID: \"3715f5b6-8ba1-4df8-ac62-f6d70af480b7\") " pod="openshift-infra/auto-csr-approver-29564294-cc5xj" Mar 18 18:14:00 crc kubenswrapper[4939]: I0318 18:14:00.838460 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-cc5xj" Mar 18 18:14:01 crc kubenswrapper[4939]: I0318 18:14:01.346845 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-cc5xj"] Mar 18 18:14:01 crc kubenswrapper[4939]: W0318 18:14:01.358145 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3715f5b6_8ba1_4df8_ac62_f6d70af480b7.slice/crio-23103a297dfa2c6abe2c904c7cae55d610521795fc1a84e539d8919ec057d832 WatchSource:0}: Error finding container 23103a297dfa2c6abe2c904c7cae55d610521795fc1a84e539d8919ec057d832: Status 404 returned error can't find the container with id 23103a297dfa2c6abe2c904c7cae55d610521795fc1a84e539d8919ec057d832 Mar 18 18:14:02 crc kubenswrapper[4939]: I0318 18:14:02.207305 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-cc5xj" event={"ID":"3715f5b6-8ba1-4df8-ac62-f6d70af480b7","Type":"ContainerStarted","Data":"23103a297dfa2c6abe2c904c7cae55d610521795fc1a84e539d8919ec057d832"} Mar 18 18:14:03 crc kubenswrapper[4939]: I0318 18:14:03.229343 4939 generic.go:334] "Generic (PLEG): container finished" podID="3715f5b6-8ba1-4df8-ac62-f6d70af480b7" containerID="b678080dfc3d4d817c0c6e20335b3a39fcd1cd88a77854a6a1097e77d0f08dd1" exitCode=0 Mar 18 18:14:03 crc kubenswrapper[4939]: I0318 18:14:03.229911 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-cc5xj" event={"ID":"3715f5b6-8ba1-4df8-ac62-f6d70af480b7","Type":"ContainerDied","Data":"b678080dfc3d4d817c0c6e20335b3a39fcd1cd88a77854a6a1097e77d0f08dd1"} Mar 18 18:14:04 crc kubenswrapper[4939]: I0318 18:14:04.619190 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-cc5xj" Mar 18 18:14:04 crc kubenswrapper[4939]: I0318 18:14:04.677466 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htwb6\" (UniqueName: \"kubernetes.io/projected/3715f5b6-8ba1-4df8-ac62-f6d70af480b7-kube-api-access-htwb6\") pod \"3715f5b6-8ba1-4df8-ac62-f6d70af480b7\" (UID: \"3715f5b6-8ba1-4df8-ac62-f6d70af480b7\") " Mar 18 18:14:04 crc kubenswrapper[4939]: I0318 18:14:04.684659 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3715f5b6-8ba1-4df8-ac62-f6d70af480b7-kube-api-access-htwb6" (OuterVolumeSpecName: "kube-api-access-htwb6") pod "3715f5b6-8ba1-4df8-ac62-f6d70af480b7" (UID: "3715f5b6-8ba1-4df8-ac62-f6d70af480b7"). InnerVolumeSpecName "kube-api-access-htwb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:14:04 crc kubenswrapper[4939]: I0318 18:14:04.780196 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htwb6\" (UniqueName: \"kubernetes.io/projected/3715f5b6-8ba1-4df8-ac62-f6d70af480b7-kube-api-access-htwb6\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:05 crc kubenswrapper[4939]: I0318 18:14:05.260143 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-cc5xj" event={"ID":"3715f5b6-8ba1-4df8-ac62-f6d70af480b7","Type":"ContainerDied","Data":"23103a297dfa2c6abe2c904c7cae55d610521795fc1a84e539d8919ec057d832"} Mar 18 18:14:05 crc kubenswrapper[4939]: I0318 18:14:05.260402 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23103a297dfa2c6abe2c904c7cae55d610521795fc1a84e539d8919ec057d832" Mar 18 18:14:05 crc kubenswrapper[4939]: I0318 18:14:05.260258 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-cc5xj" Mar 18 18:14:05 crc kubenswrapper[4939]: I0318 18:14:05.731938 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-6k2lz"] Mar 18 18:14:05 crc kubenswrapper[4939]: I0318 18:14:05.748909 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-6k2lz"] Mar 18 18:14:06 crc kubenswrapper[4939]: I0318 18:14:06.153114 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6fea9c-aa9c-4300-859f-087606188a84" path="/var/lib/kubelet/pods/df6fea9c-aa9c-4300-859f-087606188a84/volumes" Mar 18 18:14:09 crc kubenswrapper[4939]: I0318 18:14:09.167722 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pcw6z" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="registry-server" probeResult="failure" output=< Mar 18 18:14:09 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:14:09 crc kubenswrapper[4939]: > Mar 18 18:14:13 crc kubenswrapper[4939]: I0318 18:14:13.809199 4939 scope.go:117] "RemoveContainer" containerID="9ffbd8fcd4290427a8b9cd19a94a4d9b22ba4845e6c5fc2ce4edb9d8bb9beb51" Mar 18 18:14:18 crc kubenswrapper[4939]: I0318 18:14:18.196132 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:14:18 crc kubenswrapper[4939]: I0318 18:14:18.290643 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:14:18 crc kubenswrapper[4939]: I0318 18:14:18.443332 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pcw6z"] Mar 18 18:14:19 crc kubenswrapper[4939]: I0318 18:14:19.441307 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pcw6z" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="registry-server" containerID="cri-o://7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785" gracePeriod=2 Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.061079 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.153634 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-utilities\") pod \"723c5c2f-0bee-4647-afa1-06b3250293e8\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.153934 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-catalog-content\") pod \"723c5c2f-0bee-4647-afa1-06b3250293e8\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.154172 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gxdp\" (UniqueName: \"kubernetes.io/projected/723c5c2f-0bee-4647-afa1-06b3250293e8-kube-api-access-8gxdp\") pod \"723c5c2f-0bee-4647-afa1-06b3250293e8\" (UID: \"723c5c2f-0bee-4647-afa1-06b3250293e8\") " Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.154782 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-utilities" (OuterVolumeSpecName: "utilities") pod "723c5c2f-0bee-4647-afa1-06b3250293e8" (UID: "723c5c2f-0bee-4647-afa1-06b3250293e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.161954 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723c5c2f-0bee-4647-afa1-06b3250293e8-kube-api-access-8gxdp" (OuterVolumeSpecName: "kube-api-access-8gxdp") pod "723c5c2f-0bee-4647-afa1-06b3250293e8" (UID: "723c5c2f-0bee-4647-afa1-06b3250293e8"). InnerVolumeSpecName "kube-api-access-8gxdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.257123 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gxdp\" (UniqueName: \"kubernetes.io/projected/723c5c2f-0bee-4647-afa1-06b3250293e8-kube-api-access-8gxdp\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.257169 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.323365 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "723c5c2f-0bee-4647-afa1-06b3250293e8" (UID: "723c5c2f-0bee-4647-afa1-06b3250293e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.365931 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c5c2f-0bee-4647-afa1-06b3250293e8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.451377 4939 generic.go:334] "Generic (PLEG): container finished" podID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerID="7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785" exitCode=0 Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.451422 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcw6z" event={"ID":"723c5c2f-0bee-4647-afa1-06b3250293e8","Type":"ContainerDied","Data":"7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785"} Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.451454 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcw6z" event={"ID":"723c5c2f-0bee-4647-afa1-06b3250293e8","Type":"ContainerDied","Data":"0ff062f212d0c5d3a81e759e8849237be6dc012699a928869040099dbc9573f0"} Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.451471 4939 scope.go:117] "RemoveContainer" containerID="7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.451484 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pcw6z" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.471792 4939 scope.go:117] "RemoveContainer" containerID="562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.492901 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pcw6z"] Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.503591 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pcw6z"] Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.517372 4939 scope.go:117] "RemoveContainer" containerID="d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.549781 4939 scope.go:117] "RemoveContainer" containerID="7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785" Mar 18 18:14:20 crc kubenswrapper[4939]: E0318 18:14:20.550234 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785\": container with ID starting with 7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785 not found: ID does not exist" containerID="7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785" Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.550267 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785"} err="failed to get container status \"7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785\": rpc error: code = NotFound desc = could not find container \"7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785\": container with ID starting with 7a10bea0c6dd5edd435a77be54b89dc3cad9639daaf0e3a011361da68b11e785 not found: ID does not exist" Mar 18 18:14:20 crc 
Mar 18 18:14:20 crc kubenswrapper[4939]: E0318 18:14:20.550548 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595\": container with ID starting with 562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595 not found: ID does not exist" containerID="562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595"
Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.550586 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595"} err="failed to get container status \"562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595\": rpc error: code = NotFound desc = could not find container \"562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595\": container with ID starting with 562cbd036cd6cf256e26a04d5deca3d64baa69e96c59b9c51b2db38302772595 not found: ID does not exist"
Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.550612 4939 scope.go:117] "RemoveContainer" containerID="d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3"
Mar 18 18:14:20 crc kubenswrapper[4939]: E0318 18:14:20.550877 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3\": container with ID starting with d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3 not found: ID does not exist" containerID="d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3"
Mar 18 18:14:20 crc kubenswrapper[4939]: I0318 18:14:20.550899 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3"} err="failed to get container status \"d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3\": rpc error: code = NotFound desc = could not find container \"d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3\": container with ID starting with d9388967b317126a57446ee0b8516fd78bb53a6f9d8c3fda0888bff5a00a07b3 not found: ID does not exist"
Mar 18 18:14:22 crc kubenswrapper[4939]: I0318 18:14:22.152664 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" path="/var/lib/kubelet/pods/723c5c2f-0bee-4647-afa1-06b3250293e8/volumes"
Mar 18 18:14:53 crc kubenswrapper[4939]: I0318 18:14:53.687230 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:14:53 crc kubenswrapper[4939]: I0318 18:14:53.687791 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.173214 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2"]
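The RemoveContainer → "ContainerStatus from runtime service failed" → "DeleteContainer returned error" triples above look alarming at E level but are benign: by the time the kubelet retries removal, the containers are already gone on the CRI side (pruned with the sandbox), and deletion is treated as idempotent. The same tolerate-NotFound pattern applies when cleaning up through the API; a small sketch, same client assumptions as earlier:

```python
from kubernetes import client, config
from kubernetes.client.rest import ApiException

config.load_kube_config()
v1 = client.CoreV1Api()

try:
    v1.delete_namespaced_pod("redhat-operators-pcw6z", "openshift-marketplace")
except ApiException as e:
    if e.status != 404:  # NotFound just means someone finished the cleanup first
        raise
```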
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2"] Mar 18 18:15:00 crc kubenswrapper[4939]: E0318 18:15:00.176096 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3715f5b6-8ba1-4df8-ac62-f6d70af480b7" containerName="oc" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.176137 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3715f5b6-8ba1-4df8-ac62-f6d70af480b7" containerName="oc" Mar 18 18:15:00 crc kubenswrapper[4939]: E0318 18:15:00.176189 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="registry-server" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.176200 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="registry-server" Mar 18 18:15:00 crc kubenswrapper[4939]: E0318 18:15:00.176249 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="extract-content" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.176258 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="extract-content" Mar 18 18:15:00 crc kubenswrapper[4939]: E0318 18:15:00.176314 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="extract-utilities" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.176324 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="extract-utilities" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.177188 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="723c5c2f-0bee-4647-afa1-06b3250293e8" containerName="registry-server" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.177241 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3715f5b6-8ba1-4df8-ac62-f6d70af480b7" containerName="oc" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.178590 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.186390 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.186990 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.200774 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2"] Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.272376 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59cafae5-8b14-4269-b97d-7ef188aea7a7-secret-volume\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.272468 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cafae5-8b14-4269-b97d-7ef188aea7a7-config-volume\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.272494 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7jgg\" (UniqueName: \"kubernetes.io/projected/59cafae5-8b14-4269-b97d-7ef188aea7a7-kube-api-access-l7jgg\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.374177 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cafae5-8b14-4269-b97d-7ef188aea7a7-config-volume\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.374264 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7jgg\" (UniqueName: \"kubernetes.io/projected/59cafae5-8b14-4269-b97d-7ef188aea7a7-kube-api-access-l7jgg\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.374547 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59cafae5-8b14-4269-b97d-7ef188aea7a7-secret-volume\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.377044 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cafae5-8b14-4269-b97d-7ef188aea7a7-config-volume\") pod 
\"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.381248 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59cafae5-8b14-4269-b97d-7ef188aea7a7-secret-volume\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.394870 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7jgg\" (UniqueName: \"kubernetes.io/projected/59cafae5-8b14-4269-b97d-7ef188aea7a7-kube-api-access-l7jgg\") pod \"collect-profiles-29564295-qnjv2\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:00 crc kubenswrapper[4939]: I0318 18:15:00.523206 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:01 crc kubenswrapper[4939]: I0318 18:15:01.023477 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2"] Mar 18 18:15:01 crc kubenswrapper[4939]: I0318 18:15:01.958446 4939 generic.go:334] "Generic (PLEG): container finished" podID="59cafae5-8b14-4269-b97d-7ef188aea7a7" containerID="0437066abe4f366f7bfebde8452623a69e63e9d7495abc68c4bedc1d13fec298" exitCode=0 Mar 18 18:15:01 crc kubenswrapper[4939]: I0318 18:15:01.958577 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" event={"ID":"59cafae5-8b14-4269-b97d-7ef188aea7a7","Type":"ContainerDied","Data":"0437066abe4f366f7bfebde8452623a69e63e9d7495abc68c4bedc1d13fec298"} Mar 18 18:15:01 crc kubenswrapper[4939]: I0318 18:15:01.958855 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" event={"ID":"59cafae5-8b14-4269-b97d-7ef188aea7a7","Type":"ContainerStarted","Data":"16c40d8ceef68eff604fdb0fc793a6f81181b366846209f6857623dbd4d93b7a"} Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.117693 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.285116 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cafae5-8b14-4269-b97d-7ef188aea7a7-config-volume\") pod \"59cafae5-8b14-4269-b97d-7ef188aea7a7\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.285197 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59cafae5-8b14-4269-b97d-7ef188aea7a7-secret-volume\") pod \"59cafae5-8b14-4269-b97d-7ef188aea7a7\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.285290 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7jgg\" (UniqueName: \"kubernetes.io/projected/59cafae5-8b14-4269-b97d-7ef188aea7a7-kube-api-access-l7jgg\") pod \"59cafae5-8b14-4269-b97d-7ef188aea7a7\" (UID: \"59cafae5-8b14-4269-b97d-7ef188aea7a7\") " Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.285825 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59cafae5-8b14-4269-b97d-7ef188aea7a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "59cafae5-8b14-4269-b97d-7ef188aea7a7" (UID: "59cafae5-8b14-4269-b97d-7ef188aea7a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.287818 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59cafae5-8b14-4269-b97d-7ef188aea7a7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.297561 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59cafae5-8b14-4269-b97d-7ef188aea7a7-kube-api-access-l7jgg" (OuterVolumeSpecName: "kube-api-access-l7jgg") pod "59cafae5-8b14-4269-b97d-7ef188aea7a7" (UID: "59cafae5-8b14-4269-b97d-7ef188aea7a7"). InnerVolumeSpecName "kube-api-access-l7jgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.297693 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59cafae5-8b14-4269-b97d-7ef188aea7a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "59cafae5-8b14-4269-b97d-7ef188aea7a7" (UID: "59cafae5-8b14-4269-b97d-7ef188aea7a7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.390430 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59cafae5-8b14-4269-b97d-7ef188aea7a7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:04 crc kubenswrapper[4939]: I0318 18:15:04.390481 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7jgg\" (UniqueName: \"kubernetes.io/projected/59cafae5-8b14-4269-b97d-7ef188aea7a7-kube-api-access-l7jgg\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:05 crc kubenswrapper[4939]: I0318 18:15:05.014598 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" event={"ID":"59cafae5-8b14-4269-b97d-7ef188aea7a7","Type":"ContainerDied","Data":"16c40d8ceef68eff604fdb0fc793a6f81181b366846209f6857623dbd4d93b7a"} Mar 18 18:15:05 crc kubenswrapper[4939]: I0318 18:15:05.015044 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16c40d8ceef68eff604fdb0fc793a6f81181b366846209f6857623dbd4d93b7a" Mar 18 18:15:05 crc kubenswrapper[4939]: I0318 18:15:05.014667 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-qnjv2" Mar 18 18:15:05 crc kubenswrapper[4939]: I0318 18:15:05.223351 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq"] Mar 18 18:15:05 crc kubenswrapper[4939]: I0318 18:15:05.238037 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564250-dp5sq"] Mar 18 18:15:06 crc kubenswrapper[4939]: I0318 18:15:06.154707 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7263eb2-dd4c-4cf5-ac7b-eae0748723e2" path="/var/lib/kubelet/pods/e7263eb2-dd4c-4cf5-ac7b-eae0748723e2/volumes" Mar 18 18:15:14 crc kubenswrapper[4939]: I0318 18:15:14.106144 4939 scope.go:117] "RemoveContainer" containerID="044fd6eadf5968fac6fb8d64214674386444b57560c8399c1c6016759c71f945" Mar 18 18:15:23 crc kubenswrapper[4939]: I0318 18:15:23.686867 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:15:23 crc kubenswrapper[4939]: I0318 18:15:23.687446 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:15:53 crc kubenswrapper[4939]: I0318 18:15:53.687394 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:15:53 crc kubenswrapper[4939]: I0318 18:15:53.687994 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:15:53 crc kubenswrapper[4939]: I0318 18:15:53.688040 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 18:15:53 crc kubenswrapper[4939]: I0318 18:15:53.688699 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:15:53 crc kubenswrapper[4939]: I0318 18:15:53.688743 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" gracePeriod=600 Mar 18 18:15:54 crc kubenswrapper[4939]: E0318 18:15:54.326358 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:15:54 crc kubenswrapper[4939]: I0318 18:15:54.671342 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" exitCode=0 Mar 18 18:15:54 crc kubenswrapper[4939]: I0318 18:15:54.671392 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd"} Mar 18 18:15:54 crc kubenswrapper[4939]: I0318 18:15:54.671496 4939 scope.go:117] "RemoveContainer" containerID="baf26336179a9aa7048c63b1536f6ff0057993e0969fa3d21d19f7183d08db2c" Mar 18 18:15:54 crc kubenswrapper[4939]: I0318 18:15:54.677846 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:15:54 crc kubenswrapper[4939]: E0318 18:15:54.679030 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.176065 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564296-5nmxb"] Mar 18 18:16:00 crc kubenswrapper[4939]: E0318 18:16:00.177046 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59cafae5-8b14-4269-b97d-7ef188aea7a7" containerName="collect-profiles" Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.177321 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="59cafae5-8b14-4269-b97d-7ef188aea7a7" containerName="collect-profiles"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.178197 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-5nmxb"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.182403 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.182406 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.183477 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.192194 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-5nmxb"]
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.342702 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp8fr\" (UniqueName: \"kubernetes.io/projected/6758c9fd-1c5c-43ba-8123-02f7c4010b0a-kube-api-access-gp8fr\") pod \"auto-csr-approver-29564296-5nmxb\" (UID: \"6758c9fd-1c5c-43ba-8123-02f7c4010b0a\") " pod="openshift-infra/auto-csr-approver-29564296-5nmxb"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.446075 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp8fr\" (UniqueName: \"kubernetes.io/projected/6758c9fd-1c5c-43ba-8123-02f7c4010b0a-kube-api-access-gp8fr\") pod \"auto-csr-approver-29564296-5nmxb\" (UID: \"6758c9fd-1c5c-43ba-8123-02f7c4010b0a\") " pod="openshift-infra/auto-csr-approver-29564296-5nmxb"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.469914 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp8fr\" (UniqueName: \"kubernetes.io/projected/6758c9fd-1c5c-43ba-8123-02f7c4010b0a-kube-api-access-gp8fr\") pod \"auto-csr-approver-29564296-5nmxb\" (UID: \"6758c9fd-1c5c-43ba-8123-02f7c4010b0a\") " pod="openshift-infra/auto-csr-approver-29564296-5nmxb"
Mar 18 18:16:00 crc kubenswrapper[4939]: I0318 18:16:00.507270 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-5nmxb"
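With the liveness kill at 18:15:53 the machine-config-daemon enters CrashLoopBackOff. The kubelet's restart backoff starts at 10s and doubles per failed restart up to a 5-minute cap, which is the "back-off 5m0s" quoted in every "Error syncing pod" entry; the daemon has evidently been failing long enough to sit at the cap, so the RemoveContainer/error pairs below simply repeat until the backoff window expires. The schedule, for reference:

```python
# kubelet container restart backoff: 10s, doubling, capped at 5 minutes
delay, cap = 10, 300
for attempt in range(8):
    print(f"after failure {attempt}: wait {delay}s")
    delay = min(delay * 2, cap)  # 10, 20, 40, 80, 160, 300, 300, ...
```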
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" Mar 18 18:16:01 crc kubenswrapper[4939]: I0318 18:16:01.066025 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-5nmxb"] Mar 18 18:16:01 crc kubenswrapper[4939]: I0318 18:16:01.777537 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" event={"ID":"6758c9fd-1c5c-43ba-8123-02f7c4010b0a","Type":"ContainerStarted","Data":"7768d68fe72b7960308029627ab21a6859159f139c189cda6369b21b20fda0a7"} Mar 18 18:16:02 crc kubenswrapper[4939]: I0318 18:16:02.789990 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" event={"ID":"6758c9fd-1c5c-43ba-8123-02f7c4010b0a","Type":"ContainerStarted","Data":"d896bafeb6a131736fef33485f694ef588481455f75d83adbf16ada3739362b8"} Mar 18 18:16:02 crc kubenswrapper[4939]: I0318 18:16:02.816262 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" podStartSLOduration=1.4618315640000001 podStartE2EDuration="2.816237073s" podCreationTimestamp="2026-03-18 18:16:00 +0000 UTC" firstStartedPulling="2026-03-18 18:16:01.054232417 +0000 UTC m=+9525.653420078" lastFinishedPulling="2026-03-18 18:16:02.408637936 +0000 UTC m=+9527.007825587" observedRunningTime="2026-03-18 18:16:02.805022094 +0000 UTC m=+9527.404209755" watchObservedRunningTime="2026-03-18 18:16:02.816237073 +0000 UTC m=+9527.415424734" Mar 18 18:16:03 crc kubenswrapper[4939]: I0318 18:16:03.808661 4939 generic.go:334] "Generic (PLEG): container finished" podID="6758c9fd-1c5c-43ba-8123-02f7c4010b0a" containerID="d896bafeb6a131736fef33485f694ef588481455f75d83adbf16ada3739362b8" exitCode=0 Mar 18 18:16:03 crc kubenswrapper[4939]: I0318 18:16:03.808752 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" event={"ID":"6758c9fd-1c5c-43ba-8123-02f7c4010b0a","Type":"ContainerDied","Data":"d896bafeb6a131736fef33485f694ef588481455f75d83adbf16ada3739362b8"} Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.211822 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.234916 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp8fr\" (UniqueName: \"kubernetes.io/projected/6758c9fd-1c5c-43ba-8123-02f7c4010b0a-kube-api-access-gp8fr\") pod \"6758c9fd-1c5c-43ba-8123-02f7c4010b0a\" (UID: \"6758c9fd-1c5c-43ba-8123-02f7c4010b0a\") " Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.241911 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6758c9fd-1c5c-43ba-8123-02f7c4010b0a-kube-api-access-gp8fr" (OuterVolumeSpecName: "kube-api-access-gp8fr") pod "6758c9fd-1c5c-43ba-8123-02f7c4010b0a" (UID: "6758c9fd-1c5c-43ba-8123-02f7c4010b0a"). InnerVolumeSpecName "kube-api-access-gp8fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.336742 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp8fr\" (UniqueName: \"kubernetes.io/projected/6758c9fd-1c5c-43ba-8123-02f7c4010b0a-kube-api-access-gp8fr\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.852749 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" event={"ID":"6758c9fd-1c5c-43ba-8123-02f7c4010b0a","Type":"ContainerDied","Data":"7768d68fe72b7960308029627ab21a6859159f139c189cda6369b21b20fda0a7"} Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.853020 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7768d68fe72b7960308029627ab21a6859159f139c189cda6369b21b20fda0a7" Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.852829 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-5nmxb" Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.879907 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-697pj"] Mar 18 18:16:05 crc kubenswrapper[4939]: I0318 18:16:05.892925 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-697pj"] Mar 18 18:16:06 crc kubenswrapper[4939]: I0318 18:16:06.152018 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7ac233-85fa-46ee-adfe-bda53b65d98e" path="/var/lib/kubelet/pods/cb7ac233-85fa-46ee-adfe-bda53b65d98e/volumes" Mar 18 18:16:08 crc kubenswrapper[4939]: I0318 18:16:08.133718 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:16:08 crc kubenswrapper[4939]: E0318 18:16:08.135623 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:16:10 crc kubenswrapper[4939]: I0318 18:16:10.772178 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="339c0a2d-5c97-4d3b-84d7-a8731c708236" containerName="galera" probeResult="failure" output="command timed out" Mar 18 18:16:14 crc kubenswrapper[4939]: I0318 18:16:14.211873 4939 scope.go:117] "RemoveContainer" containerID="cef53e9cfafdcea948f33e89b2f55b261322c4f2d9f5eb6ec4fbe81bbf5344c3" Mar 18 18:16:19 crc kubenswrapper[4939]: I0318 18:16:19.134198 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:16:19 crc kubenswrapper[4939]: E0318 18:16:19.137054 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:16:34 crc kubenswrapper[4939]: I0318 18:16:34.133686 4939 
scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:16:34 crc kubenswrapper[4939]: E0318 18:16:34.134663 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:16:47 crc kubenswrapper[4939]: I0318 18:16:47.132946 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:16:47 crc kubenswrapper[4939]: E0318 18:16:47.133803 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:16:59 crc kubenswrapper[4939]: I0318 18:16:59.133989 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:16:59 crc kubenswrapper[4939]: E0318 18:16:59.135069 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:17:13 crc kubenswrapper[4939]: I0318 18:17:13.134132 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:17:13 crc kubenswrapper[4939]: E0318 18:17:13.135189 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:17:26 crc kubenswrapper[4939]: I0318 18:17:26.154085 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:17:26 crc kubenswrapper[4939]: E0318 18:17:26.155263 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:17:41 crc kubenswrapper[4939]: I0318 18:17:41.132806 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:17:41 crc kubenswrapper[4939]: E0318 18:17:41.133678 4939 
Mar 18 18:17:55 crc kubenswrapper[4939]: I0318 18:17:55.134597 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd"
Mar 18 18:17:55 crc kubenswrapper[4939]: E0318 18:17:55.135833 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"
Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.183330 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564298-88lqw"]
Mar 18 18:18:00 crc kubenswrapper[4939]: E0318 18:18:00.184691 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6758c9fd-1c5c-43ba-8123-02f7c4010b0a" containerName="oc"
Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.184714 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="6758c9fd-1c5c-43ba-8123-02f7c4010b0a" containerName="oc"
Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.185170 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="6758c9fd-1c5c-43ba-8123-02f7c4010b0a" containerName="oc"
Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.186587 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-88lqw"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-88lqw" Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.190709 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.191083 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.192138 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.195806 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-88lqw"] Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.197312 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wwhn\" (UniqueName: \"kubernetes.io/projected/4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd-kube-api-access-4wwhn\") pod \"auto-csr-approver-29564298-88lqw\" (UID: \"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd\") " pod="openshift-infra/auto-csr-approver-29564298-88lqw" Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.301794 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wwhn\" (UniqueName: \"kubernetes.io/projected/4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd-kube-api-access-4wwhn\") pod \"auto-csr-approver-29564298-88lqw\" (UID: \"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd\") " pod="openshift-infra/auto-csr-approver-29564298-88lqw" Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.338891 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wwhn\" (UniqueName: \"kubernetes.io/projected/4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd-kube-api-access-4wwhn\") pod \"auto-csr-approver-29564298-88lqw\" (UID: \"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd\") " pod="openshift-infra/auto-csr-approver-29564298-88lqw" Mar 18 18:18:00 crc kubenswrapper[4939]: I0318 18:18:00.538169 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-88lqw" Mar 18 18:18:01 crc kubenswrapper[4939]: I0318 18:18:01.052782 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-88lqw"] Mar 18 18:18:01 crc kubenswrapper[4939]: I0318 18:18:01.059277 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:18:01 crc kubenswrapper[4939]: I0318 18:18:01.707145 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-88lqw" event={"ID":"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd","Type":"ContainerStarted","Data":"529d9106c663f15095561550a6b6ba89ce25c9d821b57cc6d31e9bfdd048e49c"} Mar 18 18:18:03 crc kubenswrapper[4939]: I0318 18:18:03.525009 4939 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-zc9lq container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.1.172:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:18:03 crc kubenswrapper[4939]: I0318 18:18:03.525373 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" podUID="cdcc74fa-7b06-489c-bdfc-fe75965f4aa3" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.172:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:18:03 crc kubenswrapper[4939]: I0318 18:18:03.526014 4939 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-zc9lq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.172:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:18:03 crc kubenswrapper[4939]: I0318 18:18:03.526043 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-zc9lq" podUID="cdcc74fa-7b06-489c-bdfc-fe75965f4aa3" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.172:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:18:04 crc kubenswrapper[4939]: I0318 18:18:04.763939 4939 generic.go:334] "Generic (PLEG): container finished" podID="4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd" containerID="b1676d2e9f024a0efae51218002bbd246ef9e1ffb5eedb6322d254d92c8671a7" exitCode=0 Mar 18 18:18:04 crc kubenswrapper[4939]: I0318 18:18:04.764008 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-88lqw" event={"ID":"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd","Type":"ContainerDied","Data":"b1676d2e9f024a0efae51218002bbd246ef9e1ffb5eedb6322d254d92c8671a7"} Mar 18 18:18:06 crc kubenswrapper[4939]: I0318 18:18:06.150499 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:18:06 crc kubenswrapper[4939]: E0318 18:18:06.151048 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:18:06 crc 
kubenswrapper[4939]: I0318 18:18:06.244492 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-88lqw" Mar 18 18:18:06 crc kubenswrapper[4939]: I0318 18:18:06.413087 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wwhn\" (UniqueName: \"kubernetes.io/projected/4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd-kube-api-access-4wwhn\") pod \"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd\" (UID: \"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd\") " Mar 18 18:18:06 crc kubenswrapper[4939]: I0318 18:18:06.419973 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd-kube-api-access-4wwhn" (OuterVolumeSpecName: "kube-api-access-4wwhn") pod "4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd" (UID: "4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd"). InnerVolumeSpecName "kube-api-access-4wwhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:18:06 crc kubenswrapper[4939]: I0318 18:18:06.516080 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wwhn\" (UniqueName: \"kubernetes.io/projected/4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd-kube-api-access-4wwhn\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:06 crc kubenswrapper[4939]: I0318 18:18:06.786926 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-88lqw" event={"ID":"4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd","Type":"ContainerDied","Data":"529d9106c663f15095561550a6b6ba89ce25c9d821b57cc6d31e9bfdd048e49c"} Mar 18 18:18:06 crc kubenswrapper[4939]: I0318 18:18:06.786979 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-88lqw" Mar 18 18:18:06 crc kubenswrapper[4939]: I0318 18:18:06.786982 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="529d9106c663f15095561550a6b6ba89ce25c9d821b57cc6d31e9bfdd048e49c" Mar 18 18:18:07 crc kubenswrapper[4939]: I0318 18:18:07.319734 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-nx8kc"] Mar 18 18:18:07 crc kubenswrapper[4939]: I0318 18:18:07.331872 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-nx8kc"] Mar 18 18:18:08 crc kubenswrapper[4939]: I0318 18:18:08.169791 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24ca94c0-ce00-476e-8366-611fe2dffc9a" path="/var/lib/kubelet/pods/24ca94c0-ce00-476e-8366-611fe2dffc9a/volumes" Mar 18 18:18:14 crc kubenswrapper[4939]: I0318 18:18:14.357030 4939 scope.go:117] "RemoveContainer" containerID="9076a9355c90fb5d341997b7c2cd1afaadbdb2a8e259d0e10a03f51052a1b683" Mar 18 18:18:14 crc kubenswrapper[4939]: I0318 18:18:14.404814 4939 scope.go:117] "RemoveContainer" containerID="84bb5c39340f8682adb4c38a0319351f06caa278d28fccf084da6db955899c10" Mar 18 18:18:19 crc kubenswrapper[4939]: I0318 18:18:19.135467 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:18:19 crc kubenswrapper[4939]: E0318 18:18:19.136428 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:18:34 crc kubenswrapper[4939]: I0318 18:18:34.135355 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:18:34 crc kubenswrapper[4939]: E0318 18:18:34.137067 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.475852 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tlnhf"] Mar 18 18:18:38 crc kubenswrapper[4939]: E0318 18:18:38.476811 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd" containerName="oc" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.476832 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd" containerName="oc" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.477202 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd" containerName="oc" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.479234 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.495551 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlnhf"] Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.549252 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljqg\" (UniqueName: \"kubernetes.io/projected/86563b8a-bd94-463f-9527-1ca104e5794f-kube-api-access-7ljqg\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.549351 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-catalog-content\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.549543 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-utilities\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.651320 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-utilities\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc 
kubenswrapper[4939]: I0318 18:18:38.651409 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljqg\" (UniqueName: \"kubernetes.io/projected/86563b8a-bd94-463f-9527-1ca104e5794f-kube-api-access-7ljqg\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.651480 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-catalog-content\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.652105 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-catalog-content\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.652379 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-utilities\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.782711 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljqg\" (UniqueName: \"kubernetes.io/projected/86563b8a-bd94-463f-9527-1ca104e5794f-kube-api-access-7ljqg\") pod \"certified-operators-tlnhf\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:38 crc kubenswrapper[4939]: I0318 18:18:38.820918 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:39 crc kubenswrapper[4939]: I0318 18:18:39.342018 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlnhf"] Mar 18 18:18:40 crc kubenswrapper[4939]: I0318 18:18:40.251082 4939 generic.go:334] "Generic (PLEG): container finished" podID="86563b8a-bd94-463f-9527-1ca104e5794f" containerID="d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4" exitCode=0 Mar 18 18:18:40 crc kubenswrapper[4939]: I0318 18:18:40.251339 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlnhf" event={"ID":"86563b8a-bd94-463f-9527-1ca104e5794f","Type":"ContainerDied","Data":"d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4"} Mar 18 18:18:40 crc kubenswrapper[4939]: I0318 18:18:40.251449 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlnhf" event={"ID":"86563b8a-bd94-463f-9527-1ca104e5794f","Type":"ContainerStarted","Data":"746b8b84fea86289cb9d54f16bbc41ff8de423528ddc9c8ec7e605d9d69c953c"} Mar 18 18:18:42 crc kubenswrapper[4939]: I0318 18:18:42.279209 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlnhf" event={"ID":"86563b8a-bd94-463f-9527-1ca104e5794f","Type":"ContainerStarted","Data":"283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a"} Mar 18 18:18:44 crc kubenswrapper[4939]: I0318 18:18:44.316171 4939 generic.go:334] "Generic (PLEG): container finished" podID="86563b8a-bd94-463f-9527-1ca104e5794f" containerID="283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a" exitCode=0 Mar 18 18:18:44 crc kubenswrapper[4939]: I0318 18:18:44.316788 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlnhf" event={"ID":"86563b8a-bd94-463f-9527-1ca104e5794f","Type":"ContainerDied","Data":"283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a"} Mar 18 18:18:45 crc kubenswrapper[4939]: I0318 18:18:45.134021 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:18:45 crc kubenswrapper[4939]: E0318 18:18:45.134514 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:18:45 crc kubenswrapper[4939]: I0318 18:18:45.346190 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlnhf" event={"ID":"86563b8a-bd94-463f-9527-1ca104e5794f","Type":"ContainerStarted","Data":"bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241"} Mar 18 18:18:45 crc kubenswrapper[4939]: I0318 18:18:45.373111 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tlnhf" podStartSLOduration=2.926331208 podStartE2EDuration="7.373089471s" podCreationTimestamp="2026-03-18 18:18:38 +0000 UTC" firstStartedPulling="2026-03-18 18:18:40.253542175 +0000 UTC m=+9684.852729796" lastFinishedPulling="2026-03-18 18:18:44.700300438 +0000 UTC m=+9689.299488059" observedRunningTime="2026-03-18 
18:18:45.368006367 +0000 UTC m=+9689.967193998" watchObservedRunningTime="2026-03-18 18:18:45.373089471 +0000 UTC m=+9689.972277102" Mar 18 18:18:48 crc kubenswrapper[4939]: I0318 18:18:48.821905 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:48 crc kubenswrapper[4939]: I0318 18:18:48.822570 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:48 crc kubenswrapper[4939]: I0318 18:18:48.874242 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:49 crc kubenswrapper[4939]: I0318 18:18:49.462211 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:49 crc kubenswrapper[4939]: I0318 18:18:49.518811 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlnhf"] Mar 18 18:18:51 crc kubenswrapper[4939]: I0318 18:18:51.427251 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tlnhf" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" containerName="registry-server" containerID="cri-o://bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241" gracePeriod=2 Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.026795 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.178261 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-utilities\") pod \"86563b8a-bd94-463f-9527-1ca104e5794f\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.178439 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-catalog-content\") pod \"86563b8a-bd94-463f-9527-1ca104e5794f\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.178793 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ljqg\" (UniqueName: \"kubernetes.io/projected/86563b8a-bd94-463f-9527-1ca104e5794f-kube-api-access-7ljqg\") pod \"86563b8a-bd94-463f-9527-1ca104e5794f\" (UID: \"86563b8a-bd94-463f-9527-1ca104e5794f\") " Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.179443 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-utilities" (OuterVolumeSpecName: "utilities") pod "86563b8a-bd94-463f-9527-1ca104e5794f" (UID: "86563b8a-bd94-463f-9527-1ca104e5794f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.180183 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.185161 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86563b8a-bd94-463f-9527-1ca104e5794f-kube-api-access-7ljqg" (OuterVolumeSpecName: "kube-api-access-7ljqg") pod "86563b8a-bd94-463f-9527-1ca104e5794f" (UID: "86563b8a-bd94-463f-9527-1ca104e5794f"). InnerVolumeSpecName "kube-api-access-7ljqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.282537 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ljqg\" (UniqueName: \"kubernetes.io/projected/86563b8a-bd94-463f-9527-1ca104e5794f-kube-api-access-7ljqg\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.441749 4939 generic.go:334] "Generic (PLEG): container finished" podID="86563b8a-bd94-463f-9527-1ca104e5794f" containerID="bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241" exitCode=0 Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.441803 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlnhf" event={"ID":"86563b8a-bd94-463f-9527-1ca104e5794f","Type":"ContainerDied","Data":"bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241"} Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.441837 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlnhf" event={"ID":"86563b8a-bd94-463f-9527-1ca104e5794f","Type":"ContainerDied","Data":"746b8b84fea86289cb9d54f16bbc41ff8de423528ddc9c8ec7e605d9d69c953c"} Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.441857 4939 scope.go:117] "RemoveContainer" containerID="bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.441859 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tlnhf" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.475958 4939 scope.go:117] "RemoveContainer" containerID="283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.511678 4939 scope.go:117] "RemoveContainer" containerID="d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.584378 4939 scope.go:117] "RemoveContainer" containerID="bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241" Mar 18 18:18:52 crc kubenswrapper[4939]: E0318 18:18:52.585034 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241\": container with ID starting with bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241 not found: ID does not exist" containerID="bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.585074 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241"} err="failed to get container status \"bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241\": rpc error: code = NotFound desc = could not find container \"bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241\": container with ID starting with bbd92dff064888a89225623bb5898a5d0a5f09844163c9ab54ab9a9d9f4e4241 not found: ID does not exist" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.585102 4939 scope.go:117] "RemoveContainer" containerID="283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a" Mar 18 18:18:52 crc kubenswrapper[4939]: E0318 18:18:52.585490 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a\": container with ID starting with 283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a not found: ID does not exist" containerID="283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.585547 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a"} err="failed to get container status \"283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a\": rpc error: code = NotFound desc = could not find container \"283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a\": container with ID starting with 283502cce5d070901c0c2e20c4a56b7cd5a534ebca49b19459e04684aa65546a not found: ID does not exist" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.585570 4939 scope.go:117] "RemoveContainer" containerID="d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4" Mar 18 18:18:52 crc kubenswrapper[4939]: E0318 18:18:52.585969 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4\": container with ID starting with d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4 not found: ID does not exist" containerID="d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4" 
Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.586001 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4"} err="failed to get container status \"d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4\": rpc error: code = NotFound desc = could not find container \"d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4\": container with ID starting with d4b06bd0e3778182dd6f18cc193eb9a4b92ad6d51c2e4165e89a7a3a9f6e67b4 not found: ID does not exist" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.629740 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86563b8a-bd94-463f-9527-1ca104e5794f" (UID: "86563b8a-bd94-463f-9527-1ca104e5794f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.690958 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86563b8a-bd94-463f-9527-1ca104e5794f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.788334 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlnhf"] Mar 18 18:18:52 crc kubenswrapper[4939]: I0318 18:18:52.802224 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tlnhf"] Mar 18 18:18:54 crc kubenswrapper[4939]: I0318 18:18:54.151039 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" path="/var/lib/kubelet/pods/86563b8a-bd94-463f-9527-1ca104e5794f/volumes" Mar 18 18:18:57 crc kubenswrapper[4939]: I0318 18:18:57.133821 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:18:57 crc kubenswrapper[4939]: E0318 18:18:57.134781 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:19:10 crc kubenswrapper[4939]: I0318 18:19:10.134965 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:19:10 crc kubenswrapper[4939]: E0318 18:19:10.136103 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:19:23 crc kubenswrapper[4939]: I0318 18:19:23.133573 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:19:23 crc kubenswrapper[4939]: E0318 18:19:23.134736 4939 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:19:34 crc kubenswrapper[4939]: I0318 18:19:34.133194 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:19:34 crc kubenswrapper[4939]: E0318 18:19:34.133876 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:19:45 crc kubenswrapper[4939]: I0318 18:19:45.134010 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:19:45 crc kubenswrapper[4939]: E0318 18:19:45.135189 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.134267 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:20:00 crc kubenswrapper[4939]: E0318 18:20:00.135436 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.168988 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564300-59nrl"] Mar 18 18:20:00 crc kubenswrapper[4939]: E0318 18:20:00.169552 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" containerName="extract-content" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.169596 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" containerName="extract-content" Mar 18 18:20:00 crc kubenswrapper[4939]: E0318 18:20:00.169627 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" containerName="registry-server" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.169639 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" containerName="registry-server" Mar 18 18:20:00 crc kubenswrapper[4939]: E0318 18:20:00.169677 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" 
containerName="extract-utilities" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.169688 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" containerName="extract-utilities" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.170028 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="86563b8a-bd94-463f-9527-1ca104e5794f" containerName="registry-server" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.171213 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-59nrl" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.174081 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.174247 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.174451 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.188031 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-59nrl"] Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.257157 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vbj\" (UniqueName: \"kubernetes.io/projected/13033ead-e00f-4743-a3cb-67c9eddcc12c-kube-api-access-77vbj\") pod \"auto-csr-approver-29564300-59nrl\" (UID: \"13033ead-e00f-4743-a3cb-67c9eddcc12c\") " pod="openshift-infra/auto-csr-approver-29564300-59nrl" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.359773 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vbj\" (UniqueName: \"kubernetes.io/projected/13033ead-e00f-4743-a3cb-67c9eddcc12c-kube-api-access-77vbj\") pod \"auto-csr-approver-29564300-59nrl\" (UID: \"13033ead-e00f-4743-a3cb-67c9eddcc12c\") " pod="openshift-infra/auto-csr-approver-29564300-59nrl" Mar 18 18:20:00 crc kubenswrapper[4939]: I0318 18:20:00.871320 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vbj\" (UniqueName: \"kubernetes.io/projected/13033ead-e00f-4743-a3cb-67c9eddcc12c-kube-api-access-77vbj\") pod \"auto-csr-approver-29564300-59nrl\" (UID: \"13033ead-e00f-4743-a3cb-67c9eddcc12c\") " pod="openshift-infra/auto-csr-approver-29564300-59nrl" Mar 18 18:20:01 crc kubenswrapper[4939]: I0318 18:20:01.113108 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-59nrl" Mar 18 18:20:01 crc kubenswrapper[4939]: I0318 18:20:01.637414 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-59nrl"] Mar 18 18:20:02 crc kubenswrapper[4939]: I0318 18:20:02.357969 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-59nrl" event={"ID":"13033ead-e00f-4743-a3cb-67c9eddcc12c","Type":"ContainerStarted","Data":"5d6b7757b03ae0f75e5066e35dd154cf3194dc59ca618b73a00b8a22f07b526c"} Mar 18 18:20:04 crc kubenswrapper[4939]: I0318 18:20:04.384194 4939 generic.go:334] "Generic (PLEG): container finished" podID="13033ead-e00f-4743-a3cb-67c9eddcc12c" containerID="2eb42162f75bc63d08b7f99a84709dd9e2bd8585bcd5e71fd37d741c7e001280" exitCode=0 Mar 18 18:20:04 crc kubenswrapper[4939]: I0318 18:20:04.384283 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-59nrl" event={"ID":"13033ead-e00f-4743-a3cb-67c9eddcc12c","Type":"ContainerDied","Data":"2eb42162f75bc63d08b7f99a84709dd9e2bd8585bcd5e71fd37d741c7e001280"} Mar 18 18:20:05 crc kubenswrapper[4939]: I0318 18:20:05.849039 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-59nrl" Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.013562 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77vbj\" (UniqueName: \"kubernetes.io/projected/13033ead-e00f-4743-a3cb-67c9eddcc12c-kube-api-access-77vbj\") pod \"13033ead-e00f-4743-a3cb-67c9eddcc12c\" (UID: \"13033ead-e00f-4743-a3cb-67c9eddcc12c\") " Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.022264 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13033ead-e00f-4743-a3cb-67c9eddcc12c-kube-api-access-77vbj" (OuterVolumeSpecName: "kube-api-access-77vbj") pod "13033ead-e00f-4743-a3cb-67c9eddcc12c" (UID: "13033ead-e00f-4743-a3cb-67c9eddcc12c"). InnerVolumeSpecName "kube-api-access-77vbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.116347 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77vbj\" (UniqueName: \"kubernetes.io/projected/13033ead-e00f-4743-a3cb-67c9eddcc12c-kube-api-access-77vbj\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.409649 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-59nrl" event={"ID":"13033ead-e00f-4743-a3cb-67c9eddcc12c","Type":"ContainerDied","Data":"5d6b7757b03ae0f75e5066e35dd154cf3194dc59ca618b73a00b8a22f07b526c"} Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.409985 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d6b7757b03ae0f75e5066e35dd154cf3194dc59ca618b73a00b8a22f07b526c" Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.409699 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-59nrl" Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.955053 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-cc5xj"] Mar 18 18:20:06 crc kubenswrapper[4939]: I0318 18:20:06.971304 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-cc5xj"] Mar 18 18:20:08 crc kubenswrapper[4939]: I0318 18:20:08.161454 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3715f5b6-8ba1-4df8-ac62-f6d70af480b7" path="/var/lib/kubelet/pods/3715f5b6-8ba1-4df8-ac62-f6d70af480b7/volumes" Mar 18 18:20:13 crc kubenswrapper[4939]: I0318 18:20:13.134429 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:20:13 crc kubenswrapper[4939]: E0318 18:20:13.135598 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:20:14 crc kubenswrapper[4939]: I0318 18:20:14.596374 4939 scope.go:117] "RemoveContainer" containerID="b678080dfc3d4d817c0c6e20335b3a39fcd1cd88a77854a6a1097e77d0f08dd1" Mar 18 18:20:27 crc kubenswrapper[4939]: I0318 18:20:27.133299 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:20:27 crc kubenswrapper[4939]: E0318 18:20:27.134223 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:20:38 crc kubenswrapper[4939]: I0318 18:20:38.134007 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:20:38 crc kubenswrapper[4939]: E0318 18:20:38.134905 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:20:53 crc kubenswrapper[4939]: I0318 18:20:53.132820 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:20:53 crc kubenswrapper[4939]: E0318 18:20:53.133731 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 
18:21:05 crc kubenswrapper[4939]: I0318 18:21:05.133256 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:21:06 crc kubenswrapper[4939]: I0318 18:21:06.220468 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"cd5a8b2df6c9c9a32e5accb27af0b355fd1544708ba4c93ea25fe817def4fbc5"} Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.924131 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k6v65"] Mar 18 18:21:11 crc kubenswrapper[4939]: E0318 18:21:11.925091 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13033ead-e00f-4743-a3cb-67c9eddcc12c" containerName="oc" Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.925103 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="13033ead-e00f-4743-a3cb-67c9eddcc12c" containerName="oc" Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.925311 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="13033ead-e00f-4743-a3cb-67c9eddcc12c" containerName="oc" Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.926799 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.951350 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjklh\" (UniqueName: \"kubernetes.io/projected/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-kube-api-access-xjklh\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.951419 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-catalog-content\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.951439 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-utilities\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:11 crc kubenswrapper[4939]: I0318 18:21:11.971214 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6v65"] Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.053063 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjklh\" (UniqueName: \"kubernetes.io/projected/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-kube-api-access-xjklh\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.053171 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-catalog-content\") pod 
\"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.053194 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-utilities\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.053850 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-catalog-content\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.054113 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-utilities\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.077442 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjklh\" (UniqueName: \"kubernetes.io/projected/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-kube-api-access-xjklh\") pod \"community-operators-k6v65\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.265925 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:12 crc kubenswrapper[4939]: I0318 18:21:12.817037 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6v65"] Mar 18 18:21:13 crc kubenswrapper[4939]: I0318 18:21:13.323425 4939 generic.go:334] "Generic (PLEG): container finished" podID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerID="f3d468d1cd6c45677bb4bb8b0cac04fcd4ca2c5b21120a39998afb6b560e0637" exitCode=0 Mar 18 18:21:13 crc kubenswrapper[4939]: I0318 18:21:13.323536 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6v65" event={"ID":"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3","Type":"ContainerDied","Data":"f3d468d1cd6c45677bb4bb8b0cac04fcd4ca2c5b21120a39998afb6b560e0637"} Mar 18 18:21:13 crc kubenswrapper[4939]: I0318 18:21:13.323863 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6v65" event={"ID":"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3","Type":"ContainerStarted","Data":"04d2517957e76cf2b86e87213aa2a17e4b9db49fbee9fd738ad17f84e3145f13"} Mar 18 18:21:14 crc kubenswrapper[4939]: I0318 18:21:14.357879 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6v65" event={"ID":"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3","Type":"ContainerStarted","Data":"bb4d0e980483ea36cde8eac59bb0e55b71e475301f060abbbd01c4a213ce6032"} Mar 18 18:21:16 crc kubenswrapper[4939]: I0318 18:21:16.381642 4939 generic.go:334] "Generic (PLEG): container finished" podID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerID="bb4d0e980483ea36cde8eac59bb0e55b71e475301f060abbbd01c4a213ce6032" exitCode=0 Mar 18 18:21:16 crc kubenswrapper[4939]: I0318 18:21:16.381742 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6v65" event={"ID":"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3","Type":"ContainerDied","Data":"bb4d0e980483ea36cde8eac59bb0e55b71e475301f060abbbd01c4a213ce6032"} Mar 18 18:21:18 crc kubenswrapper[4939]: I0318 18:21:18.404391 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6v65" event={"ID":"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3","Type":"ContainerStarted","Data":"31a51509719451dcd6895d65ee33fca20c461bbc88af60f0697362643fb8f1a0"} Mar 18 18:21:18 crc kubenswrapper[4939]: I0318 18:21:18.434442 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6v65" podStartSLOduration=3.308482329 podStartE2EDuration="7.434420911s" podCreationTimestamp="2026-03-18 18:21:11 +0000 UTC" firstStartedPulling="2026-03-18 18:21:13.327801013 +0000 UTC m=+9837.926988634" lastFinishedPulling="2026-03-18 18:21:17.453739585 +0000 UTC m=+9842.052927216" observedRunningTime="2026-03-18 18:21:18.423769839 +0000 UTC m=+9843.022957500" watchObservedRunningTime="2026-03-18 18:21:18.434420911 +0000 UTC m=+9843.033608542" Mar 18 18:21:22 crc kubenswrapper[4939]: I0318 18:21:22.268682 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:22 crc kubenswrapper[4939]: I0318 18:21:22.269653 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:22 crc kubenswrapper[4939]: I0318 18:21:22.350255 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:22 crc kubenswrapper[4939]: I0318 18:21:22.513740 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:22 crc kubenswrapper[4939]: I0318 18:21:22.595669 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6v65"] Mar 18 18:21:24 crc kubenswrapper[4939]: I0318 18:21:24.472756 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k6v65" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="registry-server" containerID="cri-o://31a51509719451dcd6895d65ee33fca20c461bbc88af60f0697362643fb8f1a0" gracePeriod=2 Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.483483 4939 generic.go:334] "Generic (PLEG): container finished" podID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerID="31a51509719451dcd6895d65ee33fca20c461bbc88af60f0697362643fb8f1a0" exitCode=0 Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.483763 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6v65" event={"ID":"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3","Type":"ContainerDied","Data":"31a51509719451dcd6895d65ee33fca20c461bbc88af60f0697362643fb8f1a0"} Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.483876 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6v65" event={"ID":"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3","Type":"ContainerDied","Data":"04d2517957e76cf2b86e87213aa2a17e4b9db49fbee9fd738ad17f84e3145f13"} Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.483896 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04d2517957e76cf2b86e87213aa2a17e4b9db49fbee9fd738ad17f84e3145f13" Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.624259 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6v65" Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.676864 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjklh\" (UniqueName: \"kubernetes.io/projected/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-kube-api-access-xjklh\") pod \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.677071 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-utilities\") pod \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.677454 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-catalog-content\") pod \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\" (UID: \"7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3\") " Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.679781 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-utilities" (OuterVolumeSpecName: "utilities") pod "7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" (UID: "7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.689902 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-kube-api-access-xjklh" (OuterVolumeSpecName: "kube-api-access-xjklh") pod "7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" (UID: "7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3"). InnerVolumeSpecName "kube-api-access-xjklh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.761110 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" (UID: "7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.780345 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.780375 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:25 crc kubenswrapper[4939]: I0318 18:21:25.780389 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjklh\" (UniqueName: \"kubernetes.io/projected/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3-kube-api-access-xjklh\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:26 crc kubenswrapper[4939]: I0318 18:21:26.496313 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6v65"
Mar 18 18:21:26 crc kubenswrapper[4939]: I0318 18:21:26.537643 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6v65"]
Mar 18 18:21:26 crc kubenswrapper[4939]: I0318 18:21:26.557412 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k6v65"]
Mar 18 18:21:28 crc kubenswrapper[4939]: I0318 18:21:28.155678 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" path="/var/lib/kubelet/pods/7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3/volumes"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.149403 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564302-sfgst"]
Mar 18 18:22:00 crc kubenswrapper[4939]: E0318 18:22:00.150175 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="registry-server"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.150189 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="registry-server"
Mar 18 18:22:00 crc kubenswrapper[4939]: E0318 18:22:00.150211 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="extract-content"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.150216 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="extract-content"
Mar 18 18:22:00 crc kubenswrapper[4939]: E0318 18:22:00.150251 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="extract-utilities"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.150257 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="extract-utilities"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.150458 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5f29eb-6901-4b0c-b3b7-4fb3d2ee3ff3" containerName="registry-server"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.151222 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-sfgst"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.164577 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.166000 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.166001 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.174883 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-sfgst"]
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.199170 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxkjg\" (UniqueName: \"kubernetes.io/projected/a734a071-0c7a-48a7-a272-dbcef14a572d-kube-api-access-cxkjg\") pod \"auto-csr-approver-29564302-sfgst\" (UID: \"a734a071-0c7a-48a7-a272-dbcef14a572d\") " pod="openshift-infra/auto-csr-approver-29564302-sfgst"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.301484 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxkjg\" (UniqueName: \"kubernetes.io/projected/a734a071-0c7a-48a7-a272-dbcef14a572d-kube-api-access-cxkjg\") pod \"auto-csr-approver-29564302-sfgst\" (UID: \"a734a071-0c7a-48a7-a272-dbcef14a572d\") " pod="openshift-infra/auto-csr-approver-29564302-sfgst"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.322678 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxkjg\" (UniqueName: \"kubernetes.io/projected/a734a071-0c7a-48a7-a272-dbcef14a572d-kube-api-access-cxkjg\") pod \"auto-csr-approver-29564302-sfgst\" (UID: \"a734a071-0c7a-48a7-a272-dbcef14a572d\") " pod="openshift-infra/auto-csr-approver-29564302-sfgst"
Mar 18 18:22:00 crc kubenswrapper[4939]: I0318 18:22:00.478902 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-sfgst"
Mar 18 18:22:01 crc kubenswrapper[4939]: I0318 18:22:01.014988 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-sfgst"]
Mar 18 18:22:01 crc kubenswrapper[4939]: I0318 18:22:01.940709 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-sfgst" event={"ID":"a734a071-0c7a-48a7-a272-dbcef14a572d","Type":"ContainerStarted","Data":"beb36c2e96d106521399e5878c817f98a843bf32b3cf4ba3f48bbf6718ad31dd"}
Mar 18 18:22:02 crc kubenswrapper[4939]: I0318 18:22:02.952358 4939 generic.go:334] "Generic (PLEG): container finished" podID="a734a071-0c7a-48a7-a272-dbcef14a572d" containerID="d0d48c3f04f0111b7a996bb40bbf4f8bd1f9a34184fb614148ae886e44b78958" exitCode=0
Mar 18 18:22:02 crc kubenswrapper[4939]: I0318 18:22:02.952531 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-sfgst" event={"ID":"a734a071-0c7a-48a7-a272-dbcef14a572d","Type":"ContainerDied","Data":"d0d48c3f04f0111b7a996bb40bbf4f8bd1f9a34184fb614148ae886e44b78958"}
Mar 18 18:22:04 crc kubenswrapper[4939]: I0318 18:22:04.813180 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-sfgst"
Mar 18 18:22:04 crc kubenswrapper[4939]: I0318 18:22:04.914787 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxkjg\" (UniqueName: \"kubernetes.io/projected/a734a071-0c7a-48a7-a272-dbcef14a572d-kube-api-access-cxkjg\") pod \"a734a071-0c7a-48a7-a272-dbcef14a572d\" (UID: \"a734a071-0c7a-48a7-a272-dbcef14a572d\") "
Mar 18 18:22:04 crc kubenswrapper[4939]: I0318 18:22:04.921786 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a734a071-0c7a-48a7-a272-dbcef14a572d-kube-api-access-cxkjg" (OuterVolumeSpecName: "kube-api-access-cxkjg") pod "a734a071-0c7a-48a7-a272-dbcef14a572d" (UID: "a734a071-0c7a-48a7-a272-dbcef14a572d"). InnerVolumeSpecName "kube-api-access-cxkjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:04 crc kubenswrapper[4939]: I0318 18:22:04.977013 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-sfgst" event={"ID":"a734a071-0c7a-48a7-a272-dbcef14a572d","Type":"ContainerDied","Data":"beb36c2e96d106521399e5878c817f98a843bf32b3cf4ba3f48bbf6718ad31dd"}
Mar 18 18:22:04 crc kubenswrapper[4939]: I0318 18:22:04.977081 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beb36c2e96d106521399e5878c817f98a843bf32b3cf4ba3f48bbf6718ad31dd"
Mar 18 18:22:04 crc kubenswrapper[4939]: I0318 18:22:04.977172 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-sfgst"
Mar 18 18:22:05 crc kubenswrapper[4939]: I0318 18:22:05.019340 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxkjg\" (UniqueName: \"kubernetes.io/projected/a734a071-0c7a-48a7-a272-dbcef14a572d-kube-api-access-cxkjg\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:05 crc kubenswrapper[4939]: I0318 18:22:05.885087 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-5nmxb"]
Mar 18 18:22:05 crc kubenswrapper[4939]: I0318 18:22:05.897336 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-5nmxb"]
Mar 18 18:22:06 crc kubenswrapper[4939]: I0318 18:22:06.144697 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6758c9fd-1c5c-43ba-8123-02f7c4010b0a" path="/var/lib/kubelet/pods/6758c9fd-1c5c-43ba-8123-02f7c4010b0a/volumes"
Mar 18 18:22:15 crc kubenswrapper[4939]: I0318 18:22:15.810973 4939 scope.go:117] "RemoveContainer" containerID="d896bafeb6a131736fef33485f694ef588481455f75d83adbf16ada3739362b8"
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.732600 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqp7q"]
Mar 18 18:22:26 crc kubenswrapper[4939]: E0318 18:22:26.733899 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a734a071-0c7a-48a7-a272-dbcef14a572d" containerName="oc"
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.733921 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="a734a071-0c7a-48a7-a272-dbcef14a572d" containerName="oc"
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.734207 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="a734a071-0c7a-48a7-a272-dbcef14a572d" containerName="oc"
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.736266 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.752486 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqp7q"]
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.918835 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-utilities\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.918949 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-catalog-content\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:26 crc kubenswrapper[4939]: I0318 18:22:26.919055 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5gxs\" (UniqueName: \"kubernetes.io/projected/7c5a070c-5e61-450a-9f90-71f0b4d070a5-kube-api-access-v5gxs\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.020644 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5gxs\" (UniqueName: \"kubernetes.io/projected/7c5a070c-5e61-450a-9f90-71f0b4d070a5-kube-api-access-v5gxs\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.020785 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-utilities\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.020844 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-catalog-content\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.021262 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-utilities\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.021332 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-catalog-content\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.038931 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5gxs\" (UniqueName: \"kubernetes.io/projected/7c5a070c-5e61-450a-9f90-71f0b4d070a5-kube-api-access-v5gxs\") pod \"redhat-marketplace-pqp7q\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") " pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.090662 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:27 crc kubenswrapper[4939]: I0318 18:22:27.550598 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqp7q"]
Mar 18 18:22:27 crc kubenswrapper[4939]: W0318 18:22:27.552982 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5a070c_5e61_450a_9f90_71f0b4d070a5.slice/crio-757f86a0a28649eb11a37e8b881f4afe97ed8d49ff68f382025641d74734014f WatchSource:0}: Error finding container 757f86a0a28649eb11a37e8b881f4afe97ed8d49ff68f382025641d74734014f: Status 404 returned error can't find the container with id 757f86a0a28649eb11a37e8b881f4afe97ed8d49ff68f382025641d74734014f
Mar 18 18:22:28 crc kubenswrapper[4939]: I0318 18:22:28.244150 4939 generic.go:334] "Generic (PLEG): container finished" podID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerID="1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72" exitCode=0
Mar 18 18:22:28 crc kubenswrapper[4939]: I0318 18:22:28.244265 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqp7q" event={"ID":"7c5a070c-5e61-450a-9f90-71f0b4d070a5","Type":"ContainerDied","Data":"1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72"}
Mar 18 18:22:28 crc kubenswrapper[4939]: I0318 18:22:28.244462 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqp7q" event={"ID":"7c5a070c-5e61-450a-9f90-71f0b4d070a5","Type":"ContainerStarted","Data":"757f86a0a28649eb11a37e8b881f4afe97ed8d49ff68f382025641d74734014f"}
Mar 18 18:22:29 crc kubenswrapper[4939]: I0318 18:22:29.722470 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8852984-80af-496e-b8c0-95374efe6bff/init-config-reloader/0.log"
Mar 18 18:22:29 crc kubenswrapper[4939]: I0318 18:22:29.875265 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8852984-80af-496e-b8c0-95374efe6bff/init-config-reloader/0.log"
Mar 18 18:22:29 crc kubenswrapper[4939]: I0318 18:22:29.945000 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8852984-80af-496e-b8c0-95374efe6bff/alertmanager/0.log"
Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.018068 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8852984-80af-496e-b8c0-95374efe6bff/config-reloader/0.log"
Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.130877 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d425e944-b4cd-44e3-89e6-da39e5c92773/aodh-api/0.log"
Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.195561 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d425e944-b4cd-44e3-89e6-da39e5c92773/aodh-evaluator/0.log"
Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.234244 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d425e944-b4cd-44e3-89e6-da39e5c92773/aodh-listener/0.log"
path="/var/log/pods/openstack_aodh-0_d425e944-b4cd-44e3-89e6-da39e5c92773/aodh-listener/0.log" Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.269940 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqp7q" event={"ID":"7c5a070c-5e61-450a-9f90-71f0b4d070a5","Type":"ContainerStarted","Data":"bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110"} Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.366135 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d425e944-b4cd-44e3-89e6-da39e5c92773/aodh-notifier/0.log" Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.437862 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7447b48946-z557b_06d0f80a-f850-4078-8522-5e750e5d58eb/barbican-api/0.log" Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.473470 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7447b48946-z557b_06d0f80a-f850-4078-8522-5e750e5d58eb/barbican-api-log/0.log" Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.635405 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68c6d787b6-lttfq_e322cd3d-c091-4958-92fd-65512870096f/barbican-keystone-listener/0.log" Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.738990 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68c6d787b6-lttfq_e322cd3d-c091-4958-92fd-65512870096f/barbican-keystone-listener-log/0.log" Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.870586 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5cd5655f5f-5w68p_d5c95d52-2520-4a9f-b32b-9023caca3572/barbican-worker/0.log" Mar 18 18:22:30 crc kubenswrapper[4939]: I0318 18:22:30.904003 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5cd5655f5f-5w68p_d5c95d52-2520-4a9f-b32b-9023caca3572/barbican-worker-log/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.086612 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-tk5r5_44675ae9-2f87-4dff-bc11-602ed205461b/bootstrap-openstack-openstack-cell1/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.153239 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ca0fea8-0244-4bff-b58f-6073f0ab19cc/ceilometer-central-agent/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.280179 4939 generic.go:334] "Generic (PLEG): container finished" podID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerID="bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110" exitCode=0 Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.280226 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqp7q" event={"ID":"7c5a070c-5e61-450a-9f90-71f0b4d070a5","Type":"ContainerDied","Data":"bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110"} Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.290270 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ca0fea8-0244-4bff-b58f-6073f0ab19cc/ceilometer-notification-agent/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.345801 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ca0fea8-0244-4bff-b58f-6073f0ab19cc/proxy-httpd/0.log" Mar 18 18:22:31 crc 
kubenswrapper[4939]: I0318 18:22:31.375288 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ca0fea8-0244-4bff-b58f-6073f0ab19cc/sg-core/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.550789 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-2fzgn_371f7f00-12c5-423a-aa1b-a6c713d52961/ceph-client-openstack-openstack-cell1/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.666146 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a687e985-0518-48b9-84d0-94522b686c53/cinder-api/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.732380 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a687e985-0518-48b9-84d0-94522b686c53/cinder-api-log/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.931694 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_430abd0b-b430-4929-87c2-76529cddc1be/probe/0.log" Mar 18 18:22:31 crc kubenswrapper[4939]: I0318 18:22:31.936537 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_430abd0b-b430-4929-87c2-76529cddc1be/cinder-backup/0.log" Mar 18 18:22:32 crc kubenswrapper[4939]: I0318 18:22:32.352089 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_954c59a4-5386-4b38-b56f-835fbc868b42/cinder-scheduler/0.log" Mar 18 18:22:32 crc kubenswrapper[4939]: I0318 18:22:32.450862 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_954c59a4-5386-4b38-b56f-835fbc868b42/probe/0.log" Mar 18 18:22:32 crc kubenswrapper[4939]: I0318 18:22:32.455303 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_9cdb31af-1ef0-499e-9400-ce05d06c693f/cinder-volume/0.log" Mar 18 18:22:32 crc kubenswrapper[4939]: I0318 18:22:32.685526 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_9cdb31af-1ef0-499e-9400-ce05d06c693f/probe/0.log" Mar 18 18:22:32 crc kubenswrapper[4939]: I0318 18:22:32.829681 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-f9cwm_d8999dd3-4065-4ecc-b797-e62aa63e1bcb/configure-network-openstack-openstack-cell1/0.log" Mar 18 18:22:33 crc kubenswrapper[4939]: I0318 18:22:33.048828 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-8x8pz_efc5f76b-7c2a-439b-b176-cc74498f19c4/configure-os-openstack-openstack-cell1/0.log" Mar 18 18:22:33 crc kubenswrapper[4939]: I0318 18:22:33.083352 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5cdb84b55c-4pml7_6071a760-6039-4785-b7a9-dc01f0558392/init/0.log" Mar 18 18:22:33 crc kubenswrapper[4939]: I0318 18:22:33.331183 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqp7q" event={"ID":"7c5a070c-5e61-450a-9f90-71f0b4d070a5","Type":"ContainerStarted","Data":"f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee"} Mar 18 18:22:33 crc kubenswrapper[4939]: I0318 18:22:33.366407 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqp7q" podStartSLOduration=3.240755424 podStartE2EDuration="7.366388922s" podCreationTimestamp="2026-03-18 18:22:26 +0000 UTC" firstStartedPulling="2026-03-18 
18:22:28.247209149 +0000 UTC m=+9912.846396770" lastFinishedPulling="2026-03-18 18:22:32.372842657 +0000 UTC m=+9916.972030268" observedRunningTime="2026-03-18 18:22:33.356900973 +0000 UTC m=+9917.956088594" watchObservedRunningTime="2026-03-18 18:22:33.366388922 +0000 UTC m=+9917.965576543" Mar 18 18:22:33 crc kubenswrapper[4939]: I0318 18:22:33.674482 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5cdb84b55c-4pml7_6071a760-6039-4785-b7a9-dc01f0558392/dnsmasq-dns/0.log" Mar 18 18:22:33 crc kubenswrapper[4939]: I0318 18:22:33.678184 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5cdb84b55c-4pml7_6071a760-6039-4785-b7a9-dc01f0558392/init/0.log" Mar 18 18:22:33 crc kubenswrapper[4939]: I0318 18:22:33.787064 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-fs8vm_ad49efa4-1ab3-4342-b9eb-ad8c1d1cc27a/download-cache-openstack-openstack-cell1/0.log" Mar 18 18:22:34 crc kubenswrapper[4939]: I0318 18:22:34.098038 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_79f6f675-b8f8-4aba-ad72-f22358057ad0/glance-log/0.log" Mar 18 18:22:34 crc kubenswrapper[4939]: I0318 18:22:34.137665 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_79f6f675-b8f8-4aba-ad72-f22358057ad0/glance-httpd/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.312236 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_60b2f1f6-9d42-465a-a193-b70288373cd3/glance-httpd/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.346777 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_60b2f1f6-9d42-465a-a193-b70288373cd3/glance-log/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.392235 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6fd9d6d4d6-crbpz_dd05e31f-c125-4408-88dd-c95a60e1e05d/heat-api/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.612610 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-684bfd6b7d-5ncdg_619ea53f-adf0-4b62-8070-d26ce9739a92/heat-engine/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.648191 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7dfbc5946-xgvcs_a5be3887-44b0-45e5-ab55-8b70aba4b0bb/heat-cfnapi/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.829618 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d866dd55-p86pc_12b67c3c-da84-435f-b25e-f45575becb1b/horizon/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.882289 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-49s8g_d9729542-303e-47cd-b312-6e3bc1cd7b51/install-certs-openstack-openstack-cell1/0.log" Mar 18 18:22:35 crc kubenswrapper[4939]: I0318 18:22:35.918706 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d866dd55-p86pc_12b67c3c-da84-435f-b25e-f45575becb1b/horizon-log/0.log" Mar 18 18:22:36 crc kubenswrapper[4939]: I0318 18:22:36.256204 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-859d47c774-sp5pm_cac68db9-5855-4b55-81aa-124ad97bc6a5/keystone-api/0.log" Mar 18 18:22:36 crc kubenswrapper[4939]: I0318 18:22:36.466454 4939 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-hwz5d_a2bb63d4-ae06-4c42-b037-202212427175/install-os-openstack-openstack-cell1/0.log" Mar 18 18:22:36 crc kubenswrapper[4939]: I0318 18:22:36.644880 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29564281-dvxjc_b3681d41-eec6-4744-83d9-25b409e189d6/keystone-cron/0.log" Mar 18 18:22:36 crc kubenswrapper[4939]: I0318 18:22:36.720632 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_37664671-b80f-472e-bd20-b23186d6e808/kube-state-metrics/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.014060 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ae01d133-287c-44ba-bdc8-76484241c92b/manila-api-log/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.090821 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqp7q" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.091972 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pqp7q" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.148067 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pqp7q" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.172463 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ae01d133-287c-44ba-bdc8-76484241c92b/manila-api/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.212086 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_655b8ea9-b797-4292-a872-7c1db39b5dac/probe/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.264707 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_655b8ea9-b797-4292-a872-7c1db39b5dac/manila-scheduler/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.364895 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-2rtf4_8e250960-80eb-4d0e-b653-d1a794760ac8/libvirt-openstack-openstack-cell1/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.380160 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0d03e659-67b2-44e9-aebd-4fdaac7f806b/manila-share/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.460420 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pqp7q" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.537398 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqp7q"] Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.618197 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0d03e659-67b2-44e9-aebd-4fdaac7f806b/probe/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.850921 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7bcf85947c-k99kg_859a941c-b861-4d33-9bcc-816f56b24c41/neutron-api/0.log" Mar 18 18:22:37 crc kubenswrapper[4939]: I0318 18:22:37.908928 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7bcf85947c-k99kg_859a941c-b861-4d33-9bcc-816f56b24c41/neutron-httpd/0.log" Mar 18 18:22:38 crc kubenswrapper[4939]: I0318 
18:22:38.251426 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-26rfs_10576dfe-22f5-4937-bd11-149ca982c4f2/neutron-metadata-openstack-openstack-cell1/0.log" Mar 18 18:22:38 crc kubenswrapper[4939]: I0318 18:22:38.301026 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-2lrx4_cbafcc37-e8cb-4cb5-96aa-00a063f4c003/neutron-dhcp-openstack-openstack-cell1/0.log" Mar 18 18:22:38 crc kubenswrapper[4939]: I0318 18:22:38.407569 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-r2qr2_330b1db4-a22e-4fe0-a9c5-88ae9458db36/neutron-sriov-openstack-openstack-cell1/0.log" Mar 18 18:22:38 crc kubenswrapper[4939]: I0318 18:22:38.522746 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_241b524a-2440-403d-ad85-8060e5df7c74/nova-api-api/0.log" Mar 18 18:22:38 crc kubenswrapper[4939]: I0318 18:22:38.669190 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_241b524a-2440-403d-ad85-8060e5df7c74/nova-api-log/0.log" Mar 18 18:22:38 crc kubenswrapper[4939]: I0318 18:22:38.833943 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_19e1a5e8-cac2-4f80-ad14-f4506492d912/nova-cell0-conductor-conductor/0.log" Mar 18 18:22:38 crc kubenswrapper[4939]: I0318 18:22:38.914411 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_350a8616-a19e-465c-a58c-489b897b6bf5/nova-cell1-conductor-conductor/0.log" Mar 18 18:22:39 crc kubenswrapper[4939]: I0318 18:22:39.123431 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_09e6d4eb-edae-4e88-915c-3e83db439676/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 18:22:39 crc kubenswrapper[4939]: I0318 18:22:39.409211 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqp7q" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="registry-server" containerID="cri-o://f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee" gracePeriod=2 Mar 18 18:22:39 crc kubenswrapper[4939]: I0318 18:22:39.568343 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ac312cd1-c869-4b32-834f-a18a634531e9/nova-metadata-log/0.log" Mar 18 18:22:39 crc kubenswrapper[4939]: I0318 18:22:39.647703 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ac312cd1-c869-4b32-834f-a18a634531e9/nova-metadata-metadata/0.log" Mar 18 18:22:39 crc kubenswrapper[4939]: I0318 18:22:39.938988 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellwmfx9_84b96943-32ca-40b7-8139-50e7c64835eb/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Mar 18 18:22:39 crc kubenswrapper[4939]: I0318 18:22:39.985309 4939 util.go:48] "No ready sandbox for pod can be found. 
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.087122 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-utilities\") pod \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") "
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.087230 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5gxs\" (UniqueName: \"kubernetes.io/projected/7c5a070c-5e61-450a-9f90-71f0b4d070a5-kube-api-access-v5gxs\") pod \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") "
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.087326 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-catalog-content\") pod \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\" (UID: \"7c5a070c-5e61-450a-9f90-71f0b4d070a5\") "
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.089228 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-utilities" (OuterVolumeSpecName: "utilities") pod "7c5a070c-5e61-450a-9f90-71f0b4d070a5" (UID: "7c5a070c-5e61-450a-9f90-71f0b4d070a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.093788 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_75f57bf0-1b74-4877-81b6-dfcbc355da4d/nova-scheduler-scheduler/0.log"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.096862 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5a070c-5e61-450a-9f90-71f0b4d070a5-kube-api-access-v5gxs" (OuterVolumeSpecName: "kube-api-access-v5gxs") pod "7c5a070c-5e61-450a-9f90-71f0b4d070a5" (UID: "7c5a070c-5e61-450a-9f90-71f0b4d070a5"). InnerVolumeSpecName "kube-api-access-v5gxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.114381 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c5a070c-5e61-450a-9f90-71f0b4d070a5" (UID: "7c5a070c-5e61-450a-9f90-71f0b4d070a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.172096 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-gdwjd_7a3f5007-c88c-446a-9543-71fe870e43e6/nova-cell1-openstack-openstack-cell1/0.log"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.176187 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7499f59567-gx45r_74caed9c-f767-4733-8a9e-8ffc8ed6bf1d/init/0.log"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.189873 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.189909 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5gxs\" (UniqueName: \"kubernetes.io/projected/7c5a070c-5e61-450a-9f90-71f0b4d070a5-kube-api-access-v5gxs\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.189923 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5a070c-5e61-450a-9f90-71f0b4d070a5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.445832 4939 generic.go:334] "Generic (PLEG): container finished" podID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerID="f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee" exitCode=0
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.446041 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqp7q" event={"ID":"7c5a070c-5e61-450a-9f90-71f0b4d070a5","Type":"ContainerDied","Data":"f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee"}
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.446069 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqp7q" event={"ID":"7c5a070c-5e61-450a-9f90-71f0b4d070a5","Type":"ContainerDied","Data":"757f86a0a28649eb11a37e8b881f4afe97ed8d49ff68f382025641d74734014f"}
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.446085 4939 scope.go:117] "RemoveContainer" containerID="f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.446319 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqp7q"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.485420 4939 scope.go:117] "RemoveContainer" containerID="bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.498582 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqp7q"]
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.505753 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqp7q"]
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.566009 4939 scope.go:117] "RemoveContainer" containerID="1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.587445 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7499f59567-gx45r_74caed9c-f767-4733-8a9e-8ffc8ed6bf1d/init/0.log"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.615145 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7499f59567-gx45r_74caed9c-f767-4733-8a9e-8ffc8ed6bf1d/octavia-api-provider-agent/0.log"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.635476 4939 scope.go:117] "RemoveContainer" containerID="f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee"
Mar 18 18:22:40 crc kubenswrapper[4939]: E0318 18:22:40.642773 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee\": container with ID starting with f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee not found: ID does not exist" containerID="f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.642827 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee"} err="failed to get container status \"f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee\": rpc error: code = NotFound desc = could not find container \"f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee\": container with ID starting with f3890e1683317a409c25648c47cb75241eff9a76109dc3fcd05f26eb47e6ddee not found: ID does not exist"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.642855 4939 scope.go:117] "RemoveContainer" containerID="bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110"
Mar 18 18:22:40 crc kubenswrapper[4939]: E0318 18:22:40.651785 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110\": container with ID starting with bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110 not found: ID does not exist" containerID="bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.651832 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110"} err="failed to get container status \"bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110\": rpc error: code = NotFound desc = could not find container \"bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110\": container with ID starting with bb830ccf61673110018816542b8ddf4252774282322b750e56a6a1e71decf110 not found: ID does not exist"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.651864 4939 scope.go:117] "RemoveContainer" containerID="1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72"
Mar 18 18:22:40 crc kubenswrapper[4939]: E0318 18:22:40.652111 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72\": container with ID starting with 1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72 not found: ID does not exist" containerID="1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.652135 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72"} err="failed to get container status \"1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72\": rpc error: code = NotFound desc = could not find container \"1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72\": container with ID starting with 1d454b1c2b3a38b576a72fc74c780833c41ddcf02d609de4c78a0f782c609f72 not found: ID does not exist"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.731244 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7499f59567-gx45r_74caed9c-f767-4733-8a9e-8ffc8ed6bf1d/octavia-api/0.log"
Mar 18 18:22:40 crc kubenswrapper[4939]: I0318 18:22:40.837449 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-ngrvz_0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef/init/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.106918 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-k2wvl_bbedb8e8-2834-4e62-97a8-8a9df6ab3091/init/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.125345 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-ngrvz_0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef/init/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.162329 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-ngrvz_0c40ed0e-6b52-41b9-ac9a-ca77b88c3cef/octavia-healthmanager/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.458062 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-k2wvl_bbedb8e8-2834-4e62-97a8-8a9df6ab3091/init/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.485750 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-k2wvl_bbedb8e8-2834-4e62-97a8-8a9df6ab3091/octavia-housekeeping/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.500068 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-w8t75_5037cf17-2a06-4fa1-a218-1b12e963b853/init/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.713486 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-w8t75_5037cf17-2a06-4fa1-a218-1b12e963b853/init/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.736943 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-w8t75_5037cf17-2a06-4fa1-a218-1b12e963b853/octavia-amphora-httpd/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.776056 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-mfm6m_173ed316-31b6-4fe5-928d-d9f2f3d92f01/init/0.log"
Mar 18 18:22:41 crc kubenswrapper[4939]: I0318 18:22:41.998087 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-mfm6m_173ed316-31b6-4fe5-928d-d9f2f3d92f01/init/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.055393 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-mfm6m_173ed316-31b6-4fe5-928d-d9f2f3d92f01/octavia-rsyslog/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.110076 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-vhvzw_860e2b80-254e-4caa-813d-3e0b040d3798/init/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.148460 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" path="/var/lib/kubelet/pods/7c5a070c-5e61-450a-9f90-71f0b4d070a5/volumes"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.356025 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_339c0a2d-5c97-4d3b-84d7-a8731c708236/mysql-bootstrap/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.393688 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-vhvzw_860e2b80-254e-4caa-813d-3e0b040d3798/init/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.519185 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-vhvzw_860e2b80-254e-4caa-813d-3e0b040d3798/octavia-worker/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.577296 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_339c0a2d-5c97-4d3b-84d7-a8731c708236/mysql-bootstrap/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.686291 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_339c0a2d-5c97-4d3b-84d7-a8731c708236/galera/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.790837 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a3f6a5c2-628f-4a42-b8d6-59bd4535dba5/mysql-bootstrap/0.log"
Mar 18 18:22:42 crc kubenswrapper[4939]: I0318 18:22:42.948016 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a3f6a5c2-628f-4a42-b8d6-59bd4535dba5/mysql-bootstrap/0.log"
Mar 18 18:22:43 crc kubenswrapper[4939]: I0318 18:22:43.011282 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_da1de069-c69e-4219-b479-3f2f6cdbee7b/openstackclient/0.log"
Mar 18 18:22:43 crc kubenswrapper[4939]: I0318 18:22:43.026977 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a3f6a5c2-628f-4a42-b8d6-59bd4535dba5/galera/0.log"
Mar 18 18:22:43 crc kubenswrapper[4939]: I0318 18:22:43.834143 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-26h2t_c3acd6a2-80fc-4c56-b460-187b80f55cfb/ovn-controller/0.log"
Mar 18 18:22:43 crc kubenswrapper[4939]: I0318 18:22:43.866298 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l6vq2_23757438-8e93-4d99-8df0-ca7e9f63f115/openstack-network-exporter/0.log"
path="/var/log/pods/openstack_ovn-controller-metrics-l6vq2_23757438-8e93-4d99-8df0-ca7e9f63f115/openstack-network-exporter/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.259097 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vnbc8_d4639e38-2435-415e-9cd8-d252c596ad22/ovsdb-server-init/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.502725 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vnbc8_d4639e38-2435-415e-9cd8-d252c596ad22/ovs-vswitchd/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.532458 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vnbc8_d4639e38-2435-415e-9cd8-d252c596ad22/ovsdb-server-init/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.584147 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vnbc8_d4639e38-2435-415e-9cd8-d252c596ad22/ovsdb-server/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.750325 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_60736420-d745-45a9-a045-700fcbfbbeff/ovn-northd/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.770873 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_60736420-d745-45a9-a045-700fcbfbbeff/openstack-network-exporter/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.968188 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-hpml8_6a1b5c03-995f-4ca7-bec5-622b83855a6c/ovn-openstack-openstack-cell1/0.log" Mar 18 18:22:44 crc kubenswrapper[4939]: I0318 18:22:44.986660 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc767c7b-3dfc-49d6-bfd0-310c15ec7369/openstack-network-exporter/0.log" Mar 18 18:22:45 crc kubenswrapper[4939]: I0318 18:22:45.049871 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc767c7b-3dfc-49d6-bfd0-310c15ec7369/ovsdbserver-nb/0.log" Mar 18 18:22:45 crc kubenswrapper[4939]: I0318 18:22:45.758355 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_2118473b-6c74-4093-bba9-cc8ea59c632d/ovsdbserver-nb/0.log" Mar 18 18:22:45 crc kubenswrapper[4939]: I0318 18:22:45.786735 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_2118473b-6c74-4093-bba9-cc8ea59c632d/openstack-network-exporter/0.log" Mar 18 18:22:45 crc kubenswrapper[4939]: I0318 18:22:45.992040 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3dc4642e-c9bd-42a3-81d5-010aee0538b9/openstack-network-exporter/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.024106 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3dc4642e-c9bd-42a3-81d5-010aee0538b9/ovsdbserver-nb/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.045257 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_377cdcbe-59e9-4a36-9528-a72cf359e07b/openstack-network-exporter/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.216516 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_377cdcbe-59e9-4a36-9528-a72cf359e07b/ovsdbserver-sb/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.287095 4939 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef/ovsdbserver-sb/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.352687 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_7cb91c23-dd7e-4c3e-bf55-adf2a0a3c1ef/openstack-network-exporter/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.479087 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_4e6fca10-8968-4453-9789-4d81b241977e/openstack-network-exporter/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.549018 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_4e6fca10-8968-4453-9789-4d81b241977e/ovsdbserver-sb/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.747923 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5c5dbfc5cb-x2dfn_fac1b0c6-6b39-4db2-9a6c-2429e3e17e52/placement-api/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.798069 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5c5dbfc5cb-x2dfn_fac1b0c6-6b39-4db2-9a6c-2429e3e17e52/placement-log/0.log" Mar 18 18:22:46 crc kubenswrapper[4939]: I0318 18:22:46.904083 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-crlsrz_0aca3e76-93f3-4cee-9e08-63e1953f7e90/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.019476 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0993e3cb-7df8-4774-a056-425d5d6a5f35/init-config-reloader/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.233260 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0993e3cb-7df8-4774-a056-425d5d6a5f35/config-reloader/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.234858 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0993e3cb-7df8-4774-a056-425d5d6a5f35/prometheus/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.281924 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0993e3cb-7df8-4774-a056-425d5d6a5f35/init-config-reloader/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.293052 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0993e3cb-7df8-4774-a056-425d5d6a5f35/thanos-sidecar/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.426802 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e1594dd1-4c71-4f2d-9a6e-658ed43d121f/setup-container/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.622079 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c2e610ad-5be1-4333-b786-4883d87fedaf/memcached/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.623825 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e1594dd1-4c71-4f2d-9a6e-658ed43d121f/setup-container/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.669197 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e1594dd1-4c71-4f2d-9a6e-658ed43d121f/rabbitmq/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: 
I0318 18:22:47.720842 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_338caa35-af66-4082-bd19-e7b9d11e42de/setup-container/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.915991 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_338caa35-af66-4082-bd19-e7b9d11e42de/setup-container/0.log" Mar 18 18:22:47 crc kubenswrapper[4939]: I0318 18:22:47.996236 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-pc2wp_ad72a8b9-9736-4cc8-9e5a-1a99fa9c10ec/reboot-os-openstack-openstack-cell1/0.log" Mar 18 18:22:48 crc kubenswrapper[4939]: I0318 18:22:48.132979 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-8s4r2_0f5794a5-e3f0-4ef0-96e7-714f95fcd1c7/run-os-openstack-openstack-cell1/0.log" Mar 18 18:22:48 crc kubenswrapper[4939]: I0318 18:22:48.288438 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-64x5r_89d16573-9f34-4527-9799-f23dd1e42898/ssh-known-hosts-openstack/0.log" Mar 18 18:22:48 crc kubenswrapper[4939]: I0318 18:22:48.550234 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_338caa35-af66-4082-bd19-e7b9d11e42de/rabbitmq/0.log" Mar 18 18:22:48 crc kubenswrapper[4939]: I0318 18:22:48.708562 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-8zvhb_2d2a4c53-2fed-4077-b7cd-41720e50faa5/validate-network-openstack-openstack-cell1/0.log" Mar 18 18:22:48 crc kubenswrapper[4939]: I0318 18:22:48.804262 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-5jhdk_fb138a20-d324-472f-abff-090725f661d8/telemetry-openstack-openstack-cell1/0.log" Mar 18 18:22:49 crc kubenswrapper[4939]: I0318 18:22:49.785164 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-c9sb5_e8465fa8-f589-4135-b6bf-f278436d5326/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Mar 18 18:23:13 crc kubenswrapper[4939]: I0318 18:23:13.356136 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj_5422988b-65bc-4039-a7d4-6877eb1f0e65/util/0.log" Mar 18 18:23:13 crc kubenswrapper[4939]: I0318 18:23:13.511884 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj_5422988b-65bc-4039-a7d4-6877eb1f0e65/pull/0.log" Mar 18 18:23:13 crc kubenswrapper[4939]: I0318 18:23:13.530287 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj_5422988b-65bc-4039-a7d4-6877eb1f0e65/util/0.log" Mar 18 18:23:13 crc kubenswrapper[4939]: I0318 18:23:13.564032 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj_5422988b-65bc-4039-a7d4-6877eb1f0e65/pull/0.log" Mar 18 18:23:13 crc kubenswrapper[4939]: I0318 18:23:13.781540 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj_5422988b-65bc-4039-a7d4-6877eb1f0e65/util/0.log" Mar 18 18:23:13 crc kubenswrapper[4939]: I0318 18:23:13.855168 4939 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj_5422988b-65bc-4039-a7d4-6877eb1f0e65/extract/0.log" Mar 18 18:23:13 crc kubenswrapper[4939]: I0318 18:23:13.874198 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2ef3ba8b1c0b676adb6a3f124c7bc6cd0b8c8314c5b4826a6f3ff81a888hpsj_5422988b-65bc-4039-a7d4-6877eb1f0e65/pull/0.log" Mar 18 18:23:14 crc kubenswrapper[4939]: I0318 18:23:14.505459 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-pfmz6_dae9e528-b32e-4cf9-a032-2a3bc38320f6/manager/0.log" Mar 18 18:23:15 crc kubenswrapper[4939]: I0318 18:23:15.179956 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-xlmmb_ffe00762-ce07-4018-b8d6-290de61644d0/manager/0.log" Mar 18 18:23:15 crc kubenswrapper[4939]: I0318 18:23:15.490374 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-9mhmj_9dd46745-f668-4ec4-a61e-7ecf0f91bd8a/manager/0.log" Mar 18 18:23:15 crc kubenswrapper[4939]: I0318 18:23:15.538149 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-2w7ms_13e35677-42a2-48c1-86fc-f022708ac217/manager/0.log" Mar 18 18:23:15 crc kubenswrapper[4939]: I0318 18:23:15.813205 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-djxqp_e7737b63-d9f2-45e5-a1f5-b1300aff04a8/manager/0.log" Mar 18 18:23:16 crc kubenswrapper[4939]: I0318 18:23:16.213599 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-78b9l_d981a7ad-8442-4742-a3be-ca3667fe0f9f/manager/0.log" Mar 18 18:23:16 crc kubenswrapper[4939]: I0318 18:23:16.761496 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-76b87776c9-992ww_54f258e9-7e39-422a-b848-2cbfc2633529/manager/0.log" Mar 18 18:23:16 crc kubenswrapper[4939]: I0318 18:23:16.764267 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-hwlg6_fe507f50-8673-4c7e-84e7-b649c6d61115/manager/0.log" Mar 18 18:23:17 crc kubenswrapper[4939]: I0318 18:23:17.102948 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-bpt6p_e4983266-0ae5-493d-86a4-4f6b010e95d8/manager/0.log" Mar 18 18:23:17 crc kubenswrapper[4939]: I0318 18:23:17.130065 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-xjg7j_23fb02fd-5d5a-488a-b3a7-82c69eb91e32/manager/0.log" Mar 18 18:23:17 crc kubenswrapper[4939]: I0318 18:23:17.416731 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-6xvbd_089530fd-7678-4f52-9e8c-8af0f54a9d15/manager/0.log" Mar 18 18:23:17 crc kubenswrapper[4939]: I0318 18:23:17.742792 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-hzsbq_5f476e97-8e28-48d0-9b5e-79c31bb8e9fd/manager/0.log" Mar 18 18:23:17 crc kubenswrapper[4939]: I0318 18:23:17.786458 4939 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-wmt6q_3f4b5f2d-578c-42f1-a283-960b728a9ef7/manager/0.log" Mar 18 18:23:17 crc kubenswrapper[4939]: I0318 18:23:17.795053 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-wvfdv_c2cd123f-732e-4e44-b5f0-f79d15ea87da/manager/0.log" Mar 18 18:23:17 crc kubenswrapper[4939]: I0318 18:23:17.970369 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-scblh_573a3014-37a6-461d-a211-2fba3e06a72d/manager/0.log" Mar 18 18:23:18 crc kubenswrapper[4939]: I0318 18:23:18.095611 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7bc867c5bc-kdvmn_84d8418e-14e0-40b8-aae5-8bf2dbdce02b/operator/0.log" Mar 18 18:23:18 crc kubenswrapper[4939]: I0318 18:23:18.218853 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ng2rp_99136bd4-35be-493d-9913-3c569a8861c0/registry-server/0.log" Mar 18 18:23:18 crc kubenswrapper[4939]: I0318 18:23:18.494743 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-z2bvc_41644021-455b-493c-931f-9bc7fc0b8f8e/manager/0.log" Mar 18 18:23:18 crc kubenswrapper[4939]: I0318 18:23:18.529914 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-4xhw6_0bb8519c-c720-4853-a031-caedb2059b3d/manager/0.log" Mar 18 18:23:18 crc kubenswrapper[4939]: I0318 18:23:18.754568 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-52ml2_64de914e-d3d3-4b6e-b14e-6d02d2891539/operator/0.log" Mar 18 18:23:18 crc kubenswrapper[4939]: I0318 18:23:18.828533 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-lmrlz_62f8136e-6a5a-49ac-9c21-621384d4102f/manager/0.log" Mar 18 18:23:19 crc kubenswrapper[4939]: I0318 18:23:19.057268 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-5xlqd_c455d0aa-0b25-4684-997f-b7d2156c2ac8/manager/0.log" Mar 18 18:23:19 crc kubenswrapper[4939]: I0318 18:23:19.101630 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d84559f47-z9fgs_6cda5777-cac2-403f-bb77-44e76f6a0806/manager/0.log" Mar 18 18:23:19 crc kubenswrapper[4939]: I0318 18:23:19.321463 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-xd6pq_c5ffe79b-c31d-4a92-8006-aed78a9e1b16/manager/0.log" Mar 18 18:23:20 crc kubenswrapper[4939]: I0318 18:23:20.287669 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65fbdb4fdd-xrcnt_5718381a-312a-4dab-b896-c865eaa2232b/manager/0.log" Mar 18 18:23:23 crc kubenswrapper[4939]: I0318 18:23:23.687132 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:23:23 crc 
kubenswrapper[4939]: I0318 18:23:23.687434 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:23:41 crc kubenswrapper[4939]: I0318 18:23:41.684032 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6fdjn_134347ee-9f68-45f7-b66c-dc2493eac221/control-plane-machine-set-operator/0.log" Mar 18 18:23:41 crc kubenswrapper[4939]: I0318 18:23:41.854804 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5lh2v_366ffb79-635d-4f44-b22f-3fd5a77bd022/kube-rbac-proxy/0.log" Mar 18 18:23:41 crc kubenswrapper[4939]: I0318 18:23:41.873718 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5lh2v_366ffb79-635d-4f44-b22f-3fd5a77bd022/machine-api-operator/0.log" Mar 18 18:23:46 crc kubenswrapper[4939]: I0318 18:23:46.998428 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p26hb"] Mar 18 18:23:47 crc kubenswrapper[4939]: E0318 18:23:46.999433 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="extract-utilities" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:46.999451 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="extract-utilities" Mar 18 18:23:47 crc kubenswrapper[4939]: E0318 18:23:46.999472 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="extract-content" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:46.999480 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="extract-content" Mar 18 18:23:47 crc kubenswrapper[4939]: E0318 18:23:46.999518 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="registry-server" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:46.999527 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="registry-server" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:46.999794 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5a070c-5e61-450a-9f90-71f0b4d070a5" containerName="registry-server" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.001801 4939 util.go:30] "No sandbox for pod can be found. 
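The paired patch_prober.go / prober.go entries above are how kubelet records one failed HTTP liveness check: the probe is a plain GET against the container's declared health endpoint, and any transport error, such as the connection refused seen here, counts as a failure. A minimal sketch of an equivalent check; the 1-second timeout is an assumption, since the probe spec is not in the log:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce mimics a single HTTP liveness check: GET the health endpoint,
// treat any transport error or non-2xx/3xx status as a probe failure.
// Illustrative only; this is not kubelet's actual prober implementation.
func probeOnce(url string) error {
	client := &http.Client{Timeout: 1 * time.Second} // timeoutSeconds assumed
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failed: %w", err) // e.g. "connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// The endpoint from the entries above; with nothing listening on 8798
	// this prints the same "connection refused" failure kubelet logged.
	if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println(err)
	}
}
```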
Need to start a new one" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.024680 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p26hb"] Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.135223 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-utilities\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.135263 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-catalog-content\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.135340 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwgk4\" (UniqueName: \"kubernetes.io/projected/47d7fc62-c655-46f8-ba3c-a403d462e37d-kube-api-access-vwgk4\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.237795 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-utilities\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.237859 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-catalog-content\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.237979 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwgk4\" (UniqueName: \"kubernetes.io/projected/47d7fc62-c655-46f8-ba3c-a403d462e37d-kube-api-access-vwgk4\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.238264 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-utilities\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.238484 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-catalog-content\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.261549 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vwgk4\" (UniqueName: \"kubernetes.io/projected/47d7fc62-c655-46f8-ba3c-a403d462e37d-kube-api-access-vwgk4\") pod \"redhat-operators-p26hb\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.334124 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:47 crc kubenswrapper[4939]: I0318 18:23:47.881285 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p26hb"] Mar 18 18:23:47 crc kubenswrapper[4939]: W0318 18:23:47.889685 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47d7fc62_c655_46f8_ba3c_a403d462e37d.slice/crio-bd8dd85ce1811c1fbc96c0b382fa70ca11e8a0144febf4c19553622a8985d0d1 WatchSource:0}: Error finding container bd8dd85ce1811c1fbc96c0b382fa70ca11e8a0144febf4c19553622a8985d0d1: Status 404 returned error can't find the container with id bd8dd85ce1811c1fbc96c0b382fa70ca11e8a0144febf4c19553622a8985d0d1 Mar 18 18:23:48 crc kubenswrapper[4939]: I0318 18:23:48.134430 4939 generic.go:334] "Generic (PLEG): container finished" podID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerID="1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32" exitCode=0 Mar 18 18:23:48 crc kubenswrapper[4939]: I0318 18:23:48.136009 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:23:48 crc kubenswrapper[4939]: I0318 18:23:48.149046 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p26hb" event={"ID":"47d7fc62-c655-46f8-ba3c-a403d462e37d","Type":"ContainerDied","Data":"1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32"} Mar 18 18:23:48 crc kubenswrapper[4939]: I0318 18:23:48.149087 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p26hb" event={"ID":"47d7fc62-c655-46f8-ba3c-a403d462e37d","Type":"ContainerStarted","Data":"bd8dd85ce1811c1fbc96c0b382fa70ca11e8a0144febf4c19553622a8985d0d1"} Mar 18 18:23:50 crc kubenswrapper[4939]: I0318 18:23:50.197119 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p26hb" event={"ID":"47d7fc62-c655-46f8-ba3c-a403d462e37d","Type":"ContainerStarted","Data":"400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7"} Mar 18 18:23:53 crc kubenswrapper[4939]: I0318 18:23:53.687386 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:23:53 crc kubenswrapper[4939]: I0318 18:23:53.687822 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:23:54 crc kubenswrapper[4939]: I0318 18:23:54.261933 4939 generic.go:334] "Generic (PLEG): container finished" podID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerID="400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7" exitCode=0 Mar 18 
18:23:54 crc kubenswrapper[4939]: I0318 18:23:54.262216 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p26hb" event={"ID":"47d7fc62-c655-46f8-ba3c-a403d462e37d","Type":"ContainerDied","Data":"400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7"} Mar 18 18:23:56 crc kubenswrapper[4939]: I0318 18:23:56.294267 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p26hb" event={"ID":"47d7fc62-c655-46f8-ba3c-a403d462e37d","Type":"ContainerStarted","Data":"5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4"} Mar 18 18:23:56 crc kubenswrapper[4939]: I0318 18:23:56.331405 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p26hb" podStartSLOduration=3.435135387 podStartE2EDuration="10.331372364s" podCreationTimestamp="2026-03-18 18:23:46 +0000 UTC" firstStartedPulling="2026-03-18 18:23:48.135810045 +0000 UTC m=+9992.734997666" lastFinishedPulling="2026-03-18 18:23:55.032047012 +0000 UTC m=+9999.631234643" observedRunningTime="2026-03-18 18:23:56.318961923 +0000 UTC m=+10000.918149564" watchObservedRunningTime="2026-03-18 18:23:56.331372364 +0000 UTC m=+10000.930559995" Mar 18 18:23:57 crc kubenswrapper[4939]: I0318 18:23:57.335605 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:57 crc kubenswrapper[4939]: I0318 18:23:57.335659 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:23:58 crc kubenswrapper[4939]: I0318 18:23:58.533845 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p26hb" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="registry-server" probeResult="failure" output=< Mar 18 18:23:58 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:23:58 crc kubenswrapper[4939]: > Mar 18 18:23:58 crc kubenswrapper[4939]: I0318 18:23:58.936323 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-6h7w7_37df594e-01f6-4d64-9b1c-1f1b1d36aecf/cert-manager-controller/0.log" Mar 18 18:23:59 crc kubenswrapper[4939]: I0318 18:23:59.199670 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-jwggk_02decf6c-b69f-4c7e-92ac-bcbe9aaaba30/cert-manager-cainjector/0.log" Mar 18 18:23:59 crc kubenswrapper[4939]: I0318 18:23:59.241108 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-pbm2c_145983f6-1dc1-4d35-88a6-577db0b7bd2b/cert-manager-webhook/0.log" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.177247 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564304-h25tv"] Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.178788 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-h25tv"] Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.178863 4939 util.go:30] "No sandbox for pod can be found. 
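The pod_startup_latency_tracker entry above carries enough data to check its own arithmetic: podStartE2EDuration runs from pod creation to the watch-observed running time, and podStartSLOduration is that same interval minus the image-pull window between firstStartedPulling and lastFinishedPulling. A quick check using the monotonic m=+ offsets from the entry:

```go
package main

import "fmt"

func main() {
	firstPull := 9992.734997666 // firstStartedPulling, m= offset in seconds
	lastPull := 9999.631234643  // lastFinishedPulling
	e2e := 10.331372364         // podStartE2EDuration in seconds

	pull := lastPull - firstPull // image-pull window ≈ 6.896236977s
	fmt.Printf("podStartSLOduration ≈ %.9fs\n", e2e-pull)
	// ≈ 3.435135387s, matching the logged podStartSLOduration.
}
```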
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-h25tv" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.180730 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.181149 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.181297 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.324856 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtld9\" (UniqueName: \"kubernetes.io/projected/9a62dda8-f284-4d5b-a36d-d8634b7912eb-kube-api-access-gtld9\") pod \"auto-csr-approver-29564304-h25tv\" (UID: \"9a62dda8-f284-4d5b-a36d-d8634b7912eb\") " pod="openshift-infra/auto-csr-approver-29564304-h25tv" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.768310 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtld9\" (UniqueName: \"kubernetes.io/projected/9a62dda8-f284-4d5b-a36d-d8634b7912eb-kube-api-access-gtld9\") pod \"auto-csr-approver-29564304-h25tv\" (UID: \"9a62dda8-f284-4d5b-a36d-d8634b7912eb\") " pod="openshift-infra/auto-csr-approver-29564304-h25tv" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.798118 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtld9\" (UniqueName: \"kubernetes.io/projected/9a62dda8-f284-4d5b-a36d-d8634b7912eb-kube-api-access-gtld9\") pod \"auto-csr-approver-29564304-h25tv\" (UID: \"9a62dda8-f284-4d5b-a36d-d8634b7912eb\") " pod="openshift-infra/auto-csr-approver-29564304-h25tv" Mar 18 18:24:00 crc kubenswrapper[4939]: I0318 18:24:00.816657 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-h25tv" Mar 18 18:24:01 crc kubenswrapper[4939]: I0318 18:24:01.307720 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-h25tv"] Mar 18 18:24:01 crc kubenswrapper[4939]: I0318 18:24:01.347888 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-h25tv" event={"ID":"9a62dda8-f284-4d5b-a36d-d8634b7912eb","Type":"ContainerStarted","Data":"db1a9fd3a56a009fe660418c4d2fd230fbbd28803022844757727e34749fcc20"} Mar 18 18:24:03 crc kubenswrapper[4939]: I0318 18:24:03.376870 4939 generic.go:334] "Generic (PLEG): container finished" podID="9a62dda8-f284-4d5b-a36d-d8634b7912eb" containerID="569ee6dad99e5177c152287fcd3b04c014e629a70cb0aeef4173afc1b6a5adf9" exitCode=0 Mar 18 18:24:03 crc kubenswrapper[4939]: I0318 18:24:03.377074 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-h25tv" event={"ID":"9a62dda8-f284-4d5b-a36d-d8634b7912eb","Type":"ContainerDied","Data":"569ee6dad99e5177c152287fcd3b04c014e629a70cb0aeef4173afc1b6a5adf9"} Mar 18 18:24:04 crc kubenswrapper[4939]: I0318 18:24:04.875578 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-h25tv" Mar 18 18:24:04 crc kubenswrapper[4939]: I0318 18:24:04.964424 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtld9\" (UniqueName: \"kubernetes.io/projected/9a62dda8-f284-4d5b-a36d-d8634b7912eb-kube-api-access-gtld9\") pod \"9a62dda8-f284-4d5b-a36d-d8634b7912eb\" (UID: \"9a62dda8-f284-4d5b-a36d-d8634b7912eb\") " Mar 18 18:24:04 crc kubenswrapper[4939]: I0318 18:24:04.970196 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a62dda8-f284-4d5b-a36d-d8634b7912eb-kube-api-access-gtld9" (OuterVolumeSpecName: "kube-api-access-gtld9") pod "9a62dda8-f284-4d5b-a36d-d8634b7912eb" (UID: "9a62dda8-f284-4d5b-a36d-d8634b7912eb"). InnerVolumeSpecName "kube-api-access-gtld9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:05 crc kubenswrapper[4939]: I0318 18:24:05.067321 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtld9\" (UniqueName: \"kubernetes.io/projected/9a62dda8-f284-4d5b-a36d-d8634b7912eb-kube-api-access-gtld9\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:05 crc kubenswrapper[4939]: I0318 18:24:05.403787 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-h25tv" event={"ID":"9a62dda8-f284-4d5b-a36d-d8634b7912eb","Type":"ContainerDied","Data":"db1a9fd3a56a009fe660418c4d2fd230fbbd28803022844757727e34749fcc20"} Mar 18 18:24:05 crc kubenswrapper[4939]: I0318 18:24:05.403842 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db1a9fd3a56a009fe660418c4d2fd230fbbd28803022844757727e34749fcc20" Mar 18 18:24:05 crc kubenswrapper[4939]: I0318 18:24:05.403928 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-h25tv" Mar 18 18:24:05 crc kubenswrapper[4939]: I0318 18:24:05.947850 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-88lqw"] Mar 18 18:24:05 crc kubenswrapper[4939]: I0318 18:24:05.956552 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-88lqw"] Mar 18 18:24:06 crc kubenswrapper[4939]: I0318 18:24:06.146802 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd" path="/var/lib/kubelet/pods/4d89f18b-bf36-4dfe-83a8-cd0b1910bfcd/volumes" Mar 18 18:24:08 crc kubenswrapper[4939]: I0318 18:24:08.446983 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p26hb" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="registry-server" probeResult="failure" output=< Mar 18 18:24:08 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:24:08 crc kubenswrapper[4939]: > Mar 18 18:24:14 crc kubenswrapper[4939]: I0318 18:24:14.707262 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-gxhtz_d5a366f8-31e3-4afd-81c7-bb78a39c5ded/nmstate-console-plugin/0.log" Mar 18 18:24:14 crc kubenswrapper[4939]: I0318 18:24:14.912756 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7jt69_0f293932-41a4-4e41-9fcd-019c097522ff/nmstate-handler/0.log" Mar 18 18:24:15 crc kubenswrapper[4939]: I0318 18:24:15.054658 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-q7vcf_457e3dd2-84a3-47b0-a023-affddd9bd954/kube-rbac-proxy/0.log" Mar 18 18:24:15 crc kubenswrapper[4939]: I0318 18:24:15.157672 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-q7vcf_457e3dd2-84a3-47b0-a023-affddd9bd954/nmstate-metrics/0.log" Mar 18 18:24:15 crc kubenswrapper[4939]: I0318 18:24:15.193072 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-24wzk_e9429399-6579-4e54-a804-31a2fda4e887/nmstate-operator/0.log" Mar 18 18:24:15 crc kubenswrapper[4939]: I0318 18:24:15.358373 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-k5fml_9c36ff55-800c-42e3-918e-f73cbd98e252/nmstate-webhook/0.log" Mar 18 18:24:15 crc kubenswrapper[4939]: I0318 18:24:15.968542 4939 scope.go:117] "RemoveContainer" containerID="b1676d2e9f024a0efae51218002bbd246ef9e1ffb5eedb6322d254d92c8671a7" Mar 18 18:24:18 crc kubenswrapper[4939]: I0318 18:24:18.403357 4939 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p26hb" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="registry-server" probeResult="failure" output=< Mar 18 18:24:18 crc kubenswrapper[4939]: timeout: failed to connect service ":50051" within 1s Mar 18 18:24:18 crc kubenswrapper[4939]: > Mar 18 18:24:23 crc kubenswrapper[4939]: I0318 18:24:23.687343 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:24:23 crc kubenswrapper[4939]: I0318 18:24:23.687996 4939 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:24:23 crc kubenswrapper[4939]: I0318 18:24:23.688054 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 18:24:23 crc kubenswrapper[4939]: I0318 18:24:23.689117 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd5a8b2df6c9c9a32e5accb27af0b355fd1544708ba4c93ea25fe817def4fbc5"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:24:23 crc kubenswrapper[4939]: I0318 18:24:23.689177 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://cd5a8b2df6c9c9a32e5accb27af0b355fd1544708ba4c93ea25fe817def4fbc5" gracePeriod=600 Mar 18 18:24:24 crc kubenswrapper[4939]: I0318 18:24:24.604890 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="cd5a8b2df6c9c9a32e5accb27af0b355fd1544708ba4c93ea25fe817def4fbc5" exitCode=0 Mar 18 18:24:24 crc kubenswrapper[4939]: I0318 18:24:24.604920 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"cd5a8b2df6c9c9a32e5accb27af0b355fd1544708ba4c93ea25fe817def4fbc5"} Mar 18 18:24:24 crc kubenswrapper[4939]: I0318 18:24:24.605311 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerStarted","Data":"6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8"} Mar 18 18:24:24 crc kubenswrapper[4939]: I0318 18:24:24.605346 4939 scope.go:117] "RemoveContainer" containerID="fcee62683d1b36e5b89166e418e83e021972698881d5a2b02c908a8210364ecd" Mar 18 18:24:27 crc kubenswrapper[4939]: I0318 18:24:27.405383 4939 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:24:27 crc kubenswrapper[4939]: I0318 18:24:27.483032 4939 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:24:27 crc kubenswrapper[4939]: I0318 18:24:27.658226 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p26hb"] Mar 18 18:24:28 crc kubenswrapper[4939]: I0318 18:24:28.651280 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p26hb" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="registry-server" containerID="cri-o://5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4" gracePeriod=2 Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.233567 4939 util.go:48] "No ready sandbox for pod can be found. 
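The sequence above shows the full liveness-restart path: after enough consecutive failures kubelet records "failed liveness probe, will be restarted", kills the container with the pod's termination grace period (600s for machine-config-daemon here, 2s for the registry-server being deleted), and the PLEG then reports the ContainerDied/ContainerStarted pair. A sketch of the threshold decision; the failureThreshold value is assumed (the Kubernetes default is 3):

```go
package main

import "fmt"

// livenessTracker sketches the restart decision recorded above: consecutive
// failures are counted, any success resets the count, and reaching the
// threshold triggers a kill-and-restart. Types and threshold are assumed.
type livenessTracker struct {
	failures, threshold int
}

func (t *livenessTracker) observe(healthy bool) (restart bool) {
	if healthy {
		t.failures = 0
		return false
	}
	t.failures++
	return t.failures >= t.threshold
}

func main() {
	t := &livenessTracker{threshold: 3}
	for i := 0; i < 3; i++ {
		if t.observe(false) {
			fmt.Println("failed liveness probe, will be restarted")
		}
	}
}
```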
Need to start a new one" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.308950 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwgk4\" (UniqueName: \"kubernetes.io/projected/47d7fc62-c655-46f8-ba3c-a403d462e37d-kube-api-access-vwgk4\") pod \"47d7fc62-c655-46f8-ba3c-a403d462e37d\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.309170 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-utilities\") pod \"47d7fc62-c655-46f8-ba3c-a403d462e37d\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.309224 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-catalog-content\") pod \"47d7fc62-c655-46f8-ba3c-a403d462e37d\" (UID: \"47d7fc62-c655-46f8-ba3c-a403d462e37d\") " Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.309960 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-utilities" (OuterVolumeSpecName: "utilities") pod "47d7fc62-c655-46f8-ba3c-a403d462e37d" (UID: "47d7fc62-c655-46f8-ba3c-a403d462e37d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.316244 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d7fc62-c655-46f8-ba3c-a403d462e37d-kube-api-access-vwgk4" (OuterVolumeSpecName: "kube-api-access-vwgk4") pod "47d7fc62-c655-46f8-ba3c-a403d462e37d" (UID: "47d7fc62-c655-46f8-ba3c-a403d462e37d"). InnerVolumeSpecName "kube-api-access-vwgk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.411490 4939 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.411545 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwgk4\" (UniqueName: \"kubernetes.io/projected/47d7fc62-c655-46f8-ba3c-a403d462e37d-kube-api-access-vwgk4\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.447240 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47d7fc62-c655-46f8-ba3c-a403d462e37d" (UID: "47d7fc62-c655-46f8-ba3c-a403d462e37d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.513532 4939 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7fc62-c655-46f8-ba3c-a403d462e37d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.671824 4939 generic.go:334] "Generic (PLEG): container finished" podID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerID="5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4" exitCode=0 Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.671872 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p26hb" event={"ID":"47d7fc62-c655-46f8-ba3c-a403d462e37d","Type":"ContainerDied","Data":"5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4"} Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.671902 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p26hb" event={"ID":"47d7fc62-c655-46f8-ba3c-a403d462e37d","Type":"ContainerDied","Data":"bd8dd85ce1811c1fbc96c0b382fa70ca11e8a0144febf4c19553622a8985d0d1"} Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.671924 4939 scope.go:117] "RemoveContainer" containerID="5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.672075 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p26hb" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.700483 4939 scope.go:117] "RemoveContainer" containerID="400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.719546 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p26hb"] Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.729099 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p26hb"] Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.755236 4939 scope.go:117] "RemoveContainer" containerID="1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.807086 4939 scope.go:117] "RemoveContainer" containerID="5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4" Mar 18 18:24:29 crc kubenswrapper[4939]: E0318 18:24:29.807988 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4\": container with ID starting with 5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4 not found: ID does not exist" containerID="5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.808131 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4"} err="failed to get container status \"5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4\": rpc error: code = NotFound desc = could not find container \"5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4\": container with ID starting with 5e9bb8fe226c6692f2ed9deb154c206adadd62c8d5880805e8c7e8c313c6bfb4 not found: ID does not exist" Mar 18 18:24:29 crc 
kubenswrapper[4939]: I0318 18:24:29.808253 4939 scope.go:117] "RemoveContainer" containerID="400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7" Mar 18 18:24:29 crc kubenswrapper[4939]: E0318 18:24:29.812731 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7\": container with ID starting with 400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7 not found: ID does not exist" containerID="400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.812944 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7"} err="failed to get container status \"400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7\": rpc error: code = NotFound desc = could not find container \"400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7\": container with ID starting with 400b7e671c60ddc8cc0a315e26fc7335fb49f30119132b9f2129d2ed26cccda7 not found: ID does not exist" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.813087 4939 scope.go:117] "RemoveContainer" containerID="1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32" Mar 18 18:24:29 crc kubenswrapper[4939]: E0318 18:24:29.813727 4939 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32\": container with ID starting with 1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32 not found: ID does not exist" containerID="1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32" Mar 18 18:24:29 crc kubenswrapper[4939]: I0318 18:24:29.813858 4939 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32"} err="failed to get container status \"1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32\": rpc error: code = NotFound desc = could not find container \"1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32\": container with ID starting with 1b8759f862600c0f7cf39cb1dfabdb677e440642b30b81489f56c4dc5f6e9a32 not found: ID does not exist" Mar 18 18:24:30 crc kubenswrapper[4939]: I0318 18:24:30.144755 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" path="/var/lib/kubelet/pods/47d7fc62-c655-46f8-ba3c-a403d462e37d/volumes" Mar 18 18:24:31 crc kubenswrapper[4939]: I0318 18:24:31.516159 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-pxtrv_b807b3bb-f1f2-43ad-8a15-359bff856ca7/prometheus-operator/0.log" Mar 18 18:24:31 crc kubenswrapper[4939]: I0318 18:24:31.723992 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-599799b8-dk6hk_9ffde8c6-522e-491d-8f41-848c8a175529/prometheus-operator-admission-webhook/0.log" Mar 18 18:24:31 crc kubenswrapper[4939]: I0318 18:24:31.783524 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-599799b8-zhbrz_fa00393f-d482-4806-8ec6-e25a9f306888/prometheus-operator-admission-webhook/0.log" Mar 18 18:24:31 crc 
kubenswrapper[4939]: I0318 18:24:31.994870 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-zc9lq_cdcc74fa-7b06-489c-bdfc-fe75965f4aa3/operator/0.log" Mar 18 18:24:32 crc kubenswrapper[4939]: I0318 18:24:32.024165 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5d7759777-9ttbm_f2cfb2e8-f000-4606-8b03-9d82aebcc102/perses-operator/0.log" Mar 18 18:24:51 crc kubenswrapper[4939]: I0318 18:24:51.063657 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2mcg4_bddd9413-cc2e-4e90-86b1-132dac143d2f/kube-rbac-proxy/0.log" Mar 18 18:24:51 crc kubenswrapper[4939]: I0318 18:24:51.237143 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-frr-files/0.log" Mar 18 18:24:51 crc kubenswrapper[4939]: I0318 18:24:51.451219 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2mcg4_bddd9413-cc2e-4e90-86b1-132dac143d2f/controller/0.log" Mar 18 18:24:51 crc kubenswrapper[4939]: I0318 18:24:51.545471 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-reloader/0.log" Mar 18 18:24:51 crc kubenswrapper[4939]: I0318 18:24:51.554933 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-frr-files/0.log" Mar 18 18:24:51 crc kubenswrapper[4939]: I0318 18:24:51.777010 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-metrics/0.log" Mar 18 18:24:51 crc kubenswrapper[4939]: I0318 18:24:51.903948 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-reloader/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.054793 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-frr-files/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.066562 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-metrics/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.067677 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-reloader/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.117659 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-metrics/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.296169 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-metrics/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.302126 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-frr-files/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.316920 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/cp-reloader/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.318967 4939 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/controller/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.463559 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/frr-metrics/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.509210 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/kube-rbac-proxy-frr/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.537408 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/kube-rbac-proxy/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.776282 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/reloader/0.log" Mar 18 18:24:52 crc kubenswrapper[4939]: I0318 18:24:52.842223 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-mfplf_dffd9132-ad62-4775-926e-c604222dab03/frr-k8s-webhook-server/0.log" Mar 18 18:24:53 crc kubenswrapper[4939]: I0318 18:24:53.037586 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5f4f95c589-d4mxl_50ba9fec-3821-4c3d-b45d-560924327333/manager/0.log" Mar 18 18:24:53 crc kubenswrapper[4939]: I0318 18:24:53.295111 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c79d7b54b-47sks_cff68ac1-e2a3-4bbe-b617-0f8bf923e0e7/webhook-server/0.log" Mar 18 18:24:53 crc kubenswrapper[4939]: I0318 18:24:53.433328 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jn4nh_aa0441fc-cc4b-4e64-ad17-79b9760c51cb/kube-rbac-proxy/0.log" Mar 18 18:24:54 crc kubenswrapper[4939]: I0318 18:24:54.467866 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jn4nh_aa0441fc-cc4b-4e64-ad17-79b9760c51cb/speaker/0.log" Mar 18 18:24:56 crc kubenswrapper[4939]: I0318 18:24:56.170377 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62xrf_06ab61f3-deb8-4e62-b8a2-f5b879ab398b/frr/0.log" Mar 18 18:25:01 crc kubenswrapper[4939]: I0318 18:25:01.257187 4939 patch_prober.go:28] interesting pod/console-operator-58897d9998-dcfh2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:25:01 crc kubenswrapper[4939]: I0318 18:25:01.257830 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dcfh2" podUID="4306eb36-80d5-404b-8909-dd446ee88230" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:25:10 crc kubenswrapper[4939]: I0318 18:25:10.621471 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_fed45129-6967-491d-9458-9480359e655d/util/0.log" Mar 18 18:25:10 crc kubenswrapper[4939]: I0318 18:25:10.830908 4939 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_fed45129-6967-491d-9458-9480359e655d/util/0.log" Mar 18 18:25:10 crc kubenswrapper[4939]: I0318 18:25:10.850015 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_fed45129-6967-491d-9458-9480359e655d/pull/0.log" Mar 18 18:25:10 crc kubenswrapper[4939]: I0318 18:25:10.855845 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_fed45129-6967-491d-9458-9480359e655d/pull/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.058024 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_fed45129-6967-491d-9458-9480359e655d/util/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.083924 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_fed45129-6967-491d-9458-9480359e655d/pull/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.142768 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874d9jpn_fed45129-6967-491d-9458-9480359e655d/extract/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.260317 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq_b7aa7c31-17c0-4b52-a694-bb74e34749a3/util/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.452706 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq_b7aa7c31-17c0-4b52-a694-bb74e34749a3/util/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.481229 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq_b7aa7c31-17c0-4b52-a694-bb74e34749a3/pull/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.511914 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq_b7aa7c31-17c0-4b52-a694-bb74e34749a3/pull/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.655249 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq_b7aa7c31-17c0-4b52-a694-bb74e34749a3/extract/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.714672 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq_b7aa7c31-17c0-4b52-a694-bb74e34749a3/pull/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.735005 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pgwgq_b7aa7c31-17c0-4b52-a694-bb74e34749a3/util/0.log" Mar 18 18:25:11 crc kubenswrapper[4939]: I0318 18:25:11.853367 4939 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp_0460d1a9-3710-482c-89c5-00dd5c28da89/util/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.038163 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp_0460d1a9-3710-482c-89c5-00dd5c28da89/pull/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.065790 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp_0460d1a9-3710-482c-89c5-00dd5c28da89/util/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.097386 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp_0460d1a9-3710-482c-89c5-00dd5c28da89/pull/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.228557 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp_0460d1a9-3710-482c-89c5-00dd5c28da89/util/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.277903 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp_0460d1a9-3710-482c-89c5-00dd5c28da89/extract/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.333932 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55h2kp_0460d1a9-3710-482c-89c5-00dd5c28da89/pull/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.416899 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k_17abc20c-de7b-4ffd-a986-918dbb8cd4dd/util/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.621168 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k_17abc20c-de7b-4ffd-a986-918dbb8cd4dd/util/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.644897 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k_17abc20c-de7b-4ffd-a986-918dbb8cd4dd/pull/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.659748 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k_17abc20c-de7b-4ffd-a986-918dbb8cd4dd/pull/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.821797 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k_17abc20c-de7b-4ffd-a986-918dbb8cd4dd/util/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.834995 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k_17abc20c-de7b-4ffd-a986-918dbb8cd4dd/extract/0.log" Mar 18 18:25:12 crc kubenswrapper[4939]: I0318 18:25:12.883221 4939 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726dq95k_17abc20c-de7b-4ffd-a986-918dbb8cd4dd/pull/0.log" Mar 18 18:25:13 crc kubenswrapper[4939]: I0318 18:25:13.011568 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nn6nx_5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b/extract-utilities/0.log" Mar 18 18:25:13 crc kubenswrapper[4939]: I0318 18:25:13.166289 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nn6nx_5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b/extract-content/0.log" Mar 18 18:25:13 crc kubenswrapper[4939]: I0318 18:25:13.216326 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nn6nx_5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b/extract-content/0.log" Mar 18 18:25:13 crc kubenswrapper[4939]: I0318 18:25:13.228877 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nn6nx_5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b/extract-utilities/0.log" Mar 18 18:25:13 crc kubenswrapper[4939]: I0318 18:25:13.458791 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nn6nx_5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b/extract-utilities/0.log" Mar 18 18:25:13 crc kubenswrapper[4939]: I0318 18:25:13.463550 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nn6nx_5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b/extract-content/0.log" Mar 18 18:25:13 crc kubenswrapper[4939]: I0318 18:25:13.715032 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ch9n8_144cb12d-acd2-4981-ac55-e3ae8682cec6/extract-utilities/0.log" Mar 18 18:25:14 crc kubenswrapper[4939]: I0318 18:25:14.060352 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ch9n8_144cb12d-acd2-4981-ac55-e3ae8682cec6/extract-content/0.log" Mar 18 18:25:14 crc kubenswrapper[4939]: I0318 18:25:14.130996 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ch9n8_144cb12d-acd2-4981-ac55-e3ae8682cec6/extract-utilities/0.log" Mar 18 18:25:14 crc kubenswrapper[4939]: I0318 18:25:14.163145 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ch9n8_144cb12d-acd2-4981-ac55-e3ae8682cec6/extract-content/0.log" Mar 18 18:25:14 crc kubenswrapper[4939]: I0318 18:25:14.494216 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ch9n8_144cb12d-acd2-4981-ac55-e3ae8682cec6/extract-utilities/0.log" Mar 18 18:25:14 crc kubenswrapper[4939]: I0318 18:25:14.536680 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ch9n8_144cb12d-acd2-4981-ac55-e3ae8682cec6/extract-content/0.log" Mar 18 18:25:14 crc kubenswrapper[4939]: I0318 18:25:14.745615 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6p9v6_00562936-861a-4e78-b01d-35b9ae9a8b2a/marketplace-operator/0.log" Mar 18 18:25:14 crc kubenswrapper[4939]: I0318 18:25:14.956421 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d2cwz_1264055e-c4dc-4675-a79e-2b158edd8733/extract-utilities/0.log" Mar 18 18:25:15 crc kubenswrapper[4939]: I0318 18:25:15.412890 4939 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nn6nx_5e2f4fd1-4ecf-4deb-9a6c-1de07faf8a2b/registry-server/0.log" Mar 18 18:25:15 crc kubenswrapper[4939]: I0318 18:25:15.885786 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d2cwz_1264055e-c4dc-4675-a79e-2b158edd8733/extract-utilities/0.log" Mar 18 18:25:15 crc kubenswrapper[4939]: I0318 18:25:15.912144 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d2cwz_1264055e-c4dc-4675-a79e-2b158edd8733/extract-content/0.log" Mar 18 18:25:15 crc kubenswrapper[4939]: I0318 18:25:15.942492 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d2cwz_1264055e-c4dc-4675-a79e-2b158edd8733/extract-content/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.123677 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d2cwz_1264055e-c4dc-4675-a79e-2b158edd8733/extract-content/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.161890 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d2cwz_1264055e-c4dc-4675-a79e-2b158edd8733/extract-utilities/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.387039 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c79c8_66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f/extract-utilities/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.542292 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c79c8_66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f/extract-content/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.572739 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c79c8_66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f/extract-utilities/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.632659 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c79c8_66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f/extract-content/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.655037 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ch9n8_144cb12d-acd2-4981-ac55-e3ae8682cec6/registry-server/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.679423 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-d2cwz_1264055e-c4dc-4675-a79e-2b158edd8733/registry-server/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.782259 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c79c8_66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f/extract-utilities/0.log" Mar 18 18:25:16 crc kubenswrapper[4939]: I0318 18:25:16.797992 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c79c8_66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f/extract-content/0.log" Mar 18 18:25:17 crc kubenswrapper[4939]: I0318 18:25:17.082312 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c79c8_66fe6e33-3cf9-44d8-afcc-d9b6497e0d5f/registry-server/0.log" Mar 18 18:25:30 crc kubenswrapper[4939]: I0318 18:25:30.767206 4939 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-599799b8-dk6hk_9ffde8c6-522e-491d-8f41-848c8a175529/prometheus-operator-admission-webhook/0.log" Mar 18 18:25:30 crc kubenswrapper[4939]: I0318 18:25:30.770800 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-pxtrv_b807b3bb-f1f2-43ad-8a15-359bff856ca7/prometheus-operator/0.log" Mar 18 18:25:30 crc kubenswrapper[4939]: I0318 18:25:30.846150 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-599799b8-zhbrz_fa00393f-d482-4806-8ec6-e25a9f306888/prometheus-operator-admission-webhook/0.log" Mar 18 18:25:30 crc kubenswrapper[4939]: I0318 18:25:30.967079 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-zc9lq_cdcc74fa-7b06-489c-bdfc-fe75965f4aa3/operator/0.log" Mar 18 18:25:30 crc kubenswrapper[4939]: I0318 18:25:30.969662 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5d7759777-9ttbm_f2cfb2e8-f000-4606-8b03-9d82aebcc102/perses-operator/0.log" Mar 18 18:25:51 crc kubenswrapper[4939]: E0318 18:25:51.515126 4939 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.227:45040->38.102.83.227:41597: read tcp 38.102.83.227:45040->38.102.83.227:41597: read: connection reset by peer Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.169366 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564306-fmzvs"] Mar 18 18:26:00 crc kubenswrapper[4939]: E0318 18:26:00.170572 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="extract-content" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.170595 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="extract-content" Mar 18 18:26:00 crc kubenswrapper[4939]: E0318 18:26:00.170655 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="extract-utilities" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.170672 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="extract-utilities" Mar 18 18:26:00 crc kubenswrapper[4939]: E0318 18:26:00.170705 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="registry-server" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.170715 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="registry-server" Mar 18 18:26:00 crc kubenswrapper[4939]: E0318 18:26:00.170739 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a62dda8-f284-4d5b-a36d-d8634b7912eb" containerName="oc" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.170749 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a62dda8-f284-4d5b-a36d-d8634b7912eb" containerName="oc" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.171075 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a62dda8-f284-4d5b-a36d-d8634b7912eb" containerName="oc" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.171107 4939 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="47d7fc62-c655-46f8-ba3c-a403d462e37d" containerName="registry-server" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.172674 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.177020 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-fmzvs"] Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.183868 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.184052 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.184199 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.302273 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-765z5\" (UniqueName: \"kubernetes.io/projected/df9b5ada-0ceb-4067-aeac-9de57a41345d-kube-api-access-765z5\") pod \"auto-csr-approver-29564306-fmzvs\" (UID: \"df9b5ada-0ceb-4067-aeac-9de57a41345d\") " pod="openshift-infra/auto-csr-approver-29564306-fmzvs" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.406162 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-765z5\" (UniqueName: \"kubernetes.io/projected/df9b5ada-0ceb-4067-aeac-9de57a41345d-kube-api-access-765z5\") pod \"auto-csr-approver-29564306-fmzvs\" (UID: \"df9b5ada-0ceb-4067-aeac-9de57a41345d\") " pod="openshift-infra/auto-csr-approver-29564306-fmzvs" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.432148 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-765z5\" (UniqueName: \"kubernetes.io/projected/df9b5ada-0ceb-4067-aeac-9de57a41345d-kube-api-access-765z5\") pod \"auto-csr-approver-29564306-fmzvs\" (UID: \"df9b5ada-0ceb-4067-aeac-9de57a41345d\") " pod="openshift-infra/auto-csr-approver-29564306-fmzvs" Mar 18 18:26:00 crc kubenswrapper[4939]: I0318 18:26:00.501385 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" Mar 18 18:26:01 crc kubenswrapper[4939]: I0318 18:26:01.023346 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-fmzvs"] Mar 18 18:26:01 crc kubenswrapper[4939]: I0318 18:26:01.976460 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" event={"ID":"df9b5ada-0ceb-4067-aeac-9de57a41345d","Type":"ContainerStarted","Data":"6420ab1877608f5d9f806c5a3c33199ee7b9a689ba9030dcb7105489785962e4"} Mar 18 18:26:04 crc kubenswrapper[4939]: I0318 18:26:04.004128 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" event={"ID":"df9b5ada-0ceb-4067-aeac-9de57a41345d","Type":"ContainerStarted","Data":"69fefbf4a69d3bb1797ff3490c4307fc77599971a9aab7d35cab000876863088"} Mar 18 18:26:04 crc kubenswrapper[4939]: I0318 18:26:04.036290 4939 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" podStartSLOduration=2.10490688 podStartE2EDuration="4.036266456s" podCreationTimestamp="2026-03-18 18:26:00 +0000 UTC" firstStartedPulling="2026-03-18 18:26:01.034590016 +0000 UTC m=+10125.633777637" lastFinishedPulling="2026-03-18 18:26:02.965949582 +0000 UTC m=+10127.565137213" observedRunningTime="2026-03-18 18:26:04.02792485 +0000 UTC m=+10128.627112481" watchObservedRunningTime="2026-03-18 18:26:04.036266456 +0000 UTC m=+10128.635454077" Mar 18 18:26:05 crc kubenswrapper[4939]: I0318 18:26:05.024634 4939 generic.go:334] "Generic (PLEG): container finished" podID="df9b5ada-0ceb-4067-aeac-9de57a41345d" containerID="69fefbf4a69d3bb1797ff3490c4307fc77599971a9aab7d35cab000876863088" exitCode=0 Mar 18 18:26:05 crc kubenswrapper[4939]: I0318 18:26:05.024762 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" event={"ID":"df9b5ada-0ceb-4067-aeac-9de57a41345d","Type":"ContainerDied","Data":"69fefbf4a69d3bb1797ff3490c4307fc77599971a9aab7d35cab000876863088"} Mar 18 18:26:06 crc kubenswrapper[4939]: I0318 18:26:06.468977 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" Mar 18 18:26:06 crc kubenswrapper[4939]: I0318 18:26:06.587130 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-765z5\" (UniqueName: \"kubernetes.io/projected/df9b5ada-0ceb-4067-aeac-9de57a41345d-kube-api-access-765z5\") pod \"df9b5ada-0ceb-4067-aeac-9de57a41345d\" (UID: \"df9b5ada-0ceb-4067-aeac-9de57a41345d\") " Mar 18 18:26:06 crc kubenswrapper[4939]: I0318 18:26:06.601862 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9b5ada-0ceb-4067-aeac-9de57a41345d-kube-api-access-765z5" (OuterVolumeSpecName: "kube-api-access-765z5") pod "df9b5ada-0ceb-4067-aeac-9de57a41345d" (UID: "df9b5ada-0ceb-4067-aeac-9de57a41345d"). InnerVolumeSpecName "kube-api-access-765z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:06 crc kubenswrapper[4939]: I0318 18:26:06.690666 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-765z5\" (UniqueName: \"kubernetes.io/projected/df9b5ada-0ceb-4067-aeac-9de57a41345d-kube-api-access-765z5\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:07 crc kubenswrapper[4939]: I0318 18:26:07.059471 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" event={"ID":"df9b5ada-0ceb-4067-aeac-9de57a41345d","Type":"ContainerDied","Data":"6420ab1877608f5d9f806c5a3c33199ee7b9a689ba9030dcb7105489785962e4"} Mar 18 18:26:07 crc kubenswrapper[4939]: I0318 18:26:07.059556 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6420ab1877608f5d9f806c5a3c33199ee7b9a689ba9030dcb7105489785962e4" Mar 18 18:26:07 crc kubenswrapper[4939]: I0318 18:26:07.059880 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-fmzvs" Mar 18 18:26:07 crc kubenswrapper[4939]: I0318 18:26:07.140300 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-59nrl"] Mar 18 18:26:07 crc kubenswrapper[4939]: I0318 18:26:07.153149 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-59nrl"] Mar 18 18:26:08 crc kubenswrapper[4939]: I0318 18:26:08.156121 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13033ead-e00f-4743-a3cb-67c9eddcc12c" path="/var/lib/kubelet/pods/13033ead-e00f-4743-a3cb-67c9eddcc12c/volumes" Mar 18 18:26:16 crc kubenswrapper[4939]: I0318 18:26:16.086152 4939 scope.go:117] "RemoveContainer" containerID="2eb42162f75bc63d08b7f99a84709dd9e2bd8585bcd5e71fd37d741c7e001280" Mar 18 18:26:50 crc kubenswrapper[4939]: I0318 18:26:50.771525 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="339c0a2d-5c97-4d3b-84d7-a8731c708236" containerName="galera" probeResult="failure" output="command timed out" Mar 18 18:26:50 crc kubenswrapper[4939]: I0318 18:26:50.772941 4939 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="339c0a2d-5c97-4d3b-84d7-a8731c708236" containerName="galera" probeResult="failure" output="command timed out" Mar 18 18:26:53 crc kubenswrapper[4939]: I0318 18:26:53.687971 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:26:53 crc kubenswrapper[4939]: I0318 18:26:53.688736 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:27:16 crc kubenswrapper[4939]: I0318 18:27:16.203985 4939 scope.go:117] "RemoveContainer" containerID="bb4d0e980483ea36cde8eac59bb0e55b71e475301f060abbbd01c4a213ce6032" Mar 18 18:27:16 crc kubenswrapper[4939]: I0318 18:27:16.897430 4939 scope.go:117] "RemoveContainer" containerID="f3d468d1cd6c45677bb4bb8b0cac04fcd4ca2c5b21120a39998afb6b560e0637" Mar 18 18:27:23 crc 
kubenswrapper[4939]: I0318 18:27:23.687119 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:27:23 crc kubenswrapper[4939]: I0318 18:27:23.688071 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:27:38 crc kubenswrapper[4939]: I0318 18:27:38.265081 4939 generic.go:334] "Generic (PLEG): container finished" podID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerID="bbe8217c6ce60546263025f35f06d14756fd9a5c6579e75bfc91a15ebfc16ae0" exitCode=0 Mar 18 18:27:38 crc kubenswrapper[4939]: I0318 18:27:38.265750 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" event={"ID":"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0","Type":"ContainerDied","Data":"bbe8217c6ce60546263025f35f06d14756fd9a5c6579e75bfc91a15ebfc16ae0"} Mar 18 18:27:38 crc kubenswrapper[4939]: I0318 18:27:38.269449 4939 scope.go:117] "RemoveContainer" containerID="bbe8217c6ce60546263025f35f06d14756fd9a5c6579e75bfc91a15ebfc16ae0" Mar 18 18:27:39 crc kubenswrapper[4939]: I0318 18:27:39.316210 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gfhwl_must-gather-r8cqx_47c5a6e4-ee41-45a7-97a8-9d9dbec839a0/gather/0.log" Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.229872 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfhwl/must-gather-r8cqx"] Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.230669 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerName="copy" containerID="cri-o://bb26d3a6ff8353baa6ac13d1aaee17cb90f07ec66b9a6b7c2ae768afd4dd2775" gracePeriod=2 Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.244589 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfhwl/must-gather-r8cqx"] Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.419315 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gfhwl_must-gather-r8cqx_47c5a6e4-ee41-45a7-97a8-9d9dbec839a0/copy/0.log" Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.420460 4939 generic.go:334] "Generic (PLEG): container finished" podID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerID="bb26d3a6ff8353baa6ac13d1aaee17cb90f07ec66b9a6b7c2ae768afd4dd2775" exitCode=143 Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.692614 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gfhwl_must-gather-r8cqx_47c5a6e4-ee41-45a7-97a8-9d9dbec839a0/copy/0.log" Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.693020 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.769616 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-kube-api-access-k8flm\") pod \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.769913 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-must-gather-output\") pod \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\" (UID: \"47c5a6e4-ee41-45a7-97a8-9d9dbec839a0\") " Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.781738 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-kube-api-access-k8flm" (OuterVolumeSpecName: "kube-api-access-k8flm") pod "47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" (UID: "47c5a6e4-ee41-45a7-97a8-9d9dbec839a0"). InnerVolumeSpecName "kube-api-access-k8flm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:49 crc kubenswrapper[4939]: I0318 18:27:49.872459 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-kube-api-access-k8flm\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:50 crc kubenswrapper[4939]: I0318 18:27:50.005472 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" (UID: "47c5a6e4-ee41-45a7-97a8-9d9dbec839a0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:50 crc kubenswrapper[4939]: I0318 18:27:50.076478 4939 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:50 crc kubenswrapper[4939]: I0318 18:27:50.142266 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" path="/var/lib/kubelet/pods/47c5a6e4-ee41-45a7-97a8-9d9dbec839a0/volumes" Mar 18 18:27:50 crc kubenswrapper[4939]: I0318 18:27:50.431686 4939 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gfhwl_must-gather-r8cqx_47c5a6e4-ee41-45a7-97a8-9d9dbec839a0/copy/0.log" Mar 18 18:27:50 crc kubenswrapper[4939]: I0318 18:27:50.432042 4939 scope.go:117] "RemoveContainer" containerID="bb26d3a6ff8353baa6ac13d1aaee17cb90f07ec66b9a6b7c2ae768afd4dd2775" Mar 18 18:27:50 crc kubenswrapper[4939]: I0318 18:27:50.432200 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfhwl/must-gather-r8cqx" Mar 18 18:27:50 crc kubenswrapper[4939]: I0318 18:27:50.452639 4939 scope.go:117] "RemoveContainer" containerID="bbe8217c6ce60546263025f35f06d14756fd9a5c6579e75bfc91a15ebfc16ae0" Mar 18 18:27:53 crc kubenswrapper[4939]: I0318 18:27:53.687473 4939 patch_prober.go:28] interesting pod/machine-config-daemon-6q7lf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:27:53 crc kubenswrapper[4939]: I0318 18:27:53.688188 4939 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:27:53 crc kubenswrapper[4939]: I0318 18:27:53.688235 4939 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" Mar 18 18:27:53 crc kubenswrapper[4939]: I0318 18:27:53.689073 4939 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8"} pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:27:53 crc kubenswrapper[4939]: I0318 18:27:53.689139 4939 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerName="machine-config-daemon" containerID="cri-o://6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" gracePeriod=600 Mar 18 18:27:53 crc kubenswrapper[4939]: E0318 18:27:53.824843 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:27:54 crc kubenswrapper[4939]: I0318 18:27:54.493991 4939 generic.go:334] "Generic (PLEG): container finished" podID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" exitCode=0 Mar 18 18:27:54 crc kubenswrapper[4939]: I0318 18:27:54.494161 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" event={"ID":"a32d41a6-8ebb-4871-b660-91407cbaa5c5","Type":"ContainerDied","Data":"6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8"} Mar 18 18:27:54 crc kubenswrapper[4939]: I0318 18:27:54.494467 4939 scope.go:117] "RemoveContainer" containerID="cd5a8b2df6c9c9a32e5accb27af0b355fd1544708ba4c93ea25fe817def4fbc5" Mar 18 18:27:54 crc kubenswrapper[4939]: I0318 18:27:54.495565 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:27:54 crc kubenswrapper[4939]: E0318 18:27:54.496042 4939 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.184078 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564308-nmcp5"] Mar 18 18:28:00 crc kubenswrapper[4939]: E0318 18:28:00.185036 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9b5ada-0ceb-4067-aeac-9de57a41345d" containerName="oc" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.185051 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9b5ada-0ceb-4067-aeac-9de57a41345d" containerName="oc" Mar 18 18:28:00 crc kubenswrapper[4939]: E0318 18:28:00.185075 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerName="copy" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.185084 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerName="copy" Mar 18 18:28:00 crc kubenswrapper[4939]: E0318 18:28:00.185107 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerName="gather" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.185116 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerName="gather" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.185357 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerName="gather" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.185387 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9b5ada-0ceb-4067-aeac-9de57a41345d" containerName="oc" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.185408 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c5a6e4-ee41-45a7-97a8-9d9dbec839a0" containerName="copy" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.186227 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-nmcp5" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.188627 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.188827 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.193305 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.207957 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-nmcp5"] Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.331166 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pb9r\" (UniqueName: \"kubernetes.io/projected/3cea8b6b-361f-4a6d-b298-3e7618043454-kube-api-access-6pb9r\") pod \"auto-csr-approver-29564308-nmcp5\" (UID: \"3cea8b6b-361f-4a6d-b298-3e7618043454\") " pod="openshift-infra/auto-csr-approver-29564308-nmcp5" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.509574 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pb9r\" (UniqueName: \"kubernetes.io/projected/3cea8b6b-361f-4a6d-b298-3e7618043454-kube-api-access-6pb9r\") pod \"auto-csr-approver-29564308-nmcp5\" (UID: \"3cea8b6b-361f-4a6d-b298-3e7618043454\") " pod="openshift-infra/auto-csr-approver-29564308-nmcp5" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.540084 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pb9r\" (UniqueName: \"kubernetes.io/projected/3cea8b6b-361f-4a6d-b298-3e7618043454-kube-api-access-6pb9r\") pod \"auto-csr-approver-29564308-nmcp5\" (UID: \"3cea8b6b-361f-4a6d-b298-3e7618043454\") " pod="openshift-infra/auto-csr-approver-29564308-nmcp5" Mar 18 18:28:00 crc kubenswrapper[4939]: I0318 18:28:00.834376 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-nmcp5" Mar 18 18:28:01 crc kubenswrapper[4939]: W0318 18:28:01.380372 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cea8b6b_361f_4a6d_b298_3e7618043454.slice/crio-b79c4b62b5f568d38474097aadb40c6fc8a0190690671c3da5ec443b25e1ad4c WatchSource:0}: Error finding container b79c4b62b5f568d38474097aadb40c6fc8a0190690671c3da5ec443b25e1ad4c: Status 404 returned error can't find the container with id b79c4b62b5f568d38474097aadb40c6fc8a0190690671c3da5ec443b25e1ad4c Mar 18 18:28:01 crc kubenswrapper[4939]: I0318 18:28:01.386675 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-nmcp5"] Mar 18 18:28:01 crc kubenswrapper[4939]: I0318 18:28:01.576491 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-nmcp5" event={"ID":"3cea8b6b-361f-4a6d-b298-3e7618043454","Type":"ContainerStarted","Data":"b79c4b62b5f568d38474097aadb40c6fc8a0190690671c3da5ec443b25e1ad4c"} Mar 18 18:28:03 crc kubenswrapper[4939]: I0318 18:28:03.607654 4939 generic.go:334] "Generic (PLEG): container finished" podID="3cea8b6b-361f-4a6d-b298-3e7618043454" containerID="f51064a568599dc32fb74ad8949c07a6badcdf5f1b5bbc4927462ec272323de7" exitCode=0 Mar 18 18:28:03 crc kubenswrapper[4939]: I0318 18:28:03.607753 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-nmcp5" event={"ID":"3cea8b6b-361f-4a6d-b298-3e7618043454","Type":"ContainerDied","Data":"f51064a568599dc32fb74ad8949c07a6badcdf5f1b5bbc4927462ec272323de7"} Mar 18 18:28:05 crc kubenswrapper[4939]: I0318 18:28:05.116104 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-nmcp5" Mar 18 18:28:05 crc kubenswrapper[4939]: I0318 18:28:05.242019 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pb9r\" (UniqueName: \"kubernetes.io/projected/3cea8b6b-361f-4a6d-b298-3e7618043454-kube-api-access-6pb9r\") pod \"3cea8b6b-361f-4a6d-b298-3e7618043454\" (UID: \"3cea8b6b-361f-4a6d-b298-3e7618043454\") " Mar 18 18:28:05 crc kubenswrapper[4939]: I0318 18:28:05.256904 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cea8b6b-361f-4a6d-b298-3e7618043454-kube-api-access-6pb9r" (OuterVolumeSpecName: "kube-api-access-6pb9r") pod "3cea8b6b-361f-4a6d-b298-3e7618043454" (UID: "3cea8b6b-361f-4a6d-b298-3e7618043454"). InnerVolumeSpecName "kube-api-access-6pb9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:28:05 crc kubenswrapper[4939]: I0318 18:28:05.345119 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pb9r\" (UniqueName: \"kubernetes.io/projected/3cea8b6b-361f-4a6d-b298-3e7618043454-kube-api-access-6pb9r\") on node \"crc\" DevicePath \"\"" Mar 18 18:28:05 crc kubenswrapper[4939]: I0318 18:28:05.633254 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-nmcp5" event={"ID":"3cea8b6b-361f-4a6d-b298-3e7618043454","Type":"ContainerDied","Data":"b79c4b62b5f568d38474097aadb40c6fc8a0190690671c3da5ec443b25e1ad4c"} Mar 18 18:28:05 crc kubenswrapper[4939]: I0318 18:28:05.633299 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b79c4b62b5f568d38474097aadb40c6fc8a0190690671c3da5ec443b25e1ad4c" Mar 18 18:28:05 crc kubenswrapper[4939]: I0318 18:28:05.633357 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-nmcp5" Mar 18 18:28:06 crc kubenswrapper[4939]: I0318 18:28:06.207969 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-sfgst"] Mar 18 18:28:06 crc kubenswrapper[4939]: I0318 18:28:06.217125 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-sfgst"] Mar 18 18:28:08 crc kubenswrapper[4939]: I0318 18:28:08.157261 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a734a071-0c7a-48a7-a272-dbcef14a572d" path="/var/lib/kubelet/pods/a734a071-0c7a-48a7-a272-dbcef14a572d/volumes" Mar 18 18:28:10 crc kubenswrapper[4939]: I0318 18:28:10.134428 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:28:10 crc kubenswrapper[4939]: E0318 18:28:10.135953 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:28:17 crc kubenswrapper[4939]: I0318 18:28:17.072306 4939 scope.go:117] "RemoveContainer" containerID="d0d48c3f04f0111b7a996bb40bbf4f8bd1f9a34184fb614148ae886e44b78958" Mar 18 18:28:17 crc kubenswrapper[4939]: I0318 18:28:17.138496 4939 scope.go:117] "RemoveContainer" containerID="31a51509719451dcd6895d65ee33fca20c461bbc88af60f0697362643fb8f1a0" Mar 18 18:28:22 crc kubenswrapper[4939]: I0318 18:28:22.134388 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:28:22 crc kubenswrapper[4939]: E0318 18:28:22.135681 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:28:35 crc kubenswrapper[4939]: I0318 18:28:35.133272 4939 scope.go:117] "RemoveContainer" 
containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:28:35 crc kubenswrapper[4939]: E0318 18:28:35.134061 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:28:46 crc kubenswrapper[4939]: I0318 18:28:46.150657 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:28:46 crc kubenswrapper[4939]: E0318 18:28:46.151919 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:28:58 crc kubenswrapper[4939]: I0318 18:28:58.133474 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:28:58 crc kubenswrapper[4939]: E0318 18:28:58.134389 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:29:13 crc kubenswrapper[4939]: I0318 18:29:13.133430 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:29:13 crc kubenswrapper[4939]: E0318 18:29:13.134409 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:29:25 crc kubenswrapper[4939]: I0318 18:29:25.134646 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:29:25 crc kubenswrapper[4939]: E0318 18:29:25.136279 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:29:38 crc kubenswrapper[4939]: I0318 18:29:38.134149 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:29:38 crc kubenswrapper[4939]: E0318 18:29:38.135151 4939 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:29:50 crc kubenswrapper[4939]: I0318 18:29:50.134306 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:29:50 crc kubenswrapper[4939]: E0318 18:29:50.135140 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.163030 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564310-v2qpj"] Mar 18 18:30:00 crc kubenswrapper[4939]: E0318 18:30:00.164560 4939 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cea8b6b-361f-4a6d-b298-3e7618043454" containerName="oc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.164584 4939 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cea8b6b-361f-4a6d-b298-3e7618043454" containerName="oc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.165007 4939 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cea8b6b-361f-4a6d-b298-3e7618043454" containerName="oc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.166217 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-v2qpj" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.168490 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.171946 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-sjlfk" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.172758 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.179342 4939 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc"] Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.180824 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.182956 4939 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.183084 4939 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.184776 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-secret-volume\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.185007 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvvmk\" (UniqueName: \"kubernetes.io/projected/f71d344d-2b65-4b38-933d-6b1be71046a7-kube-api-access-qvvmk\") pod \"auto-csr-approver-29564310-v2qpj\" (UID: \"f71d344d-2b65-4b38-933d-6b1be71046a7\") " pod="openshift-infra/auto-csr-approver-29564310-v2qpj" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.185142 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vbc\" (UniqueName: \"kubernetes.io/projected/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-kube-api-access-87vbc\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.185566 4939 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-config-volume\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.210428 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-v2qpj"] Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.225957 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc"] Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.287770 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-secret-volume\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.287887 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvvmk\" (UniqueName: \"kubernetes.io/projected/f71d344d-2b65-4b38-933d-6b1be71046a7-kube-api-access-qvvmk\") pod \"auto-csr-approver-29564310-v2qpj\" (UID: \"f71d344d-2b65-4b38-933d-6b1be71046a7\") " pod="openshift-infra/auto-csr-approver-29564310-v2qpj" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.287931 4939 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87vbc\" (UniqueName: \"kubernetes.io/projected/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-kube-api-access-87vbc\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.288022 4939 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-config-volume\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.290056 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-config-volume\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.307264 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-secret-volume\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.310491 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vbc\" (UniqueName: \"kubernetes.io/projected/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-kube-api-access-87vbc\") pod \"collect-profiles-29564310-lqgbc\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.313411 4939 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvvmk\" (UniqueName: \"kubernetes.io/projected/f71d344d-2b65-4b38-933d-6b1be71046a7-kube-api-access-qvvmk\") pod \"auto-csr-approver-29564310-v2qpj\" (UID: \"f71d344d-2b65-4b38-933d-6b1be71046a7\") " pod="openshift-infra/auto-csr-approver-29564310-v2qpj" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.508687 4939 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-v2qpj" Mar 18 18:30:00 crc kubenswrapper[4939]: I0318 18:30:00.818132 4939 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:01 crc kubenswrapper[4939]: I0318 18:30:01.258136 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-v2qpj"] Mar 18 18:30:01 crc kubenswrapper[4939]: W0318 18:30:01.267373 4939 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71d344d_2b65_4b38_933d_6b1be71046a7.slice/crio-419732277b3917a1252c9972f5f2bb4e0b78015b19681da894ccd148c75f0a96 WatchSource:0}: Error finding container 419732277b3917a1252c9972f5f2bb4e0b78015b19681da894ccd148c75f0a96: Status 404 returned error can't find the container with id 419732277b3917a1252c9972f5f2bb4e0b78015b19681da894ccd148c75f0a96 Mar 18 18:30:01 crc kubenswrapper[4939]: I0318 18:30:01.270854 4939 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:30:01 crc kubenswrapper[4939]: I0318 18:30:01.405547 4939 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc"] Mar 18 18:30:02 crc kubenswrapper[4939]: I0318 18:30:02.133555 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:30:02 crc kubenswrapper[4939]: E0318 18:30:02.133861 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:30:02 crc kubenswrapper[4939]: I0318 18:30:02.192309 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" event={"ID":"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1","Type":"ContainerStarted","Data":"c276cdefe53e1abca80e887bdfba1bb501970173aa3c3791eb822f1a253ca8cf"} Mar 18 18:30:02 crc kubenswrapper[4939]: I0318 18:30:02.195918 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-v2qpj" event={"ID":"f71d344d-2b65-4b38-933d-6b1be71046a7","Type":"ContainerStarted","Data":"419732277b3917a1252c9972f5f2bb4e0b78015b19681da894ccd148c75f0a96"} Mar 18 18:30:03 crc kubenswrapper[4939]: I0318 18:30:03.206561 4939 generic.go:334] "Generic (PLEG): container finished" podID="ab3b2f84-ffda-437b-95a1-a9a1d1577bf1" containerID="077597ffb2355b802371bc1d33c74d5534bd7f05ae453d50967285f85854714a" exitCode=0 Mar 18 18:30:03 crc kubenswrapper[4939]: I0318 18:30:03.206674 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" event={"ID":"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1","Type":"ContainerDied","Data":"077597ffb2355b802371bc1d33c74d5534bd7f05ae453d50967285f85854714a"} Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.696244 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.791083 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-secret-volume\") pod \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.791161 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87vbc\" (UniqueName: \"kubernetes.io/projected/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-kube-api-access-87vbc\") pod \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.791303 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-config-volume\") pod \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\" (UID: \"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1\") " Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.792529 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-config-volume" (OuterVolumeSpecName: "config-volume") pod "ab3b2f84-ffda-437b-95a1-a9a1d1577bf1" (UID: "ab3b2f84-ffda-437b-95a1-a9a1d1577bf1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.798203 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-kube-api-access-87vbc" (OuterVolumeSpecName: "kube-api-access-87vbc") pod "ab3b2f84-ffda-437b-95a1-a9a1d1577bf1" (UID: "ab3b2f84-ffda-437b-95a1-a9a1d1577bf1"). InnerVolumeSpecName "kube-api-access-87vbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.806266 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ab3b2f84-ffda-437b-95a1-a9a1d1577bf1" (UID: "ab3b2f84-ffda-437b-95a1-a9a1d1577bf1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.893395 4939 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.893438 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87vbc\" (UniqueName: \"kubernetes.io/projected/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-kube-api-access-87vbc\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:04 crc kubenswrapper[4939]: I0318 18:30:04.893450 4939 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab3b2f84-ffda-437b-95a1-a9a1d1577bf1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:05 crc kubenswrapper[4939]: I0318 18:30:05.238324 4939 generic.go:334] "Generic (PLEG): container finished" podID="f71d344d-2b65-4b38-933d-6b1be71046a7" containerID="5c342da0006ed3733e37b7ba11ab04705a577071ad14ddf4216c05d6aa95a76f" exitCode=0 Mar 18 18:30:05 crc kubenswrapper[4939]: I0318 18:30:05.238420 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-v2qpj" event={"ID":"f71d344d-2b65-4b38-933d-6b1be71046a7","Type":"ContainerDied","Data":"5c342da0006ed3733e37b7ba11ab04705a577071ad14ddf4216c05d6aa95a76f"} Mar 18 18:30:05 crc kubenswrapper[4939]: I0318 18:30:05.241783 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" event={"ID":"ab3b2f84-ffda-437b-95a1-a9a1d1577bf1","Type":"ContainerDied","Data":"c276cdefe53e1abca80e887bdfba1bb501970173aa3c3791eb822f1a253ca8cf"} Mar 18 18:30:05 crc kubenswrapper[4939]: I0318 18:30:05.241833 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c276cdefe53e1abca80e887bdfba1bb501970173aa3c3791eb822f1a253ca8cf" Mar 18 18:30:05 crc kubenswrapper[4939]: I0318 18:30:05.241839 4939 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lqgbc" Mar 18 18:30:05 crc kubenswrapper[4939]: I0318 18:30:05.788080 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7"] Mar 18 18:30:05 crc kubenswrapper[4939]: I0318 18:30:05.800486 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564265-k4tj7"] Mar 18 18:30:06 crc kubenswrapper[4939]: I0318 18:30:06.152438 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3a9e8d-1669-476a-9372-3575f9bfa4d7" path="/var/lib/kubelet/pods/ba3a9e8d-1669-476a-9372-3575f9bfa4d7/volumes" Mar 18 18:30:06 crc kubenswrapper[4939]: I0318 18:30:06.631542 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-v2qpj" Mar 18 18:30:06 crc kubenswrapper[4939]: I0318 18:30:06.731162 4939 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvvmk\" (UniqueName: \"kubernetes.io/projected/f71d344d-2b65-4b38-933d-6b1be71046a7-kube-api-access-qvvmk\") pod \"f71d344d-2b65-4b38-933d-6b1be71046a7\" (UID: \"f71d344d-2b65-4b38-933d-6b1be71046a7\") " Mar 18 18:30:06 crc kubenswrapper[4939]: I0318 18:30:06.737749 4939 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71d344d-2b65-4b38-933d-6b1be71046a7-kube-api-access-qvvmk" (OuterVolumeSpecName: "kube-api-access-qvvmk") pod "f71d344d-2b65-4b38-933d-6b1be71046a7" (UID: "f71d344d-2b65-4b38-933d-6b1be71046a7"). InnerVolumeSpecName "kube-api-access-qvvmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:30:06 crc kubenswrapper[4939]: I0318 18:30:06.833784 4939 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvvmk\" (UniqueName: \"kubernetes.io/projected/f71d344d-2b65-4b38-933d-6b1be71046a7-kube-api-access-qvvmk\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:07 crc kubenswrapper[4939]: I0318 18:30:07.286426 4939 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-v2qpj" event={"ID":"f71d344d-2b65-4b38-933d-6b1be71046a7","Type":"ContainerDied","Data":"419732277b3917a1252c9972f5f2bb4e0b78015b19681da894ccd148c75f0a96"} Mar 18 18:30:07 crc kubenswrapper[4939]: I0318 18:30:07.286471 4939 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419732277b3917a1252c9972f5f2bb4e0b78015b19681da894ccd148c75f0a96" Mar 18 18:30:07 crc kubenswrapper[4939]: I0318 18:30:07.286564 4939 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-v2qpj" Mar 18 18:30:07 crc kubenswrapper[4939]: I0318 18:30:07.712533 4939 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-h25tv"] Mar 18 18:30:07 crc kubenswrapper[4939]: I0318 18:30:07.737624 4939 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-h25tv"] Mar 18 18:30:08 crc kubenswrapper[4939]: I0318 18:30:08.150567 4939 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a62dda8-f284-4d5b-a36d-d8634b7912eb" path="/var/lib/kubelet/pods/9a62dda8-f284-4d5b-a36d-d8634b7912eb/volumes" Mar 18 18:30:15 crc kubenswrapper[4939]: I0318 18:30:15.133384 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:30:15 crc kubenswrapper[4939]: E0318 18:30:15.134095 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:30:17 crc kubenswrapper[4939]: I0318 18:30:17.280484 4939 scope.go:117] "RemoveContainer" containerID="1274e5e75619c53ee883310133a6f9c62f31eee55715afe1e2486ac7b8766915" Mar 18 18:30:17 crc kubenswrapper[4939]: I0318 18:30:17.299819 4939 scope.go:117] "RemoveContainer" containerID="569ee6dad99e5177c152287fcd3b04c014e629a70cb0aeef4173afc1b6a5adf9" Mar 18 18:30:29 crc kubenswrapper[4939]: I0318 18:30:29.133020 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:30:29 crc kubenswrapper[4939]: E0318 18:30:29.133859 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:30:43 crc kubenswrapper[4939]: I0318 18:30:43.133565 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:30:43 crc kubenswrapper[4939]: E0318 18:30:43.134678 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5" Mar 18 18:30:56 crc kubenswrapper[4939]: I0318 18:30:56.144460 4939 scope.go:117] "RemoveContainer" containerID="6d4230beb85de1edff5b52b770d986fea09ce985a36c33489cc256d8fa00e0d8" Mar 18 18:30:56 crc kubenswrapper[4939]: E0318 18:30:56.146007 4939 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6q7lf_openshift-machine-config-operator(a32d41a6-8ebb-4871-b660-91407cbaa5c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6q7lf" podUID="a32d41a6-8ebb-4871-b660-91407cbaa5c5"